WO2020103024A1 - Operation control system, operation control method, device, equipment and medium - Google Patents

Operation control system, operation control method, device, equipment and medium

Info

Publication number
WO2020103024A1
Authority
WO
WIPO (PCT)
Prior art keywords
surveying
mapping
area
point
shooting
Prior art date
Application number
PCT/CN2018/116661
Other languages
English (en)
French (fr)
Inventor
刘鹏
金晓会
Original Assignee
广州极飞科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广州极飞科技有限公司
Priority to EP18940965.9A priority Critical patent/EP3885940A4/en
Priority to AU2018450271A priority patent/AU2018450271B2/en
Priority to CA3120732A priority patent/CA3120732A1/en
Priority to PCT/CN2018/116661 priority patent/WO2020103024A1/zh
Priority to JP2021527156A priority patent/JP2022509082A/ja
Priority to CN201880080716.3A priority patent/CN111868656B/zh
Priority to KR1020217016659A priority patent/KR20210106422A/ko
Publication of WO2020103024A1 publication Critical patent/WO2020103024A1/zh

Links

Images

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 - Creation or updating of map data
    • G01C21/3807 - Creation or updating of map data characterised by the type of data
    • G01C21/3826 - Terrain data
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 - Creation or updating of map data
    • G01C21/3833 - Creation or updating of map data characterised by the source of data
    • G01C21/3852 - Data derived from aerial or satellite images
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863 - Structures of map data
    • G01C21/387 - Organisation of map data, e.g. version management or database structures
    • G01C21/3881 - Tile-based structures
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3885 - Transmission of map data to client devices; Reception of map data by client devices
    • G01C21/3889 - Transmission of selected map data, e.g. depending on route
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 - UAVs characterised by their flight controls
    • B64U2201/20 - Remote controls

Definitions

  • Embodiments of the present disclosure relate to the technical field of surveying and mapping, for example, to a job control system, job control method, device, equipment, and medium.
  • In the related art, when the user uses the control terminal of the drone to determine the drone's operation route, the user generally operates the joystick of the control terminal or enters a complete route instruction through the operation interface.
  • For example, the operation route of the drone can be controlled by moving the joystick of the drone's remote control up or down, or the user can manually enter the operation route in the operation interface of the control terminal.
  • The inventor found that the related art has the following defects: it takes a long time for a user to operate the joystick of the control terminal or to enter a complete route command through the operation interface, and the user experience is poor.
  • Embodiments of the present disclosure provide an operation control system, operation control method, device, equipment, and medium, so as to improve the generation efficiency of the drone operation route and the intelligence of the drone operation control.
  • An embodiment of the present disclosure provides an operation control system, including: a control terminal and an operation drone, wherein:
  • the control terminal is configured to acquire map tile data corresponding to the operation area, generate an area map of the operation area based on the map tile data for display, determine at least one operation plot in the operation area according to at least one area positioning point selected by the user on the area map, and generate an operation route corresponding to the operation plot to send to the operation drone;
  • the operation drone is configured to receive the operation route and perform flight operations in the at least one operation plot according to the operation route.
  • the system further includes: a mapping drone;
  • the control terminal is further configured to use the operation area as a surveying area, determine surveying parameters matching the surveying area, and send the surveying parameters to the surveying and mapping drone, wherein the surveying parameters include a plurality of surveying sampling points at which the surveying and mapping drone performs surveying in the surveying area;
  • the surveying and mapping drone is configured to receive the surveying parameters and perform flight shooting in the surveying area according to the surveying parameters to obtain a set of surveying photos corresponding to the plurality of surveying sampling points; the photo set is used to generate map tile data of the surveying area.
  • the system further includes: a ground terminal;
  • the ground terminal is configured to obtain the set of surveying photos, combine and/or stitch the multiple photos in the set to obtain a surveying map corresponding to the surveying area, and generate map tile data corresponding to the surveying area according to the surveying map;
  • the control terminal is configured to acquire map tile data corresponding to the work area from the ground terminal.
  • An embodiment of the present disclosure also provides a control-terminal-side operation control method, including: acquiring map tile data corresponding to the operation area; generating an area map of the operation area based on the map tile data for display; and determining at least one operation plot in the operation area according to at least one area positioning point selected by the user on the area map.
  • An operation route corresponding to the operation plot is generated and sent to the operation drone, so that the operation drone performs flight operations according to the operation route.
  • In an embodiment, before obtaining the map tile data corresponding to the operation area, the method further includes:
  • using the operation area as a surveying area and determining surveying parameters matching the surveying area, wherein the surveying parameters include a plurality of surveying sampling points at which the surveying and mapping drone performs surveying in the surveying area;
  • determining the mapping parameters matching the mapping area includes:
  • the reference photographing position point and the plurality of auxiliary photographing position points are used as a plurality of surveying and sampling points for the surveying and mapping drone to survey and map in the surveying and mapping area.
  • determining the mapping parameters matching the mapping area includes:
  • the plurality of photographing location points are used as a plurality of surveying and sampling points for surveying and mapping in the surveying and mapping area by the drone.
  • multiple pictures taken according to multiple shooting points in the combined shooting point set have overlapping areas between them, and / or
  • the surveying and mapping combined shooting area is a shooting area formed by combining and/or stitching the multiple photos taken at the multiple shooting points in the combined shooting point set;
  • the surveying and mapping combined shooting areas are combined and/or stitched to form a surveying map of the surveying area.
  • the shooting points in the combined shooting point set include: a center shooting point and four surrounding shooting points, where the surrounding shooting points are four vertices of a rectangle centered on the center shooting point;
  • the shape of the synthesized photo taken according to each shooting point in the combined shooting point set is rectangular.
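The layout just described (a center shooting point plus four surrounding points at the vertices of a rectangle centered on it) can be sketched in a few lines. This is an illustrative reconstruction only, not the patented implementation; the function name, coordinate frame, and rectangle dimensions are assumptions.

```python
def combined_shooting_points(center, half_width, half_height):
    """Return the five shooting points of a combined shooting point set:
    the center point plus the four vertices of a rectangle centered on it."""
    cx, cy = center
    surrounding = [
        (cx - half_width, cy + half_height),  # upper-left vertex
        (cx + half_width, cy + half_height),  # upper-right vertex
        (cx - half_width, cy - half_height),  # lower-left vertex
        (cx + half_width, cy - half_height),  # lower-right vertex
    ]
    return [center] + surrounding

# Hypothetical example: a 60 x 40 rectangle (planar units) around the origin.
points = combined_shooting_points((0.0, 0.0), 30.0, 20.0)
```

Because the surrounding points are rectangle vertices, the photo synthesized from the five shots is itself rectangular, matching the shape constraint stated above.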
  • obtaining a reference photographing location point corresponding to the surveying and mapping area includes:
  • detecting a user's touch operation in the human-computer interaction interface and determining a screen position point according to the touch operation includes at least one of the following:
  • the user's touch point is determined as the screen position point; and/or
  • a point within the frame generated by the user's touch is selected as the screen position point.
  • obtaining a reference photographing location point corresponding to the surveying and mapping area includes:
  • the surveying and mapping unmanned aerial vehicle is preset at a position matching the surveying and mapping area.
  • In an embodiment, before sending location query information to the surveying and mapping drone, the method further includes:
  • the flight control instruction is set to control the mapping drone to move in the air in a set direction and / or a set distance.
  • obtaining a reference photographing location point corresponding to the surveying and mapping area includes:
  • mapping relationship between a shooting point in the combined shooting point set and the reference shooting position point includes:
  • the positioning key point includes a corner point of the surveying area and a center point of the surveying area;
  • a shooting point matching the position information is selected in the combined shooting point set to establish a mapping relationship with the reference shooting position point.
  • one or more combined mapping and shooting areas are determined within the mapping area, including:
  • if the determined surveying and mapping combined shooting areas cannot completely cover the surveying area, a new positioning point is selected in the surveying area, and the operation of determining one surveying and mapping combined shooting area in the surveying area according to the positioning point and the combined shooting area is executed again, until the determined surveying and mapping combined shooting areas completely cover the surveying area.
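The select-and-repeat procedure above can be illustrated with a simplified coverage loop. The sketch below assumes an axis-aligned rectangular survey area and rectangular combined shooting areas; real survey areas are arbitrary polygons, so treat this only as an illustration of the termination condition, with all names and numbers hypothetical.

```python
def cover_area(area_w, area_h, shot_w, shot_h):
    """Place combined shooting areas (shot_w x shot_h) until they fully
    cover an area_w x area_h survey area; return their center points."""
    centers = []
    y = shot_h / 2
    while y - shot_h / 2 < area_h:          # not yet covered vertically
        x = shot_w / 2
        while x - shot_w / 2 < area_w:      # not yet covered horizontally
            centers.append((x, y))          # select a new positioning point
            x += shot_w                     # step by one combined shooting area
        y += shot_h
    return centers

# Hypothetical example: a 100 x 60 area needs a 3 x 2 grid of 40 x 30 areas.
centers = cover_area(100.0, 60.0, 40.0, 30.0)
```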
  • determining a plurality of shooting position points in the surveying and mapping combined shooting area includes:
  • each of the surrounding shooting points is mapped into the surveying and mapping combined shooting area;
  • the plurality of mapping points thus formed serve as the photographing location points.
  • In an embodiment, before determining one or more surveying and mapping combined shooting areas in the surveying area according to the combined shooting area corresponding to the combined shooting point set and the surveying area information, the method further includes:
  • acquiring, in the map data currently displayed in the human-computer interaction interface, a geographic location area matching the screen selection area as the surveying area information.
  • detecting a user's touch operation in the human-computer interaction interface and acquiring a screen selection area matching the touch operation include:
  • the closed area enclosed by the connection line of at least three touch points of the user is determined as the screen selection area; and / or
  • the frame generated by the user's touch is used as the screen selection area.
  • In an embodiment, before sending the surveying parameters to the surveying and mapping drone, the method further includes:
  • the shooting parameters include a single-photo shooting area of the surveying and mapping drone at a set flying height, and each shooting point corresponds to one single-photo shooting area;
  • the surveying and mapping parameters further include: the flying altitude, which is set to instruct the surveying and mapping unmanned aerial vehicle to perform flight shooting in the surveying and mapping area at the flying altitude.
  • determining the preset relative position relationship between each shooting point in the combined shooting point set includes:
  • in the upper left corner, the lower left corner, the upper right corner, and the lower right corner of the center photo, four surrounding photos that each meet the photo overlap index with the center photo are respectively generated;
  • the preset relative position relationship between each shooting point in the combined shooting point set is determined according to the central shooting point and the coordinate value of each surrounding shooting point in the two-dimensional coordinate system.
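Under the rectangular layout above, the preset relative position relationship can be derived from the photo footprint and the overlap index: if each surrounding photo must overlap the center photo by a ratio `overlap` in both directions, its center is offset diagonally by `(1 - overlap)` times the footprint. This is a standard photogrammetric relation, sketched here with hypothetical names and values rather than the patent's exact procedure.

```python
def surrounding_offsets(photo_w, photo_h, overlap):
    """Offsets of the four surrounding shooting points from the center point,
    with the center point at the origin of a 2-D coordinate system."""
    dx = photo_w * (1.0 - overlap)
    dy = photo_h * (1.0 - overlap)
    return {
        "upper_left":  (-dx,  dy),
        "upper_right": ( dx,  dy),
        "lower_left":  (-dx, -dy),
        "lower_right": ( dx, -dy),
    }

# Hypothetical footprint of 60 x 40 with a 50% overlap index:
offsets = surrounding_offsets(photo_w=60.0, photo_h=40.0, overlap=0.5)
# with 50% overlap each diagonal neighbour sits half a footprint away
```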
  • In an embodiment, before acquiring the shooting parameters of the camera device carried by the surveying and mapping drone, the method further includes:
  • the set flying height is calculated according to the pixel width of the camera device, the lens focal length of the camera device, and the ground pixel resolution.
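The calculation named above follows the standard ground-sample-distance relation: ground pixel resolution = pixel width x flight height / focal length, so the set flying height is ground resolution x focal length / pixel width. The sketch below uses hypothetical camera values; only the formula itself is standard.

```python
def flying_height(pixel_width_m, focal_length_m, gsd_m):
    """Set flying height (m) from the camera pixel width (m), the lens focal
    length (m), and the desired ground pixel resolution / GSD (m per pixel)."""
    return gsd_m * focal_length_m / pixel_width_m

# Hypothetical camera: 2.4 um pixels, 24 mm lens, desired 5 cm/pixel GSD.
h = flying_height(pixel_width_m=2.4e-6, focal_length_m=24e-3, gsd_m=0.05)
# h is approximately 500 meters
```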
  • obtaining the shooting parameters of the camera device carried by the surveying and mapping drone includes:
  • An embodiment of the present disclosure also provides a method for controlling operation on the side of an operation drone, including:
  • An embodiment of the present disclosure also provides an operation control device on the control terminal side, including:
  • the map tile data acquisition module is set to acquire map tile data corresponding to the operation area
  • a map display module configured to generate an area map of the work area based on the map tile data for display
  • a work lot determination module configured to determine at least one work lot in the work area according to at least one area positioning point selected by the user for the area map;
  • the operation route generation and transmission module is configured to generate an operation route corresponding to the operation plot and send it to the operation drone, so that the operation drone performs flight operations according to the operation route.
  • An embodiment of the present disclosure also provides an operation drone side operation control device, including:
  • the operation route receiving module is set to receive the operation route sent by the control terminal;
  • the flight operation module is configured to perform flight operations in the at least one operation block according to the operation route.
  • An embodiment of the present disclosure also provides a control terminal.
  • the control terminal includes:
  • one or more processors;
  • a storage device configured to store one or more programs;
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the control-terminal-side job control method provided by any embodiment of the present disclosure.
  • An embodiment of the present disclosure also provides a computer storage medium on which a computer program is stored, and when the program is executed by a processor, a job control method on the control terminal side provided by any embodiment of the present disclosure is implemented.
  • An embodiment of the present disclosure also provides a working drone, the working drone includes:
  • one or more processors;
  • a storage device configured to store one or more programs;
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the operation-drone-side operation control method provided by any embodiment of the present disclosure.
  • An embodiment of the present disclosure also provides a computer storage medium on which a computer program is stored; when the program is executed by a processor, the operation-drone-side operation control method provided by any embodiment of the present disclosure is implemented.
  • FIG. 1 is a schematic diagram of a job control system provided by Embodiment 1 of the present disclosure;
  • FIG. 2 is a flowchart of a control-terminal-side job control method provided by Embodiment 2 of the present disclosure;
  • FIG. 3a is a flowchart of a method for controlling a job on a terminal side according to Embodiment 3 of the present disclosure
  • 3b is a schematic diagram of the location distribution of each shooting point in a combined shooting point set according to Embodiment 3 of the present disclosure
  • FIG. 4a is a flowchart of a method for controlling a job on a terminal side according to Embodiment 4 of the present disclosure
  • FIG. 4b is a schematic diagram of the distribution of each photographing location point provided by Embodiment 4 of the present disclosure.
  • FIG. 5 is a flowchart of a method for controlling operation on the side of an unmanned aerial vehicle according to Embodiment 5 of the present disclosure
  • FIG. 6 is a schematic diagram of a job control device for controlling a terminal side provided by Embodiment 6 of the present disclosure
  • FIG. 7 is a schematic diagram of an operation-drone-side operation control device provided by Embodiment 7 of the present disclosure;
  • FIG. 8 is a schematic structural diagram of a control terminal according to Embodiment 8 of the present disclosure.
  • FIG. 1 is a schematic diagram of a job control system provided by Embodiment 1 of the present disclosure. As shown in FIG. 1, the job control system includes a control terminal 10 and an operation drone 20, wherein:
  • the control terminal 10 is configured to acquire map tile data corresponding to the operation area, generate an area map of the operation area according to the map tile data for display, determine at least one operation plot in the operation area according to at least one area positioning point selected by the user on the area map, and generate an operation route corresponding to the operation plot to send to the operation drone 20; the operation drone 20 is configured to receive the operation route and perform flight operations in the at least one operation plot according to the operation route.
  • the control terminal 10 may be any device that controls the mapping UAV, such as a UAV remote control or a control device with a human-machine interactive interface.
  • the embodiments of the present disclosure do not limit the device type of the control terminal.
  • the operating drone 20 may be a drone set to operate on the surveying and mapping area according to operational requirements, such as detecting the conditions of crops, soil, vegetation, or water quality in the surveying and mapping area, or spraying pesticides in the surveying and mapping area.
  • the map tile data is related data set to generate a tile map, and is formed by slicing the map data.
  • the map tile data can be set to generate a tile map.
  • the pyramid model formed by the tile map is a multi-resolution hierarchical model: from the bottom layer to the top layer of the tile pyramid, the resolution becomes lower and lower, but the geographical range expressed remains constant.
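The pyramid just described is the standard quadtree tile scheme: each level up halves the resolution while the geographic extent stays fixed. The sketch below illustrates this; the base-resolution constant is a typical Web-Mercator level-0 value and, like the function names, is an assumption rather than something specified by the patent.

```python
def tiles_per_axis(level):
    """Number of tiles along each axis at a pyramid level (level 0 = 1 tile)."""
    return 2 ** level

def ground_resolution(level, base_resolution):
    """Meters per pixel at a level, given the level-0 (top) resolution."""
    return base_resolution / (2 ** level)

# Resolution doubles per level downward while the expressed range is constant.
res = [ground_resolution(l, 156543.03) for l in range(3)]
```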
  • the operation control system is composed of a control terminal 10 and an operation drone 20.
  • the control terminal 10 can obtain map tile data corresponding to the operation area; since the map tile data includes map data at multiple different resolutions, the control terminal 10 can, according to the resolution requirements of the operation drone, generate an area map corresponding to the operation area from the map tile data and display it.
  • the user can select at least one area anchor point for the area map.
  • the area positioning point may be set to determine at least one work plot within the work area. For example, a square work plot of 10m * 10m is generated with the area positioning point as the center.
  • After the control terminal 10 determines the work plot, it can automatically generate a work route corresponding to the work plot and send it to the work drone. For example, in a square work plot of 10m * 10m, with the vertex at the upper left corner as the starting point, the drone travels 1m in a clockwise direction every 5 seconds along the side length of the work plot. Different operation plots can generate different operation routes, which is not limited in the embodiments of the present disclosure.
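The 10m * 10m example above can be sketched as a small route generator: build the square plot around the area positioning point, then emit waypoints every 1 m clockwise from the upper-left vertex (the 5-second cadence would be handled by the flight controller). All function names and the plot geometry are illustrative assumptions, not the patented route generator.

```python
def square_plot(anchor, side=10.0):
    """Corners of a square plot centered on the anchor, clockwise from upper-left."""
    ax, ay = anchor
    h = side / 2
    return [(ax - h, ay + h), (ax + h, ay + h), (ax + h, ay - h), (ax - h, ay - h)]

def perimeter_route(corners, step=1.0):
    """Waypoints every `step` meters clockwise along the plot's perimeter."""
    route = []
    n = len(corners)
    for i in range(n):
        (x0, y0), (x1, y1) = corners[i], corners[(i + 1) % n]
        length = abs(x1 - x0) + abs(y1 - y0)   # edges are axis-aligned
        steps = int(length / step)
        for s in range(steps):
            t = s / steps
            route.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
    return route

# 10 m sides at 1 m steps give 40 waypoints around the square.
route = perimeter_route(square_plot((0.0, 0.0)))
```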
  • After the operation drone receives the operation route, it can perform flight operations in the determined operation plot according to the operation route.
  • In an embodiment, the system further includes a surveying and mapping drone;
  • the control terminal 10 is further configured to use the operation area as a surveying area, determine surveying parameters matching the surveying area, and send the surveying parameters to the surveying and mapping drone;
  • the surveying and mapping parameters include: a plurality of surveying and sampling points that the surveying and mapping UAV surveys and maps in the surveying and mapping area.
  • the surveying and mapping unmanned aerial vehicle is configured to receive the surveying and mapping parameters and perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to the plurality of surveying and sampling points, the surveying and mapping photos The set is set to generate map tile data of the surveying area.
  • the surveying and mapping unmanned aerial vehicle may be a drone set to survey and survey the surveying and mapping area to obtain data related to the surveying and mapping area, such as acquiring multiple surveying and mapping photos of the surveying and mapping area.
  • the surveying and mapping unmanned aerial vehicle is equipped with photographing equipment, and is set to obtain multiple surveying and mapping photos corresponding to the surveying and mapping area.
  • the control terminal 10 may also use the operation area as a surveying area, determine a plurality of surveying sampling points at which the surveying and mapping drone performs surveying in the surveying area, and send the surveying parameters formed by the surveying sampling points to the surveying and mapping drone.
  • the surveying and mapping UAV can receive the surveying and mapping parameters determined by the control terminal and perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to multiple surveying and sampling points included in the surveying and mapping parameters.
  • the surveying and mapping UAV can generate the map tile data of the surveying area using the surveying and photographing photo collection, and the control terminal 10 can acquire the map tile data corresponding to the work area from the surveying and mapping UAV.
  • In an embodiment, the system further includes a ground terminal configured to obtain the set of surveying photos, combine and/or stitch the multiple photos in the set to obtain a surveying map corresponding to the surveying area, and generate map tile data corresponding to the surveying area according to the surveying map;
  • the control terminal is configured to acquire the map tile data corresponding to the operation area from the ground terminal.
  • the ground terminal may be matched with the surveying and mapping drone, and is set as a device for processing data acquired by the surveying and mapping drone, such as a notebook computer or a tablet computer.
  • the embodiments of the present disclosure do not limit the device types of the control terminal and the ground terminal.
  • the surveying and mapping drone can send the acquired set of surveying photos to the ground terminal, so that the ground terminal combines and/or stitches the multiple photos in the set to obtain the corresponding surveying map.
  • the generation of map tile data may also be completed by the ground terminal, and the control terminal 10 may also obtain map tile data corresponding to the work area from the ground terminal.
  • An embodiment of the present disclosure forms a job control system from a control terminal and an operation drone, wherein the control terminal is configured to acquire map tile data corresponding to the operation area and generate an area map of the operation area for display, determine at least one operation plot within the operation area according to at least one area positioning point selected by the user, and generate an operation route corresponding to the operation plot to send to the operation drone, so that the operation drone performs flight operations in the at least one operation plot according to the operation route.
  • an operation control system and operation control method are proposed.
  • The control terminal automatically generates the corresponding operation route for the operation plot selected by the user, which solves the problems of high operating cost and low route-generation efficiency of existing UAVs, and improves the generation efficiency of UAV operation routes and the intelligence of UAV operation control.
  • FIG. 2 is a flowchart of a control-terminal-side job control method provided by Embodiment 2 of the present disclosure. This embodiment is applicable to the case where an operation route is automatically generated.
  • This method can be executed by a control-terminal-side job control device. It can be implemented by means of software and / or hardware, and can be generally integrated in a control device (for example, a drone remote control) to be used in conjunction with an operational drone responsible for flight operations.
  • the method includes the following operations:
  • Step 210: Acquire map tile data corresponding to the work area.
  • The control terminal may acquire the map tile data corresponding to the operation area from the surveying and mapping drone or from the ground terminal.
  • Step 220: Generate an area map of the work area according to the map tile data for display.
  • Since the map tile data includes map data at multiple different resolutions, after the control terminal obtains the map tile data corresponding to the operation area, it can generate and display the area map corresponding to the operation area from the map tile data according to the resolution requirements of the operation drone.
  • Step 230: Determine at least one work plot in the work area according to at least one area positioning point selected by the user for the area map.
  • Step 240: Generate an operation route corresponding to the operation plot and send it to the operation drone, so that the operation drone performs flight operations according to the operation route.
  • When manipulating the control terminal, the user may select at least one area positioning point on the area map.
  • the area positioning point may be set to determine at least one work plot within the work area. For example, a square work plot of 10m * 10m is generated with the area positioning point as the center.
  • After the control terminal determines the operation plot, it can automatically generate an operation route corresponding to the operation plot and send it to the operation drone. For example, in a square work plot of 10m * 10m, with the vertex at the upper left corner as the starting point, the drone travels 1m in a clockwise direction every 5 seconds along the side length of the work plot.
  • Different operation plots can generate different operation routes, which is not limited in the embodiments of the present disclosure.
  • After the operation drone receives the operation route, it can perform flight operations in the determined operation plot according to the operation route.
  • In an embodiment, the method may further include: using the operation area as a surveying area and determining surveying parameters matching the surveying area, wherein the surveying parameters include a plurality of surveying sampling points at which the surveying and mapping drone performs surveying in the surveying area; and sending the surveying parameters to the surveying and mapping drone, wherein the surveying parameters are set to instruct the surveying and mapping drone to perform flight shooting in the surveying area to obtain a set of surveying photos corresponding to the plurality of surveying sampling points, in order to generate map tile data of the surveying area.
  • the mapping area is an area with a clear latitude and longitude range, which can be an area of any shape and any size.
  • the embodiments of the present disclosure do not limit the shape and size of the mapping area.
  • UAV aerial surveying and mapping technology observes the current situation of the aerial photography area via remote video transmission from the equipped video capture equipment, and uses aerial image stitching technology to stitch the captured photos into an overall image of the aerial photography area.
  • When taking photos, the traditional UAV aerial survey method generally performs mobile surveying in the surveying area by parallel-line traversal, and to ensure successful stitching it is usually required that every two consecutive photos have a certain degree of overlap.
  • a photo is required to have a certain degree of overlap with other photos in the horizontal and vertical directions.
  • the degree of overlap is generally required to be greater than 50%.
  • The traditional UAV aerial survey method is designed to survey aerial photography areas covering large tracts of land, and the surveying process captures many photos with high overlap. Stitching the photos taken by the drone therefore takes a long time and is inefficient; if the photos are instead uploaded to a server for stitching, the data upload and processing take even longer. Meanwhile, when the traditional UAV aerial survey method is applied to small-plot surveying and mapping, the operation is complicated, the processing time is long, and the hardware cost is high.
  • The control terminal may also use the operation area as the surveying and mapping area, determine a plurality of surveying and mapping sampling points at which the surveying and mapping drone performs surveying and mapping in the surveying and mapping area, and send the surveying and mapping parameters formed by these sampling points to the surveying and mapping drone.
  • the surveying and mapping UAV can receive the surveying and mapping parameters determined by the control terminal and perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to multiple surveying and sampling points included in the surveying and mapping parameters.
  • the surveying and mapping UAV can generate the map tile data of the surveying area using the surveying and photographing photo collection, and the control terminal can obtain the map tile data corresponding to the operation area from the surveying and mapping UAV.
  • the surveying and mapping drone can also send the surveying and mapping photo collection to the ground terminal, so that the ground terminal combines and / or stitches the multiple photos in the collection to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • the surveying and mapping drone can also locally combine and / or stitch multiple photos in the surveying and photographing photo collection to obtain a surveying map corresponding to the surveying and mapping area.
  • In the embodiments of the present disclosure, the control terminal obtains map tile data corresponding to the work area to generate and display an area map, determines at least one work plot within the work area according to the at least one area positioning point selected by the user on the area map, generates an operation route corresponding to the work plot, and sends it to the operation drone so that the operation drone carries out flight operations in the at least one work plot according to the route. An operation control system and operation control method are thus proposed in which the control terminal automatically generates the corresponding operation route for the work plot selected by the user, solving the problems of high operation cost and low route-generation efficiency of existing drones, and improving the generation efficiency of operation routes and the intelligence of drone operation control.
  • FIG. 3a is a flowchart of a control-terminal-side operation control method provided by Embodiment 3 of the present disclosure. This embodiment is refined based on the above embodiment and gives one way for the control terminal to determine the surveying and mapping parameters matching the surveying and mapping area. Correspondingly, as shown in FIG. 3a, the method of this embodiment may include:
  • Step 310 Use the operation area as a surveying area to determine a surveying parameter matching the surveying area, wherein the surveying parameter includes: a plurality of surveying sampling points surveyed by the surveying drone in the surveying area.
  • step 310 may include the following operations:
  • Step 311 Acquire a reference photographing position point corresponding to the surveying and mapping area, and establish a mapping relationship between a photographing point in the combined photographing point set and the reference photographing position point.
  • the reference photographing location point is a location point in the surveying area, which has matching geographic location coordinates.
  • the above-mentioned location point can be selected by the user in the surveying area (for example, by clicking, or by directly inputting latitude and longitude, etc.), or can be automatically determined according to the area shape of the surveying area (for example, the center point or corner points of the surveying area, etc.).
  • the combined shooting point set may be a set of shooting points preset according to a preset distribution rule, and the set may include a plurality of shooting points, and there may be a relative direction and a relative distance relationship between each two shooting points.
  • the combined shooting point set includes 5 shooting points, located at the center and four vertices of the rectangle, respectively. Among them, the relative distance between each vertex and the center point is 100m.
  • the four vertices are located in the four directions of east, south, west, and north, respectively.
  • all the sampling points corresponding to the surveying area may be obtained according to the combined shooting point set.
  • one of the points in the surveying area may be first determined as a reference photographing position point, and then the reference photographing position point and one of the shooting points in the combined shooting point set may be mapped to each other.
  • The relative position relationship between the shooting points in the combined shooting point set is predetermined, but no correspondence with actual geographic location information has been established, so the set cannot be directly mapped into the actual surveying area. However, as long as one shooting point in the combined shooting point set is given actual geographic location information, the geographic location information of all the shooting points in the set can be determined.
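The propagation just described can be sketched in Python. This is an illustrative example only, not part of the disclosure; the centre-plus-four-points layout, the 100m offsets (based on the example above), and all names are assumptions:

```python
# Illustrative sketch: once any one shooting point in the combined shooting
# point set is anchored to actual coordinates, the coordinates of every
# other shooting point follow from the preset relative offsets.

OFFSETS = {                 # east/north offsets in metres from the centre
    "center": (0.0, 0.0),
    "east": (100.0, 0.0),
    "south": (0.0, -100.0),
    "west": (-100.0, 0.0),
    "north": (0.0, 100.0),
}

def resolve_shooting_points(anchor_name, anchor_xy):
    """Planar positions of all five shooting points, given one of them."""
    ax, ay = anchor_xy
    ox, oy = OFFSETS[anchor_name]
    cx, cy = ax - ox, ay - oy          # recover the centre shooting point
    return {name: (cx + dx, cy + dy) for name, (dx, dy) in OFFSETS.items()}

pts = resolve_shooting_points("center", (500.0, 500.0))
# pts["east"] == (600.0, 500.0)
```

Anchoring at a different point recovers the same layout, e.g. `resolve_shooting_points("east", (600.0, 500.0))["center"]` gives `(500.0, 500.0)`.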
  • multiple photos taken according to multiple shooting points in the combined shooting point set have overlapping areas.
  • The multiple photos can be combined and / or stitched to form a complete combined area.
  • the combined area may completely cover the surveying area, or may only cover a part of the surveying area, which is not limited in this embodiment.
  • FIG. 3b is a schematic diagram of the location distribution of each shooting point in a combined shooting point set provided in Embodiment 3 of the present disclosure.
  • the shooting points in the combined shooting point set include: a central shooting point and four surrounding shooting points, where the surrounding shooting points are the four vertices of a rectangle centered on the central shooting point; the photo synthesized from the photos taken at the shooting points in the combined shooting point set is rectangular.
  • the combined shooting point set may include five shooting points, which are a center shooting point and four surrounding shooting points, respectively.
  • the center shooting point may be the center of a rectangle, and correspondingly, the surrounding shooting points may be four vertices of the rectangle corresponding to the center shooting point.
  • The shooting points have a certain positional relationship with each other, and this positional relationship satisfies a condition: when the photos taken at the shooting position points determined by the shooting points are combined, they form a complete rectangular photo.
  • the combination process is to overlay each photo according to the overlapping image between each other.
  • each auxiliary photographing point may rotate around the reference photographing position point according to the user's operation, or move according to the user's sliding operation or the like.
  • The five shooting points in the selected combined shooting point set are one central shooting point and four surrounding shooting points. Each surrounding shooting point only needs to meet a set degree of overlap with the central shooting point (for example, 60% or 70%, etc.), and the surrounding shooting points need not meet such a high degree of overlap with each other. This greatly reduces the total number of surveying photos that must be taken for a surveying area of fixed size, and in turn greatly reduces the time and hardware cost of subsequent photo synthesis or stitching.
  • the solution of the embodiment of the present disclosure is applied to a small plot, for example, after combining or stitching multiple photos taken at each shooting point in a combined shooting point set, one plot can be completely covered.
  • the solution of the disclosed embodiment can be significantly superior to the related art method of parallel line traversal for selecting points in terms of the number of points for surveying and mapping and the difficulty of splicing later.
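A back-of-envelope comparison can illustrate the reduction in photo count. All numbers below (photo footprint, area size, overlap ratio) are assumptions for illustration, not values from the disclosure:

```python
import math

# Illustrative sketch: photos needed for a square area under parallel-line
# traversal (a set overlap between consecutive photos in both directions)
# versus one combined shooting point set of 5 photos.

def parallel_line_photo_count(area_side, footprint, overlap):
    """Photos for a square area_side x area_side region, square footprint."""
    step = footprint * (1.0 - overlap)       # advance between photos
    per_axis = math.ceil(max(area_side - footprint, 0) / step) + 1
    return per_axis * per_axis

# Assume each photo covers 100m x 100m and the plot is 180m x 180m,
# with a 60% overlap requirement between consecutive photos.
traditional = parallel_line_photo_count(180.0, 100.0, 0.60)
combined = 5   # one centre photo + four surrounding photos
# traditional == 9, i.e. nearly twice as many photos as the combined set.
```

Under these assumed numbers the combined shooting point set needs roughly half as many photos, which is consistent with the reduced stitching workload described above.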
  • obtaining the reference photographing location point corresponding to the surveying area may include: detecting a user's touch operation in the human-computer interaction interface, and determining a screen location point according to the touch operation; and obtaining, from the map data of the surveying area currently displayed in the human-computer interaction interface, a geographic location coordinate that matches the screen location point as the reference photographing location point.
  • the reference photographing position point may be determined according to the point specified by the user in the human-computer interaction interface.
  • the map data may be latitude and longitude information.
  • detecting a user's touch operation in the human-machine interaction interface and determining a screen position point according to the touch operation may include at least one of the following:
  • the user's touch point is determined as the screen position point
  • a point within the frame generated by the user's touch is selected as the screen position point.
  • determining a screen position point according to the user's touch operation in the human-computer interaction interface may have multiple implementation manners.
  • the touch point corresponding to the single touch operation of the user may be determined as the screen position point.
  • A point on the line segment generated by the user's stroke touch operation may also be used as the screen position point, for example, the midpoint of the line segment. It is also possible to use a point inside the frame generated by the user's frame touch operation, for example, the center point of the frame, as the screen position point.
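The three touch modes can be sketched as follows. This is an illustrative example only; the function names and the representation of strokes and frames are assumptions:

```python
# Illustrative sketch: deriving a screen position point from a single tap,
# a stroke (line segment), or a frame touch operation.

def point_from_tap(tap):
    """A single touch: the touch point itself is the screen position point."""
    return tap

def point_from_stroke(stroke):
    """A stroke touch: use the midpoint of the generated line segment,
    taken between the stroke's first and last sample points."""
    (x0, y0), (x1, y1) = stroke[0], stroke[-1]
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

def point_from_frame(frame):
    """A frame touch: use the centre point of the frame body,
    given two opposite corners of the frame."""
    (x0, y0), (x1, y1) = frame
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

# point_from_stroke([(0.0, 0.0), (4.0, 2.0)]) == (2.0, 1.0)
```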
  • acquiring the reference photographing location point corresponding to the surveying and mapping area may include: acquiring the center point of the surveying and mapping region as the reference photographing location point.
  • the reference photographing location point may also be automatically generated by the control terminal that controls the surveying and mapping drone.
  • For example, the center point of the surveying and mapping area is directly used as the reference photographing position point.
  • acquiring the reference photographing location point corresponding to the surveying area may further include: acquiring geographic location coordinates input by the user as the reference photographing location point.
  • the geographic location coordinates input by the user can also be directly used as the reference photographing location point.
  • the user can input the geographic location coordinates through a soft keyboard in the human-computer interaction interface, a numeric keyboard in the control terminal, or voice input.
  • obtaining the reference photographing location point corresponding to the surveying and mapping area may include: sending location query information to the surveying and mapping drone, and using the geographic location coordinates fed back by the surveying and mapping drone as The reference photographing position point; wherein, the surveying and mapping unmanned aerial vehicle is preset at a position matching the surveying and mapping area.
  • the reference photographing location point may also be determined through the location information specified by the user.
  • the user can send location query information to the surveying and mapping drone through the control terminal.
  • the user triggers a set identifier on the human-machine interaction interface of the control terminal to send location query information to the surveying and mapping drone to query the current position of the surveying and mapping drone.
  • the surveying and mapping unmanned aerial vehicle obtains the current geographic location coordinates through its own positioning device and feeds them back to the control terminal.
  • the control terminal may directly use the location point corresponding to the received geographic location coordinates as the reference photographing location point.
  • When the surveying and mapping UAV sends the geographic location coordinates to the control terminal, its ground projection point is located inside the surveying and mapping area.
  • Before sending the position query information to the surveying and mapping drone, the method may further include: receiving at least one flight control instruction for the surveying and mapping drone input by the user, and sending the flight control instruction to the surveying and mapping drone; and, when a position confirmation response input by the user is received, sending a hovering instruction to the surveying and mapping drone to control it to hover at the current position; wherein the flight control instruction is set to control the surveying and mapping drone to move in the air in a set direction and / or by a set distance.
  • the user can input at least one flight control instruction for the mapping drone to the control terminal.
  • the control terminal sends the flight control instruction input by the user to the surveying and mapping drone, so that the surveying and mapping drone travels according to the flight control instruction.
  • the control terminal may send a hovering instruction to the surveying and mapping drone to control it to hover at the current position.
  • establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point may include: establishing a mapping relationship between a shooting point selected by the user in the combined shooting point set and the reference photographing position point.
  • the user can arbitrarily select one of the shooting points in each shooting point in the combined shooting point set, and combine the shooting point in the combined shooting point set selected by the user with the reference photographing position point Establish a mapping relationship.
  • establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point may further include: establishing a mapping relationship between the central shooting point in the combined shooting point set and the reference photographing position point.
  • establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point may further include: calculating the distance between the reference photographing position point and each positioning key point of the surveying area, the positioning key points including the corner points of the surveying area and the center point of the surveying area; acquiring the positioning key point closest to the reference photographing position point as the target reference point; and, according to the position information of the target reference point in the surveying area, selecting a shooting point matching the position information in the combined shooting point set to establish a mapping relationship with the reference photographing position point.
  • the mapping relationship may also be determined according to the distance relationship between the reference photographing position point and each key point in the surveying area.
  • The corner points of the surveying area and the center point of the surveying area are used as positioning key points; the distance between the reference photographing position point and each positioning key point is calculated, and the positioning key point closest to the reference photographing position point is taken as the target reference point.
  • A shooting point matching the position information is then selected in the combined shooting point set to establish a mapping relationship with the reference photographing position point. For example, if the target reference point is located at the upper left of the surveying area, the shooting point in the upper left corner of the combined shooting point set can be selected to establish a mapping relationship with the reference photographing position point.
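The key-point matching just described can be sketched as follows. This is an illustrative example only; the rectangular surveying area, the coordinates, and the function names are assumptions:

```python
import math

# Illustrative sketch: the corners and centre of a rectangular surveying
# area serve as positioning key points, and the one closest to the
# reference photographing position point becomes the target reference
# point, which then selects the matching shooting point in the set.

def positioning_key_points(x0, y0, x1, y1):
    """Corners plus centre of an axis-aligned rectangular surveying area,
    with (x0, y0) the lower-left and (x1, y1) the upper-right corner."""
    return {
        "upper_left": (x0, y1), "upper_right": (x1, y1),
        "lower_left": (x0, y0), "lower_right": (x1, y0),
        "center": ((x0 + x1) / 2.0, (y0 + y1) / 2.0),
    }

def target_reference_point(ref, keys):
    """Return the name of the positioning key point closest to `ref`."""
    return min(keys, key=lambda k: math.dist(ref, keys[k]))

keys = positioning_key_points(0.0, 0.0, 100.0, 60.0)
name = target_reference_point((10.0, 55.0), keys)
# (10, 55) is nearest the upper-left corner (0, 60), so the shooting
# point in the upper left of the combined shooting point set is chosen.
```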
  • Step 312 Determine a plurality of auxiliary photographing location points corresponding to the reference photographing location point according to the preset relative position relationship between each photographing point in the combined photographing point set and the mapping relationship.
  • the auxiliary photographing location point may be other location points in the surveying area that are different from the reference photographing location point.
  • According to the preset relative position relationship between the shooting points in the combined shooting point set and the determined mapping relationship, the multiple other auxiliary photographing location points corresponding to the reference photographing location point can be further determined.
  • For example, the combined shooting point set includes a total of 5 shooting points, where the central shooting point in the set establishes a mapping relationship with the reference photographing position point; then, according to the position relationships between the other four shooting points and the central shooting point, the other four auxiliary photographing position points corresponding to the reference photographing position point are determined.
  • Step 313 Use the reference photographing location point and the plurality of auxiliary photographing location points as a plurality of surveying sampling points for the surveying and mapping drone to survey in the surveying area.
  • the reference photographing position point and the auxiliary photographing position point can be used as the surveying and mapping sampling points for the surveying and mapping of the drone in the surveying and mapping area.
  • The surveying and mapping UAV can perform aerial photography at each surveying and mapping sampling point and send the photos obtained to the corresponding control terminal or ground terminal, so that the control terminal can synthesize the photos to obtain the final surveying and mapping image.
  • the surveying and mapping drone can also synthesize the multiple photos locally.
  • The photographs obtained at the surveying and mapping sampling points planned by the method provided in the embodiments of the present disclosure do not require a set degree of overlap between every two consecutive photographs, so the processing time of image data can be greatly reduced.
  • Step 320 Send the mapping parameters to the mapping drone.
  • In the embodiments of the present disclosure, a shooting point in the combined shooting point set is mapped to the reference photographing position point, a plurality of auxiliary photographing position points corresponding to the reference photographing position point are determined according to the preset relative position relationship between the shooting points in the combined shooting point set and the mapping relationship, and the reference photographing position point and the plurality of auxiliary photographing position points are then used as the surveying and mapping sampling points of the drone in the surveying area. A new method of planning surveying and mapping sampling points is thus proposed.
  • The overall planning method of multiple surveying points based on the combined shooting point set replaces the existing parallel-line movement planning method, solving the problems of high cost and low surveying efficiency in existing UAV aerial survey methods and achieving the technical effect of reducing surveying and mapping costs and improving surveying and mapping efficiency.
  • FIG. 4a is a flowchart of a control-terminal-side operation control method provided by Embodiment 4 of the present disclosure. This embodiment is refined based on the foregoing embodiment and gives another way for the control terminal to determine the surveying and mapping parameters matching the surveying and mapping area. Accordingly, as shown in FIG. 4a, the method of this embodiment may include:
  • Step 410 Determine the mapping parameters matching the mapping area by using the operation area as the mapping area, wherein the mapping parameters include: a plurality of mapping sampling points mapped by the mapping drone in the mapping area.
  • step 410 may include the following operations:
  • Step 411 Determine one or more surveying and mapping combined shooting areas in the surveying and mapping area according to the combined shooting area and the mapping area information corresponding to the combined shooting point set.
  • the combined shooting area may be an area where the obtained photos are synthesized after taking pictures according to each shooting point in the set of combined shooting points. That is, the combined shooting area may be an overall photographing area that can be captured by the combined shooting point set.
  • the mapping area information may be related information of the mapping area, such as the area shape or size of the mapping area.
  • The combined shooting area for surveying and mapping may be a shooting area of the same size as the combined shooting area, and each combined shooting area for surveying and mapping corresponds to an actual shooting range within the parcel; that is, a combined shooting area for surveying and mapping carries two key pieces of information: the size of the area it covers and the geographic location of that area.
  • Before determining the surveying and mapping sampling points of the surveying and mapping drone, the combined shooting area corresponding to the combined shooting point set is first obtained, and then one or more surveying and mapping combined shooting areas are determined within the surveying and mapping area according to the combined shooting area and information such as the size of the surveying and mapping area. If there is only one surveying and mapping combined shooting area, that area can completely cover the surveying and mapping area; if there are multiple surveying and mapping combined shooting areas, those areas, after synthesis, can completely cover the surveying and mapping area. Exemplarily, assuming that the combined shooting area is a square of 10m * 10m and the surveying and mapping area is a rectangle of 10m * 20m, at least two surveying and mapping combined shooting areas are required to completely cover the surveying and mapping area.
  • multiple photos taken according to multiple shooting points in the combined shooting point set have overlapping areas between them, and / or
  • the combined shooting area for surveying and mapping is a shooting area formed by combining multiple photos and / or stitching after taking multiple photos according to multiple shooting points in the combined shooting point set;
  • The surveying and mapping combined shooting areas are combined and / or spliced to form a surveying and mapping map of the surveying and mapping area.
  • The combined shooting area for surveying and mapping is the same as the combined shooting area, except that the combined shooting area has not established a correspondence with the surveying and mapping area, whereas a combined shooting area for surveying and mapping is a separate shooting area formed by division within the surveying and mapping area.
  • the shape and size of the shooting area are the same as the combined shooting area.
  • The overlapping area between the surveying and mapping combined shooting areas can be set according to actual needs, for example, the overlapping area accounts for 30% or 50% of a surveying and mapping combined shooting area, which is not limited in this embodiment.
  • In order to enable the photos acquired by the surveying and mapping drone to be stitched into an image of the complete surveying and mapping area, optionally, there are overlapping areas between the multiple photos taken by the surveying and mapping drone at the multiple shooting points in the combined shooting point set.
  • multiple photos can be combined and / or stitched to form a complete combined area.
  • the combined area may completely cover the surveying area, or may only cover a part of the surveying area, which is not limited in this embodiment.
  • the overlapping areas between the multiple photos are not the overlapping areas between every two consecutive photos.
  • each photo acquired by the surveying and mapping UAV can be synthesized according to the overlapping parts to form a complete image.
  • Determining one or more surveying and mapping combined shooting areas in the surveying and mapping area according to the combined shooting area and the surveying and mapping area information corresponding to the combined shooting point set may include: selecting a positioning point in the surveying and mapping area; determining one surveying and mapping combined shooting area in the surveying and mapping area based on the positioning point and the combined shooting area; and, if the surveying and mapping combined shooting area cannot completely cover the surveying and mapping area, selecting a new positioning point in the surveying and mapping area and returning to the operation of determining a surveying and mapping combined shooting area based on the positioning point and the combined shooting area, until all the surveying and mapping combined shooting areas that completely cover the surveying and mapping area are determined.
  • the positioning point may be a position point in the surveying and mapping area, which is set to position the surveying and mapping combined shooting area in the surveying and mapping area.
  • the positioning point may be a position point selected in the surveying area according to actual needs, such as a corner point or a center point of the surveying area.
  • A surveying and mapping combined shooting area can first be determined in the surveying and mapping area through a positioning point. For example, if the surveying area is rectangular, the top-left vertex of the surveying area can be selected as the positioning point, and the top-left vertex of the combined shooting area is made to coincide with it; the combined shooting area then forms a corresponding surveying and mapping combined shooting area in the surveying area.
  • When determining a surveying and mapping combined shooting area in the surveying and mapping area based on the positioning point and the combined shooting area, it is necessary to ensure that the surveying and mapping combined shooting area covers the surveying and mapping area to the greatest extent.
  • If one surveying and mapping combined shooting area cannot completely cover the surveying and mapping area, a new positioning point is selected in the surveying and mapping area and the operation of determining a surveying and mapping combined shooting area based on the positioning point and the combined shooting area is repeated, until the determined combined shooting areas can completely cover the surveying and mapping area.
  • It should be noted that there is an overlapping area between the surveying and mapping combined shooting area determined by the new positioning point and the adjacent surveying and mapping combined shooting area.
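The coverage loop described in the preceding steps can be sketched as follows. This is an illustrative Python example, not part of the disclosure; the row-by-row placement order, the zero default overlap, and all names and sizes are assumptions:

```python
import math

# Illustrative sketch: cover a rectangular surveying area with equal-size
# combined shooting areas, placing tiles row by row from the top-left
# positioning point until the whole area is covered.

def _tiles_along(length, tile, step):
    """Number of tiles needed to cover `length` with tiles of size `tile`,
    advancing by `step` between consecutive tiles."""
    if length <= tile:
        return 1
    return math.ceil((length - tile) / step) + 1

def tile_mapping_area(area_w, area_h, tile=10.0, overlap=0.0):
    """Anchor (corner) points of the surveying-and-mapping combined
    shooting areas needed to cover an area_w x area_h region."""
    step = tile - overlap
    nx = _tiles_along(area_w, tile, step)
    ny = _tiles_along(area_h, tile, step)
    anchors = []
    for j in range(ny):
        for i in range(nx):
            # Clamp the last row/column so every tile stays inside the area.
            x = min(i * step, max(area_w - tile, 0.0))
            y = min(j * step, max(area_h - tile, 0.0))
            anchors.append((x, y))
    return anchors

anchors = tile_mapping_area(10.0, 20.0)
# Two 10m * 10m combined shooting areas cover the 10m * 20m area,
# matching the example in the text: anchors == [(0.0, 0.0), (0.0, 10.0)]
```

Passing a nonzero `overlap` reproduces the overlapping-neighbour behaviour noted above, at the cost of additional tiles.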
  • Before determining one or more surveying and mapping combined shooting areas in the surveying and mapping area according to the combined shooting area and the surveying and mapping area information corresponding to the combined shooting point set, the method may further include: detecting a user's touch operation in the human-computer interaction interface and obtaining a screen selection area matching the touch operation; and, in the map data currently displayed in the human-computer interaction interface, obtaining a geographic location area matching the screen selection area as the surveying and mapping area information.
  • The screen selection area may be an area formed by the user's touch operation in the human-computer interaction interface of the control terminal of the surveying and mapping drone. It may be an area of any shape and size (not exceeding the size of the screen); the embodiments of the present disclosure do not limit the shape and size of the screen selection area.
  • the mapping area may be designated and generated by the user who controls the mapping drone in real time. For example, by detecting the user's touch operation in the human-computer interaction interface to obtain a screen selection area matching the touch operation, and determining the matching geographic location area for the screen selection area according to the map data currently displayed in the human-machine interaction interface, to The determined geographical area is used as the mapping area information.
  • detecting a user's touch operation in the human-computer interaction interface and acquiring a screen selection area matching the touch operation may include:
  • the closed area enclosed by the connection line of at least three touch points of the user is determined as the screen selection area; and / or
  • the frame generated by the user's touch is used as the screen selection area.
  • the closed area formed by the detected single touch operation of the user may be used as the screen selection area matching the touch operation.
  • the closed area surrounded by the connection line of at least three touch points of the user is determined as the screen selection area.
  • the frame generated by the detected frame touch operation of the user may also be used as the screen selection area.
  • Step 412 Determine multiple photographing location points in the surveying and mapping combined shooting area according to a preset relative position relationship between each shooting point in the combined shooting point set.
  • the photographing location point may be a location point in the surveying area, with matching geographic location coordinates.
  • the photographing position point may be determined according to a preset relative position relationship between each shooting point in the combined shooting point set.
  • Determining a plurality of photographing position points in the surveying and mapping combined shooting area may include: mapping the central shooting point in the combined shooting point set to the area midpoint of the surveying and mapping combined shooting area, and using the area midpoint as one photographing location point; and, according to the relative positional relationship between each surrounding shooting point in the combined shooting point set and the central shooting point, mapping each surrounding shooting point into the surveying and mapping combined shooting area, and using the formed multiple mapping points as the photographing position points.
  • Each shooting point in the combined shooting point set corresponding to the combined shooting area can be mapped into the surveying and mapping combined shooting area and used as a photographing location point.
  • the central shooting point in the combined shooting point set may be first mapped to the midpoint of the area of the combined surveying and mapping shooting area, so that the midpoint of the area of the combined shooting area of surveying and mapping is used as a photographing location point.
  • Each surrounding shooting point may then be mapped into the surveying and mapping combined shooting area according to the relative positional relationship between each surrounding shooting point in the combined shooting point set and the central shooting point, and the formed multiple mapping points are used as the photographing location points.
  • The two center points 40 and 50 are the area midpoints of two surveying and mapping combined shooting areas: the midpoint 40 and its surrounding photographing position points 410 form one surveying and mapping combined shooting area, and the midpoint 50 and its four surrounding photographing location points 510 form another.
  • the relative positional relationship between the midpoint of the two surveying and mapping combined shooting areas and the surrounding photographing position points is the same as the preset relative positional relationship between each surrounding shooting point in the combined shooting point set and the central shooting point.
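The mapping described above amounts to a simple translation: the center shooting point goes to the area midpoint, and each surrounding shooting point is offset from it by the preset relative positions. The following Python sketch illustrates the idea; the function and coordinate names are illustrative assumptions, not part of the disclosure:

```python
def map_points_to_area(area_midpoint, surrounding_offsets):
    """Map the combined shooting point set into one surveying and mapping
    combined shooting area: the center shooting point maps to the area
    midpoint, and each surrounding shooting point is translated by its
    preset offset relative to the center shooting point."""
    cx, cy = area_midpoint
    photographing_points = [(cx, cy)]  # center shooting point -> area midpoint
    photographing_points += [(cx + dx, cy + dy) for dx, dy in surrounding_offsets]
    return photographing_points

# Example: an area midpoint at (100, 200) with the four preset offsets
# (-10, 10), (-10, -10), (10, 10), (10, -10), all in meters.
pts = map_points_to_area((100.0, 200.0),
                         [(-10, 10), (-10, -10), (10, 10), (10, -10)])
```

Repeating this for every surveying and mapping combined shooting area yields the full set of photographing position points, each combined area preserving the same preset relative positional relationship.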
  • Step 413 Use the plurality of photographing location points as a plurality of sampling points for surveying and mapping in the surveying and mapping area by the drone.
  • the photographing location point can be used as a surveying and mapping sampling point for surveying and mapping by the drone in the surveying and mapping area.
  • The surveying and mapping UAV can perform aerial photography at each surveying and mapping sampling point and send the photos obtained by aerial photography to the corresponding control terminal or ground terminal, so that the control terminal can synthesize the obtained photos to obtain the final surveying and mapping image.
  • Alternatively, the mapping drone can synthesize the multiple photos locally on the machine itself.
  • Step 420 Send the mapping parameters to the mapping drone.
  • Before sending the surveying and mapping parameters to the surveying and mapping drone, the method may further include: acquiring the shooting parameters of the photographing device carried by the surveying and mapping drone, where the shooting parameters include a single photo shooting area of the surveying and mapping drone at a set flight altitude, each shooting point corresponding to one single photo shooting area; and determining the preset relative position relationship between each shooting point in the combined shooting point set according to the preset photo overlap index and the single photo shooting area. The mapping parameters further include the flying height, which is set to instruct the mapping UAV to perform flight photography in the surveying and mapping area at the flying height.
  • After acquiring the shooting parameters of the camera device carried by the surveying and mapping drone, the method may further include: calculating the set flight height according to the pixel width of the camera device, the lens focal length of the camera device, and the ground pixel resolution.
  • the single photo shooting area is the actual surveying area that can be captured by a single photo.
  • The preset photo overlap index may be an overlap index set according to actual needs, such as 50%, 60%, or 70%. The embodiment of the present disclosure does not limit the value of the preset photo overlap index, but the preset photo overlap index should be such that, when the photos are synthesized according to their overlapping parts, a complete rectangle can be formed.
  • Before synthesizing the photos obtained by the surveying and mapping drone into the final surveying and mapping image, it is necessary to determine the single photo shooting area of the surveying and mapping drone at the set flight altitude. Each shooting point corresponds to a single photo shooting area; for example, the shooting point may be the midpoint or one of the vertices of the single photo shooting area.
  • the preset relative position relationship between each shooting point in the combined shooting point set can be determined according to the preset photo overlap index and the single photo shooting area.
  • the surveying and mapping parameters in the embodiments of the present disclosure may further include a flying height, which is set to instruct the surveying and mapping unmanned aerial vehicle to perform flight shooting in the surveying and mapping area at the flying height.
  • the set flying height of the mapping UAV can be calculated according to the pixel width of the camera device, the lens focal length of the camera device, and the resolution of the ground pixel.
  • ground pixel resolution = flight height * pixel width / lens focal length
  • flight height = ground pixel resolution * lens focal length / pixel width
  • pixel width = sensor size width of the camera device / frame width
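As a worked illustration of these formulas, the short sketch below computes the set flight height; the camera values used are made-up examples, not parameters from the disclosure:

```python
def pixel_width(sensor_width_m, frame_width_px):
    # pixel width = sensor size width of the camera device / frame width
    return sensor_width_m / frame_width_px

def flight_height(ground_pixel_resolution_m, lens_focal_length_m, pixel_width_m):
    # flight height = ground pixel resolution * lens focal length / pixel width
    return ground_pixel_resolution_m * lens_focal_length_m / pixel_width_m

# Example: a 23 mm wide sensor with 4600 pixels across gives a 5 um pixel;
# for a 0.05 m ground pixel resolution and a 25 mm lens, the set flight
# height works out to 250 m.
pw = pixel_width(0.023, 4600)        # 5e-06 m
h = flight_height(0.05, 0.025, pw)   # 250.0 m
```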
  • Obtaining the shooting parameters of the camera device carried by the surveying and mapping drone may include: calculating the single photo shooting area of the surveying and mapping UAV at the set flight altitude according to the pixel width of the camera device, the frame size of the camera device, and the ground pixel resolution.
  • the single photo shooting area of the mapping drone at the set flight height may be calculated according to the pixel width of the camera device, the frame size of the camera device, and the resolution of the ground pixel.
  • single photo shooting area = ground pixel resolution * frame size
  • ground pixel resolution = flight height * pixel width / lens focal length
  • single photo shooting length = ground pixel resolution * frame length
  • single photo shooting width = ground pixel resolution * frame width. For example, if the frame size is 3456 * 4608 and the ground pixel resolution is 0.05 m, the single photo shooting area is 172.8 m * 230.4 m.
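The example above can be checked directly; this small sketch (function name is illustrative) applies the length and width formulas:

```python
def single_photo_shooting_area(ground_pixel_resolution_m, frame_length_px, frame_width_px):
    # single photo shooting length = ground pixel resolution * frame length
    # single photo shooting width  = ground pixel resolution * frame width
    return (ground_pixel_resolution_m * frame_length_px,
            ground_pixel_resolution_m * frame_width_px)

# The worked example from the text: a 3456 * 4608 frame at 0.05 m ground
# pixel resolution covers 172.8 m * 230.4 m on the ground.
length_m, width_m = single_photo_shooting_area(0.05, 3456, 4608)
```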
  • Determining the preset relative positional relationship between each shooting point in the combined shooting point set according to the preset photo overlap index and the single photo shooting area may include: determining the size of a single photo according to the frame size of the camera device and the pixel width of the camera device; constructing a two-dimensional coordinate system and selecting a target point in the two-dimensional coordinate system as the center shooting point; generating a center photo in the two-dimensional coordinate system according to the center shooting point and the size of the single photo; generating, at the upper left corner, the lower left corner, the upper right corner, and the lower right corner of the center photo, four surrounding photos that meet the preset photo overlap index with the center photo; determining, according to the mapping relationship between the size of the single photo and the single photo shooting area, the coordinate values of the surrounding shooting points corresponding to each of the surrounding photos in the two-dimensional coordinate system; and determining the preset relative position relationship between each shooting point in the combined shooting point set according to the coordinate values of the central shooting point and each of the surrounding shooting points in the two-dimensional coordinate system.
  • the target point may be any point in the two-dimensional coordinate system.
  • the target point may be the origin of the two-dimensional coordinate system.
  • The center photo and its four surrounding photos are not real photos, but rectangular areas with the same size and shape as a single photo.
  • The coordinate values of the surrounding shooting points corresponding to each surrounding photo in the two-dimensional coordinate system can be determined according to the mapping relationship between the size of the single photo and the single photo shooting area.
  • For example, if the single photo size is 10 cm * 10 cm, the photo overlap index is 50%, the surrounding photos corresponding to the upper left corner, the lower left corner, the upper right corner, and the lower right corner are mapped to the single photo shooting areas of those corners, and the mapping relationship between the size of the single photo and the single photo shooting area is 1:200, then the single photo shooting area is correspondingly 20 m * 20 m. If the midpoint of each surrounding photo is taken as the corresponding surrounding shooting point and the center shooting point is placed at the coordinate origin, the coordinate values of the surrounding shooting points are (-10, 10), (-10, -10), (10, 10), and (10, -10), in meters.
  • The preset relative positional relationship between each shooting point in the combined shooting point set can then be determined according to the coordinate values of the central shooting point and each surrounding shooting point in the two-dimensional coordinate system.
  • In this example, the relative distance between the surrounding shooting points located at adjacent vertices of the combined shooting point set is 20 m, and the relative distance between the center shooting point at the center and each surrounding shooting point is 10√2 m (approximately 14.14 m).
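The five-point layout in this example can be constructed directly from the single photo shooting area and the overlap index; the sketch below (names are illustrative assumptions) places the center shooting point at the origin and derives the four surrounding shooting points:

```python
import math

def combined_shooting_points(photo_len_m, photo_wid_m, overlap_index=0.5):
    """Place the center shooting point at the origin of a two-dimensional
    coordinate system; each surrounding photo is offset so that it overlaps
    the center photo by the given overlap index along each axis."""
    dx = photo_len_m * (1 - overlap_index)
    dy = photo_wid_m * (1 - overlap_index)
    center = (0.0, 0.0)
    surrounding = [(-dx, dy), (-dx, -dy), (dx, dy), (dx, -dy)]
    return center, surrounding

# 20 m * 20 m single photo shooting area with a 50% overlap index:
center, surrounding = combined_shooting_points(20.0, 20.0, 0.5)
# surrounding -> (-10, 10), (-10, -10), (10, 10), (10, -10); adjacent vertex
# points are 20 m apart, and each is 10*sqrt(2) ~ 14.14 m from the center.
```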
  • The embodiments of the present disclosure propose a new method of planning surveying and mapping sampling points, which replaces the existing parallel-line movement planning mode with an overall planning mode of multiple surveying points based on a combined shooting point set, thereby solving the problems of high cost and low surveying efficiency in existing UAV aerial survey methods and achieving the technical effect of reducing surveying and mapping cost and improving surveying and mapping efficiency.
  • FIG. 5 is a flow chart of a method for controlling operation on the side of an unmanned aerial vehicle according to Embodiment 5 of the present disclosure.
  • This embodiment can be applied to a situation where an operation drone performs flight operations according to an operation route. The method is executed by a drone-side operation control device, which can be implemented by software and/or hardware, can generally be integrated in a drone device, and is used in conjunction with a control terminal responsible for controlling the drone.
  • the method includes the following operations:
  • Step 510 Receive the working route sent by the control terminal.
  • the operation drone can receive the operation route sent by the control terminal.
  • The control terminal obtains the map tile data corresponding to the work area, generates an area map of the work area based on the map tile data for display, determines at least one work plot within the work area according to at least one area anchor point selected by the user for the area map, and generates a working route corresponding to the work plot.
  • Step 520 Perform flight operations in the at least one operation block according to the operation route.
  • the operation drone after receiving the operation route sent by the control terminal, the operation drone can perform flight operations in at least one matching operation plot according to the operation route.
  • In the embodiments of the present disclosure, the control terminal acquires map tile data corresponding to the work area to generate an area map of the work area for display, determines at least one work plot within the work area according to at least one area positioning point selected by the user for the area map, and generates an operation route corresponding to the work plot and sends it to the operation drone, so that the operation drone performs flight operations in the at least one work plot according to the operation route. This provides an operation control system and operation control method in which the control terminal automatically generates the corresponding operation route for the work plot selected by the user, solves the problems of high operation cost and low route generation efficiency of existing drones, and improves the generation efficiency of drone operation routes and the intelligence of drone operation control.
  • FIG. 6 is a schematic diagram of a device for controlling a job on a terminal side provided by Embodiment 6 of the present disclosure.
  • The device includes: a map tile data acquisition module 610, a map display module 620, a work lot determination module 630, and an operation route generation and sending module 640, in which:
  • the map tile data acquisition module 610 is set to acquire map tile data corresponding to the operation area
  • the map display module 620 is configured to generate an area map of the work area based on the map tile data for display;
  • the work lot determination module 630 is configured to determine at least one work lot in the work area according to at least one area positioning point selected by the user for the area map;
  • the working route generation and sending module 640 is configured to generate a working route corresponding to the working lot and send it to the working drone, so that the working drone performs flight operations according to the working route.
  • By using the control terminal to obtain map tile data corresponding to the work area to generate an area map of the work area for display, determining at least one work plot within the work area according to the at least one area anchor point selected by the user for the area map, and generating an operation route corresponding to the work plot and sending it to the operation drone so that the operation drone carries out flight operations in the at least one work plot according to the operation route, the embodiments provide an operation control system and operation control method in which the control terminal automatically generates the corresponding operation route for the work plot selected by the user, solving the problems of high operation cost and low route generation efficiency of existing drones and improving the generation efficiency of drone operation routes and the intelligence of drone operation control.
  • The device further includes: a surveying and mapping parameter determination module configured to use the work area as a surveying and mapping area and determine the surveying and mapping parameters matching the surveying and mapping area, wherein the surveying and mapping parameters include a plurality of surveying and mapping sampling points for surveying and mapping in the surveying and mapping area; and a surveying and mapping parameter sending module configured to send the surveying and mapping parameters to the surveying and mapping drone, wherein the surveying and mapping parameters are set to instruct the surveying and mapping drone to perform flight shooting in the surveying area to obtain a set of surveying photos corresponding to the plurality of surveying sampling points, so as to generate map tile data of the surveying area.
  • the surveying and mapping parameter determination module includes: a photographing position point acquiring unit, which is set to acquire a reference photographing position point corresponding to the surveying and mapping area, and combine a photographing point in the combined photographing point set with the reference photographing position point Establish a mapping relationship; the auxiliary photographing position point determination unit is set to determine the number corresponding to the reference photographing position point according to the preset relative position relationship between each shooting point in the combined shooting point set and the mapping relationship Auxiliary photographing position points; a first mapping sampling point determination unit, set to use the reference photographing position point and the plurality of auxiliary photographing position points as the multiple surveying and mapping in the surveying and mapping UAV surveying and mapping in the surveying and mapping area Sampling point.
  • the surveying and mapping parameter determination module includes: a surveying and mapping combined shooting area determination unit configured to determine one or more surveying and mapping combined shots in the surveying and mapping area based on the combined shooting area and the mapping area information corresponding to the combined shooting point set Area; photographing position point determination unit, set to determine a plurality of photographing position points in the surveying and mapping combined shooting area according to a preset relative position relationship between each shooting point in the combined shooting point set; second surveying and sampling The point determining unit is configured to use the plurality of photographing position points as a plurality of surveying and sampling points for surveying and mapping in the surveying and mapping area by the mapping drone.
  • Multiple photos taken according to the multiple shooting points in the combined shooting point set have overlapping areas between them, and/or the surveying and mapping combined shooting area is the shooting area formed after the multiple photos taken according to the multiple shooting points in the combined shooting point set are combined and/or stitched.
  • The shooting points in the combined shooting point set include a center shooting point and four surrounding shooting points, the surrounding shooting points being the four vertices of a rectangle centered on the center shooting point; the composite photo obtained by shooting according to each shooting point in the combined shooting point set is rectangular in shape.
  • The photographing position point acquisition unit is set to detect the user's touch operation in the human-machine interaction interface and determine a screen position point according to the touch operation, and to obtain, from the map data of the surveying and mapping area currently displayed in the human-machine interaction interface, a geographic location coordinate matching the screen position point as the reference photographing position point.
  • The photographing position point acquiring unit is configured to: if it is detected that the user's touch operation is a single-point touch operation, determine the user's touch point as the screen position point; and/or, if it is detected that the user's touch operation is a frame touch operation, select a point within the frame generated by the user's touch as the screen position point.
  • the photographing position point acquiring unit is set to acquire the center point of the surveying and mapping area as the reference photographing position point.
  • the photographing location point acquisition unit is configured to send location query information to the surveying and mapping drone, and use the geographic location coordinates fed back by the surveying and mapping drone as the reference photographing location point; wherein, the surveying and mapping The unmanned aerial vehicle is preset at a position matching the surveying area.
  • the device further includes: a flight control instruction sending module configured to receive at least one flight control instruction for the surveying and mapping drone input by a user, and send the flight control instruction to the surveying and mapping Man-machine; hovering instruction sending module, set to send a hovering instruction to the surveying and mapping drone when it is confirmed that the position confirmation response input by the user is received to control the surveying and mapping drone to hover at the current position ; Wherein the flight control command is set to control the mapping drone to move in the air in a set direction and / or a set distance.
  • the photographing location point acquiring unit is set to acquire the geographic location coordinates input by the user as the reference photographing location point.
  • the photographing location point acquisition unit is configured to establish a mapping relationship between a shooting point selected by the user in the combined shooting point set and the reference photographing location point.
  • the photographing location point acquiring unit is configured to establish a mapping relationship between the central photographing point in the combined photographing point set and the reference photographing location point.
  • The photographing location point acquisition unit is set to calculate the distance between the reference photographing location point and each positioning key point of the surveying and mapping area, the positioning key points including the corner points of the surveying and mapping area and the center point of the surveying and mapping area; obtain the positioning key point closest to the reference photographing position point as the target reference point; and, according to the position information of the target reference point in the surveying and mapping area, select a shooting point matching the position information in the combined shooting point set to establish a mapping relationship with the reference photographing position point.
  • The surveying and mapping combined shooting area determination unit is set to: select a positioning point within the surveying and mapping area; determine one surveying and mapping combined shooting area within the surveying and mapping area according to the positioning point and the combined shooting area; and, if the determined surveying and mapping combined shooting areas cannot completely cover the surveying and mapping area, select a new positioning point in the surveying and mapping area and return to the operation of determining one surveying and mapping combined shooting area within the surveying and mapping area according to the positioning point and the combined shooting area, until it is determined that all of the surveying and mapping combined shooting areas together completely cover the surveying and mapping area.
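One simple way to realize this select-and-repeat procedure, assuming an axis-aligned rectangular surveying area (a simplification of the disclosure's general case; the function name and grid layout are illustrative), is to lay the combined shooting areas on a grid of positioning points until the whole area is covered:

```python
import math

def plan_combined_area_midpoints(survey_w_m, survey_h_m, area_w_m, area_h_m):
    """Return the midpoints of surveying and mapping combined shooting areas
    laid on a grid so that they completely cover a rectangular survey area."""
    nx = math.ceil(survey_w_m / area_w_m)   # columns needed to span the width
    ny = math.ceil(survey_h_m / area_h_m)   # rows needed to span the height
    return [((i + 0.5) * area_w_m, (j + 0.5) * area_h_m)
            for j in range(ny) for i in range(nx)]

# A 100 m * 100 m survey area covered by 40 m * 40 m combined shooting
# areas needs a 3 * 3 grid of positioning points.
midpoints = plan_combined_area_midpoints(100.0, 100.0, 40.0, 40.0)
```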
  • The photographing position point determination unit is set to map the central shooting point in the combined shooting point set to the area midpoint of the surveying and mapping combined shooting area and use the area midpoint as one photographing position point, and, according to the relative positional relationship between each surrounding shooting point in the combined shooting point set and the center shooting point, map each surrounding shooting point into the surveying and mapping combined shooting area and use the formed multiple mapping points as the photographing location points.
  • The device further includes: a screen selection area acquisition module, configured to detect a user's touch operation in the human-computer interaction interface and obtain a screen selection area matching the touch operation; and a survey area information acquisition module, configured to obtain, in the map data currently displayed in the human-computer interaction interface, the geographic location area matching the screen selection area as the mapping area information.
  • The screen selection area acquisition module is configured to: if it is detected that the user's touch operation is a single-point touch operation, determine the closed area surrounded by the connection line of at least three touch points of the user as the screen selection area; and/or, if it is detected that the user's touch operation is a frame touch operation, use the frame generated by the user's touch as the screen selection area.
  • The device further includes: a shooting parameter acquisition module configured to acquire shooting parameters of the photographing device carried by the surveying and mapping drone, the shooting parameters including a single photo shooting area of the surveying and mapping drone at a set flight altitude, each shooting point corresponding to one single photo shooting area; and a relative position relationship determination module, set to determine the preset relative position relationship between each shooting point in the combined shooting point set according to the preset photo overlap index and the single photo shooting area. The mapping parameters further include the flying height, which is set to instruct the mapping UAV to perform flight shooting in the mapping area at the flying height.
  • The relative position relationship determination module is set to: determine the size of a single photo according to the frame size of the camera device and the pixel width of the camera device; construct a two-dimensional coordinate system and select a target point in the two-dimensional coordinate system as the center shooting point; generate a center photo in the two-dimensional coordinate system according to the center shooting point and the size of the single photo; generate, at the upper left corner, lower left corner, upper right corner, and lower right corner of the center photo, four surrounding photos that meet the preset photo overlap index with the center photo; determine, according to the mapping relationship between the size of the single photo and the single photo shooting area, the coordinate values of the surrounding shooting points corresponding to each surrounding photo in the two-dimensional coordinate system; and determine the preset relative position relationship between each shooting point in the combined shooting point set according to the coordinate values of the central shooting point and each surrounding shooting point in the two-dimensional coordinate system.
  • the apparatus further includes: a flying height calculation module, configured to calculate the set flying height based on the pixel width of the camera device, the lens focal length of the camera device, and the ground pixel resolution.
  • The shooting parameter acquisition module is set to calculate the single photo shooting area of the surveying and mapping drone at the set flight height based on the pixel width of the camera device, the frame size of the camera device, and the ground pixel resolution.
  • The above control terminal-side job control device can execute the control terminal-side job control method provided by any embodiment of the present disclosure, and has functional modules and beneficial effects corresponding to the executed method. For technical details not described in detail in this embodiment, refer to the control terminal-side job control method provided in any embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram of an operation drone-side operation control device provided in Embodiment 7 of the present disclosure. As shown in FIG. 7, the device includes: an operation route receiving module 710 and a flight operation module 720, in which:
  • the operating route receiving module 710 is set to receive the operating route sent by the control terminal;
  • the flight operation module 720 is configured to perform flight operations in the at least one operation block according to the operation route.
  • In the embodiments of the present disclosure, the control terminal acquires map tile data corresponding to the work area to generate an area map of the work area for display, determines at least one work plot within the work area according to at least one area positioning point selected by the user for the area map, and generates an operation route corresponding to the work plot and sends it to the operation drone, so that the operation drone performs flight operations in the at least one work plot according to the operation route. The control terminal thus automatically generates the corresponding operation route for the work plot selected by the user, solving the problems of high operation cost and low route generation efficiency of existing drones and improving the generation efficiency of drone operation routes and the intelligence of drone operation control.
  • The above operation drone-side operation control device can execute the operation drone-side operation control method provided by any embodiment of the present disclosure, and has functional modules and beneficial effects corresponding to the executed method. For technical details not described in detail in this embodiment, refer to the operation drone-side operation control method provided by any embodiment of the present disclosure.
  • FIG. 8 is a schematic structural diagram of a control terminal according to Embodiment 8 of the present disclosure.
  • FIG. 8 shows a block diagram of a control terminal 812 suitable for implementing embodiments of the present disclosure.
  • The control terminal 812 shown in FIG. 8 is merely an example and imposes no limitation on the functions or scope of use of the embodiments of the present disclosure.
  • The control terminal 812 is represented in the form of a general-purpose computing device.
  • the components of the control terminal 812 may include, but are not limited to, one or more processors 816, a storage device 828, and a bus 818 connecting different system components (including the storage device 828 and the processor 816).
  • The bus 818 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
  • By way of example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
  • the control terminal 812 typically includes various computer system readable media. These media may be any available media that can be accessed by the control terminal 812, including volatile and non-volatile media, removable and non-removable media.
  • The storage device 828 may include computer-system-readable media in the form of volatile memory, such as random access memory (RAM) 830 and/or cache memory 832.
  • the control terminal 812 may further include other removable / non-removable, volatile / nonvolatile computer system storage media.
  • The storage system 834 may be configured to read and write a non-removable, non-volatile magnetic medium (not shown in FIG. 8 and commonly referred to as a "hard disk drive").
  • Although not shown in FIG. 8, a disk drive configured to read and write a removable non-volatile magnetic disk (such as a "floppy disk"), and an optical disc drive configured to read and write a removable non-volatile optical disc (such as a Compact Disc Read-Only Memory (CD-ROM), a Digital Video Disc Read-Only Memory (DVD-ROM), or other optical media), may also be provided.
  • each drive may be connected to the bus 818 through one or more data media interfaces.
  • The storage device 828 may include at least one program product having a set of (e.g., at least one) program modules configured to perform the functions of the embodiments of the present disclosure.
  • A program 836 having a set of (at least one) program modules 826 may be stored, for example, in the storage device 828.
  • Such program modules 826 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination of them, may include an implementation of a network environment.
  • The program modules 826 generally perform the functions and/or methods in the embodiments described in the present disclosure.
  • The control terminal 812 may also communicate with one or more external devices 814 (e.g., a keyboard, pointing device, camera, or display 824), with one or more devices that enable a user to interact with the control terminal 812, and/or with any device (such as a network card or modem) that enables the control terminal 812 to communicate with one or more other computing devices. Such communication may take place through an input/output (I/O) interface 822.
  • The control terminal 812 can also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through the network adapter 820.
  • The network adapter 820 communicates with the other modules of the control terminal 812 through the bus 818. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the control terminal 812, including but not limited to microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Arrays of Independent Disks (RAID) systems, tape drives, and data backup storage systems.
  • The processor 816 performs functional applications and data processing by running programs stored in the storage device 828, for example to implement the control-terminal-side job control method provided by the above-described embodiments of the present disclosure.
  • When the processing unit executes the program, it realizes: acquiring map tile data corresponding to the work area; generating an area map of the work area based on the map tile data for display; determining at least one work plot within the work area according to at least one area anchor point selected by the user on the area map; and generating a work route corresponding to the work plot and sending it to the working drone, so that the working drone performs flight operations according to the route.
  • The ninth embodiment provides a working drone configured to perform the working-drone-side job control method provided in any embodiment of the present disclosure.
  • The working drone includes: one or more processors; and a storage device configured to store one or more programs. When the one or more programs are executed by the one or more processors, the one or more processors implement the working-drone-side job control method provided in any embodiment of the present disclosure: receiving the work route sent by the control terminal, and performing flight operations in the at least one work plot according to the work route.
  • Embodiment 10 of the present disclosure also provides a computer storage medium storing a computer program which, when executed by a computer processor, performs the control-terminal-side job control method described in any of the foregoing embodiments of the present disclosure: acquiring map tile data corresponding to the work area; generating an area map of the work area based on the map tile data for display; determining at least one work plot within the work area according to at least one area anchor point selected by the user on the area map; and generating a work route corresponding to the work plot and sending it to a working drone, so that the working drone performs flight operations according to the route.
  • Alternatively, the computer program, when executed by the computer processor, performs the working-drone-side job control method described in any of the foregoing embodiments of the present disclosure: receiving the work route sent by the control terminal, and performing flight operations in the at least one work plot according to the work route.
  • the computer storage media of the embodiments of the present disclosure may adopt any combination of one or more computer-readable media.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • The computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above.
  • More specific examples of computer-readable storage media include: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory, read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory, an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • the computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • The computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing.
  • The computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; it may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
  • The program code contained on the computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical cable, radio frequency (RF), or any suitable combination of the foregoing.
  • Computer program code configured to perform the operations of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar languages.
  • the program code may be executed entirely on the user's computer, partly on the user's computer, as an independent software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • The remote computer may be connected to the user's computer through any kind of network, including a local area network or a wide area network, or may be connected to an external computer (e.g., through the Internet using an Internet service provider).
  • Embodiments of the present disclosure provide a job control system, job control method, apparatus, device, and medium in which the control terminal automatically generates a corresponding work route for the work plot selected by the user, solving problems such as the high operating cost and low route-generation efficiency of existing drones, and improving the generation efficiency of drone work routes and the intelligence of drone job control.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A job control system, job control method, apparatus, device, and medium, comprising a control terminal (10) and a working drone (20), wherein: the control terminal (10) is configured to acquire map tile data corresponding to a work area, generate an area map of the work area from the map tile data for display, determine at least one work plot within the work area according to at least one area anchor point selected by a user on the area map, and generate a work route corresponding to the work plot and send it to the working drone (20); and the working drone (20) is configured to receive the work route and perform flight operations in the at least one work plot according to the work route.

Description

Job Control System, Job Control Method, Apparatus, Device, and Medium — Technical Field
Embodiments of the present disclosure relate to the technical field of surveying and mapping, and for example to a job control system, job control method, apparatus, device, and medium.
Background
In recent years, drones have been widely applied in fields such as surveying and mapping, emergency response, and disaster relief, owing to their efficiency, flexibility, and low cost.
In the related art, when determining a drone's work route with the drone's control terminal, a user generally uses the terminal's control sticks or enters a complete route command through an operation interface — for example, steering the drone along its route by moving the sticks of the drone remote controller up, down, left, and right, or manually typing the drone's work route into the operation interface of the control terminal.
In the course of realizing the present disclosure, the inventors found the following defect in the related art: entering a complete route command with the control sticks or through the operation interface is time-consuming and gives a poor user experience.
Summary
Embodiments of the present disclosure provide a job control system, job control method, apparatus, device, and medium, so as to improve the generation efficiency of drone work routes and the intelligence of drone job control.
An embodiment of the present disclosure provides a job control system, comprising a control terminal and a working drone, wherein:
the control terminal is configured to acquire map tile data corresponding to a work area, generate an area map of the work area from the map tile data for display, determine at least one work plot within the work area according to at least one area anchor point selected by a user on the area map, and generate a work route corresponding to the work plot and send it to the working drone;
the working drone is configured to receive the work route and perform flight operations in the at least one work plot according to the work route.
Optionally, the system further comprises a surveying drone;
the control terminal is further configured to take the work area as a surveying area, determine surveying parameters matching the surveying area, and send the surveying parameters to the surveying drone, the surveying parameters comprising a plurality of surveying sampling points surveyed by the surveying drone in the surveying area;
the surveying drone is configured to receive the surveying parameters and perform flight shooting in the surveying area according to the surveying parameters to obtain a set of surveying photographs corresponding to the plurality of surveying sampling points, the set of surveying photographs being used to generate the map tile data of the surveying area.
Optionally, the system further comprises a ground terminal;
the ground terminal is configured to acquire the set of surveying photographs, combine and/or stitch the photographs in the set to obtain a surveying map corresponding to the surveying area, and generate the map tile data corresponding to the surveying area from the surveying map;
the control terminal is configured to acquire the map tile data corresponding to the work area from the ground terminal.
An embodiment of the present disclosure further provides a control-terminal-side job control method, comprising:
acquiring map tile data corresponding to a work area;
generating an area map of the work area from the map tile data for display;
determining at least one work plot within the work area according to at least one area anchor point selected by a user on the area map;
generating a work route corresponding to the work plot and sending it to a working drone, so that the working drone performs flight operations according to the work route.
Optionally, before acquiring the map tile data corresponding to the work area, the method further comprises:
taking the work area as a surveying area and determining surveying parameters matching the surveying area, wherein the surveying parameters comprise a plurality of surveying sampling points surveyed by a surveying drone in the surveying area;
sending the surveying parameters to the surveying drone, wherein the surveying parameters instruct the surveying drone to perform flight shooting in the surveying area to obtain a set of surveying photographs corresponding to the plurality of surveying sampling points, so as to generate the map tile data of the surveying area.
Optionally, determining the surveying parameters matching the surveying area comprises:
acquiring a reference photographing position point corresponding to the surveying area, and establishing a mapping relationship between one shooting point in a combined shooting point set and the reference photographing position point;
determining a plurality of auxiliary photographing position points corresponding to the reference photographing position point according to the mapping relationship and the preset relative position relationships among the shooting points in the combined shooting point set;
taking the reference photographing position point and the plurality of auxiliary photographing position points as the plurality of surveying sampling points surveyed by the surveying drone in the surveying area.
Optionally, determining the surveying parameters matching the surveying area comprises:
determining one or more combined surveying shooting areas within the surveying area according to surveying area information and a combined shooting area corresponding to the combined shooting point set;
determining a plurality of photographing position points in the combined surveying shooting areas according to the preset relative position relationships among the shooting points in the combined shooting point set;
taking the plurality of photographing position points as the plurality of surveying sampling points surveyed by the surveying drone in the surveying area.
Optionally, the photographs taken at the plurality of shooting points in the combined shooting point set have overlapping regions with one another, and/or
the plurality of combined surveying shooting areas determined within the surveying area have overlapping regions with one another;
wherein a combined surveying shooting area is the shooting area formed by combining and/or stitching the photographs taken at the plurality of shooting points in the combined shooting point set; the combined surveying shooting areas are combined and/or stitched to form the surveying map of the surveying area.
Optionally, the shooting points in the combined shooting point set comprise a central shooting point and four surrounding shooting points, the surrounding shooting points being the four vertices of a rectangle centered on the central shooting point;
wherein the composite photograph obtained from the photographs taken at the shooting points in the combined shooting point set is rectangular in shape.
Optionally, acquiring the reference photographing position point corresponding to the surveying area comprises:
detecting a touch operation of the user in a human-machine interface and determining a screen position point from the touch operation;
acquiring, from the map data of the surveying area currently displayed in the human-machine interface, a geographic position coordinate matching the screen position point as the reference photographing position point.
Optionally, detecting a touch operation of the user in the human-machine interface and determining a screen position point from the touch operation comprises at least one of the following:
if the user's touch operation is detected to be a single-point touch operation, determining the user's touch point as the screen position point;
if the user's touch operation is detected to be a line-drawing touch operation, selecting a point on the line segment generated by the user's touch as the screen position point; and
if the user's touch operation is detected to be a box-drawing touch operation, selecting a point inside the box generated by the user's touch as the screen position point.
Optionally, acquiring the reference photographing position point corresponding to the surveying area comprises:
acquiring the center point of the surveying area as the reference photographing position point.
Optionally, acquiring the reference photographing position point corresponding to the surveying area comprises:
sending position query information to the surveying drone and taking the geographic position coordinate fed back by the surveying drone as the reference photographing position point;
wherein the surveying drone is preset at a position matching the surveying area.
Optionally, before sending the position query information to the surveying drone, the method further comprises:
receiving at least one flight control instruction for the surveying drone entered by the user and sending the flight control instruction to the surveying drone;
on confirming receipt of a position confirmation response entered by the user, sending a hover instruction to the surveying drone to control the surveying drone to hover at its current position;
wherein the flight control instruction is configured to control the surveying drone to move in a set direction and/or by a set distance in the air.
Optionally, acquiring the reference photographing position point corresponding to the surveying area comprises:
acquiring a geographic position coordinate entered by the user as the reference photographing position point.
Optionally, establishing the mapping relationship between one shooting point in the combined shooting point set and the reference photographing position point comprises:
establishing the mapping relationship between a shooting point selected by the user in the combined shooting point set and the reference photographing position point.
Optionally, establishing the mapping relationship between one shooting point in the combined shooting point set and the reference photographing position point comprises:
establishing the mapping relationship between the central shooting point of the combined shooting point set and the reference photographing position point.
Optionally, establishing the mapping relationship between one shooting point in the combined shooting point set and the reference photographing position point comprises:
calculating the distance between the reference photographing position point and each positioning key point of the surveying area, the positioning key points comprising the corner points of the surveying area and the center point of the surveying area;
taking the positioning key point nearest to the reference photographing position point as a target reference point;
selecting, according to the target reference point's position information within the surveying area, a shooting point in the combined shooting point set matching that position information, and establishing the mapping relationship between it and the reference photographing position point.
Optionally, determining one or more combined surveying shooting areas within the surveying area according to the surveying area information and the combined shooting area corresponding to the combined shooting point set comprises:
selecting an anchor point within the surveying area;
determining one combined surveying shooting area within the surveying area from the anchor point and the combined shooting area;
if the combined surveying shooting area does not completely cover the surveying area, selecting a new anchor point within the surveying area and returning to the operation of determining one combined surveying shooting area from the anchor point and the combined shooting area, until all combined surveying shooting areas capable of completely covering the surveying area have been determined.
Optionally, determining a plurality of photographing position points in the combined surveying shooting area according to the preset relative position relationships among the shooting points in the combined shooting point set comprises:
mapping the central shooting point of the combined shooting point set onto the area midpoint of the combined surveying shooting area and taking the area midpoint as one photographing position point;
mapping, according to the preset relative position relationship between each surrounding shooting point and the central shooting point in the combined shooting point set, each surrounding shooting point into the combined surveying shooting area, and taking the resulting mapped points as the photographing position points.
Optionally, before determining one or more combined surveying shooting areas within the surveying area according to the surveying area information and the combined shooting area corresponding to the combined shooting point set, the method further comprises:
detecting a touch operation of the user in the human-machine interface and acquiring a screen selection area matching the touch operation;
acquiring, from the map data currently displayed in the human-machine interface, the geographic area matching the screen selection area as the surveying area information.
Optionally, detecting the touch operation of the user in the human-machine interface and acquiring the screen selection area matching the touch operation comprises:
if the user's touch operation is detected to be a single-point touch operation, determining the closed region enclosed by the connecting lines of at least three touch points of the user as the screen selection area; and/or
if the user's touch operation is detected to be a box-drawing touch operation, taking the box generated by the user's touch as the screen selection area.
Optionally, before sending the surveying parameters to the surveying drone, the method further comprises:
acquiring shooting parameters of the photographing device carried by the surveying drone, the shooting parameters comprising the surveying drone's single-photo shooting area at a set flight height, each shooting point corresponding to one single-photo shooting area;
determining the preset relative position relationships among the shooting points in the combined shooting point set according to a preset photo overlap index and the single-photo shooting area;
the surveying parameters further comprising the flight height, the flight height instructing the surveying drone to perform flight shooting in the surveying area at that flight height.
Optionally, determining the preset relative position relationships among the shooting points in the combined shooting point set according to the preset photo overlap index and the single-photo shooting area comprises:
determining a single-photo size according to the frame size of the photographing device and the pixel width of the photographing device;
constructing a two-dimensional coordinate system and selecting a target point in it as the central shooting point;
generating a central photo in the two-dimensional coordinate system from the central shooting point and the single-photo size;
generating, at the upper-left, lower-left, upper-right, and lower-right corners of the central photo, four surrounding photos that each satisfy the photo overlap index with the central photo;
determining, according to the mapping relationship between the single-photo size and the single-photo shooting area, the coordinate values of the surrounding shooting point corresponding to each surrounding photo in the two-dimensional coordinate system;
determining the preset relative position relationships among the shooting points in the combined shooting point set according to the coordinate values of the central shooting point and each surrounding shooting point in the two-dimensional coordinate system.
Optionally, before acquiring the shooting parameters of the photographing device carried by the surveying drone, the method further comprises:
calculating the set flight height according to the pixel width of the photographing device, the lens focal length of the photographing device, and the ground pixel resolution.
Optionally, acquiring the shooting parameters of the photographing device carried by the surveying drone comprises:
calculating the surveying drone's single-photo shooting area at the set flight height according to the pixel width of the photographing device, the frame size of the photographing device, and the ground pixel resolution.
An embodiment of the present disclosure further provides a working-drone-side job control method, comprising:
receiving a work route sent by a control terminal;
performing flight operations in the at least one work plot according to the work route.
An embodiment of the present disclosure further provides a control-terminal-side job control device, comprising:
a map tile data acquisition module, configured to acquire map tile data corresponding to a work area;
a map display module, configured to generate an area map of the work area from the map tile data for display;
a work plot determination module, configured to determine at least one work plot within the work area according to at least one area anchor point selected by a user on the area map;
a work route generation and sending module, configured to generate a work route corresponding to the work plot and send it to a working drone, so that the working drone performs flight operations according to the work route.
An embodiment of the present disclosure further provides a working-drone-side job control device, comprising:
a work route receiving module, configured to receive a work route sent by a control terminal;
a flight operation module, configured to perform flight operations in the at least one work plot according to the work route.
An embodiment of the present disclosure further provides a control terminal, comprising:
one or more processors;
a storage device configured to store one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement the control-terminal-side job control method provided in any embodiment of the present disclosure.
An embodiment of the present disclosure further provides a computer storage medium storing a computer program which, when executed by a processor, implements the control-terminal-side job control method provided in any embodiment of the present disclosure.
An embodiment of the present disclosure further provides a working drone, comprising:
one or more processors;
a storage device configured to store one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement the working-drone-side job control method provided in any embodiment of the present disclosure.
An embodiment of the present disclosure further provides a computer storage medium storing a computer program which, when executed by a processor, implements the working-drone-side job control method provided in any embodiment of the present disclosure.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a job control system provided in Embodiment 1 of the present disclosure;
FIG. 2 is a flowchart of a control-terminal-side job control method provided in Embodiment 2 of the present disclosure;
FIG. 3a is a flowchart of a control-terminal-side job control method provided in Embodiment 3 of the present disclosure;
FIG. 3b is a schematic diagram of the position distribution of the shooting points in a combined shooting point set provided in Embodiment 3 of the present disclosure;
FIG. 4a is a flowchart of a control-terminal-side job control method provided in Embodiment 4 of the present disclosure;
FIG. 4b is a schematic diagram of the distribution of photographing position points provided in Embodiment 4 of the present disclosure;
FIG. 5 is a flowchart of a working-drone-side job control method provided in Embodiment 5 of the present disclosure;
FIG. 6 is a schematic diagram of a control-terminal-side job control device provided in Embodiment 6 of the present disclosure;
FIG. 7 is a schematic diagram of a working-drone-side job control device provided in Embodiment 7 of the present disclosure;
FIG. 8 is a schematic structural diagram of a control terminal provided in Embodiment 8 of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the drawings and embodiments. It should be understood that the embodiments described here merely explain the present disclosure and do not limit it.
For ease of description, the drawings show only the parts related to the present disclosure rather than the entire content. Before the exemplary embodiments are discussed in more detail, it should be noted that some of them are described as processes or methods depicted as flowcharts. Although a flowchart describes the operations (or steps) as sequential processing, many of the operations may be performed in parallel, concurrently, or simultaneously, and the order of the operations may be rearranged. The processing may be terminated when its operations are completed, but may also have additional steps not included in the drawings; the processing may correspond to a method, function, procedure, subroutine, subprogram, and so on.
Embodiment 1
FIG. 1 is a schematic diagram of a job control system provided in Embodiment 1 of the present disclosure. As shown in FIG. 1, the job control system comprises a control terminal 10 and a working drone 20, in which:
the control terminal 10 is configured to acquire map tile data corresponding to a work area, generate an area map of the work area from the map tile data for display, determine at least one work plot within the work area according to at least one area anchor point selected by a user on the area map, and generate a work route corresponding to the work plot and send it to the working drone 20; the working drone 20 is configured to receive the work route and perform flight operations in the at least one work plot according to the work route.
The control terminal 10 may be any device that controls a surveying drone, such as a drone remote controller or a control device with a human-machine interface; embodiments of the present disclosure do not limit the device type of the control terminal. The working drone 20 may be a drone configured to operate over a surveying area according to job requirements, for example inspecting the condition of crops, soil, vegetation, or water quality in the surveying area, or spraying pesticide over it. Map tile data are the data from which a tile map is generated, produced by slicing map data with a tiling algorithm. A tile map forms a pyramid model: a multi-resolution hierarchical model in which, from the bottom level of the tile pyramid to the top, the resolution becomes progressively lower while the geographic extent represented stays the same.
In this embodiment of the present disclosure, as shown in FIG. 1, the job control system consists of the control terminal 10 and the working drone 20. The control terminal 10 can acquire the map tile data corresponding to the work area; since the tile data contain map data at multiple resolutions, the control terminal 10 can generate and display an area map of the work area at the resolution the working drone requires. When operating the control terminal 10, the user may select at least one area anchor point on the area map; the anchor point is used to determine at least one work plot within the work area, for example a 10 m × 10 m square plot centered on the anchor point. After determining the work plot, the control terminal 10 can automatically generate a corresponding work route and send it to the working drone, for example a route that starts at the top-left corner of the 10 m × 10 m square plot and advances 1 m every 5 s clockwise along the plot's edges. Different work plots may yield different work routes; embodiments of the present disclosure place no restriction on this. Upon receiving the work route, the working drone can perform flight operations in the determined work plot according to the route.
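The plot-and-route example in the preceding paragraph (a 10 m × 10 m square centered on the anchor point, traversed clockwise from the top-left corner at 1 m every 5 s) can be sketched in a few lines. This is a hypothetical illustration of that example, not the patent's implementation; the function names and the flat x/y coordinate frame are assumptions.

```python
# Hypothetical sketch: a square work plot centred on the user's anchor point,
# and a boundary route walked clockwise from the top-left corner, 1 m per 5 s.
def square_plot(cx: float, cy: float, side: float = 10.0):
    """Corner points of the plot, clockwise from top-left (x east, y north)."""
    h = side / 2.0
    return [(cx - h, cy + h), (cx + h, cy + h), (cx + h, cy - h), (cx - h, cy - h)]

def boundary_route(corners, step: float = 1.0, seconds_per_step: float = 5.0):
    """Waypoints (t, x, y) walking the axis-aligned boundary clockwise."""
    route, t = [], 0.0
    pts = corners + [corners[0]]  # close the loop
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        length = abs(x1 - x0) + abs(y1 - y0)  # edges are axis-aligned
        n = int(length / step)
        for i in range(n):  # each corner emitted once, at the edge start
            route.append((t, x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n))
            t += seconds_per_step
    return route

plot = square_plot(0.0, 0.0)
route = boundary_route(plot)  # 40 waypoints: 4 edges x 10 one-metre steps
```

Each waypoint carries its scheduled time offset, so a route like this could be handed to the drone as a timed sequence; the actual route format used by the system is not specified in the text.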
Optionally, the system further includes a surveying drone, and the control terminal 10 is further configured to take the work area as a surveying area, determine surveying parameters matching the surveying area, and send the surveying parameters to the surveying drone, the surveying parameters including a plurality of surveying sampling points to be surveyed by the surveying drone in the surveying area. The surveying drone is configured to receive the surveying parameters and perform flight shooting in the surveying area according to them, obtaining a set of surveying photographs corresponding to the plurality of surveying sampling points; the photograph set is used to generate the map tile data of the surveying area.
The surveying drone may be a drone configured to survey the surveying area to acquire data about it, for example a plurality of surveying photographs; it carries a photographing device configured to acquire the photographs corresponding to the surveying area.
In this embodiment, the control terminal 10 may also take the work area as a surveying area, determine the surveying sampling points the surveying drone is to survey there, and send the surveying parameters formed from those points to the surveying drone. The surveying drone receives the parameters, performs flight shooting in the surveying area accordingly, and obtains the photograph set corresponding to the sampling points included in the parameters. The surveying drone may generate the surveying area's map tile data from the photograph set, in which case the control terminal 10 may acquire the map tile data corresponding to the work area from the surveying drone.
Optionally, the system further includes a ground terminal configured to acquire the photograph set, combine and/or stitch its photographs into a surveying map corresponding to the surveying area, and generate the corresponding map tile data from that map; the control terminal is then configured to acquire the map tile data corresponding to the work area from the ground terminal.
The ground terminal may be a device matched with the surveying drone and configured to process the data the drone acquires, such as a laptop or tablet; embodiments of the present disclosure do not limit the device types of the control terminal or the ground terminal.
In this embodiment, the surveying drone may send the acquired photograph set to the ground terminal, which combines and/or stitches the photographs into the surveying map corresponding to the surveying area. So that the photographs can be combined and/or stitched into a complete image, the photographs corresponding to the sampling points overlap one another to a certain degree, but it is not required that every two consecutive photographs overlap; this greatly reduces image-processing time and thus improves surveying efficiency. Tile generation may likewise be performed by the ground terminal, in which case the control terminal 10 acquires the work area's map tile data from the ground terminal.
In embodiments of the present disclosure, the control terminal and the working drone form a job control system in which the control terminal acquires map tile data corresponding to the work area to generate and display an area map, determines at least one work plot within the work area from at least one area anchor point the user selects on the map, and generates and sends a corresponding work route to the working drone, which then performs flight operations in the at least one plot accordingly. The proposed job control system and method use the control terminal to automatically generate the work route for the plot the user selects, solving the high operating cost and low route-generation efficiency of existing drones, and improving both the generation efficiency of drone work routes and the intelligence of drone job control.
Embodiment 2
FIG. 2 is a flowchart of a control-terminal-side job control method provided in Embodiment 2 of the present disclosure. This embodiment is applicable to automatically generating a work route. The method may be executed by a control-terminal-side job control device, which may be implemented in software and/or hardware and is generally integrated in a control device (for example, a drone remote controller) used together with the working drone responsible for flight operations. As shown in FIG. 2, the method includes the following operations:
Step 210: acquire map tile data corresponding to the work area.
In this embodiment, the control terminal may acquire the map tile data corresponding to the work area from the surveying drone or from the ground terminal.
Step 220: generate an area map of the work area from the map tile data for display.
Accordingly, since the map tile data contain map data at multiple resolutions, after acquiring them the control terminal can generate and display the area map of the work area at the resolution the working drone requires.
Step 230: determine at least one work plot within the work area according to at least one area anchor point the user selects on the area map.
Step 240: generate a work route corresponding to the work plot and send it to the working drone, so that the working drone performs flight operations according to the route.
In this embodiment, when operating the control terminal, the user may select at least one area anchor point on the area map; the anchor point is used to determine at least one work plot within the work area, for example a 10 m × 10 m square plot centered on the anchor point. After determining the work plot, the control terminal can automatically generate a corresponding work route and send it to the working drone, for example starting at the top-left corner of the square plot and advancing 1 m every 5 s clockwise along its edges. Different work plots may yield different work routes, which embodiments of the present disclosure do not restrict. Upon receiving the work route, the working drone performs flight operations in the determined plot accordingly.
In an optional embodiment of the present disclosure, before acquiring the map tile data corresponding to the work area, the method may further include: taking the work area as a surveying area and determining surveying parameters matching it, the parameters including a plurality of surveying sampling points to be surveyed by the surveying drone in the surveying area; and sending the surveying parameters to the surveying drone, the parameters instructing the drone to perform flight shooting in the surveying area to obtain a photograph set corresponding to the sampling points, from which the surveying area's map tile data are generated.
The surveying area is an area with a definite latitude and longitude range; it may be of any shape and size, neither of which is limited by embodiments of the present disclosure.
Drone aerial survey technology observes the state of the photographed area remotely through an on-board video capture device and image transmission, and stitches the captured photographs into an overall image of the area. Traditional drone aerial survey methods usually traverse the surveying area along parallel lines while shooting and, to guarantee successful stitching, require a certain overlap between every two consecutive photographs; for normal subsequent stitching, a photograph must overlap the others both horizontally and vertically, generally by more than 50%. Traditional methods were designed for surveying large plots and capture many highly overlapping photographs; stitching them is time-consuming and inefficient, and uploading them to a server for stitching makes data upload and processing take even longer. Applied to small plots, traditional methods are moreover cumbersome, slow, and costly in hardware.
In this embodiment, the control terminal may also take the work area as a surveying area, determine the sampling points the surveying drone is to survey, and send the resulting surveying parameters to the drone. The surveying drone receives the parameters, performs flight shooting accordingly, and obtains the photograph set corresponding to the sampling points, from which it may generate the surveying area's map tile data; the control terminal may then acquire the work area's map tile data from the surveying drone. The surveying drone may also send the photograph set to the ground terminal, which combines and/or stitches the photographs into the surveying map, or it may combine and/or stitch them locally itself. So that the photographs can be combined and/or stitched into a complete image, the photographs corresponding to the sampling points overlap to a certain degree, but every two consecutive photographs need not overlap, which greatly reduces image-processing time and thus improves surveying efficiency.
The technical solution of this embodiment uses the control terminal to acquire map tile data corresponding to the work area to generate and display an area map, determines at least one work plot within the work area from at least one area anchor point the user selects on the map, and generates and sends a corresponding work route to the working drone so that it performs flight operations in the at least one plot accordingly. The proposed job control system and method use the control terminal to automatically generate the work route for the plot the user selects, solving the high operating cost and low route-generation efficiency of existing drones, and improving both the generation efficiency of drone work routes and the intelligence of drone job control.
Embodiment 3
FIG. 3a is a flowchart of a control-terminal-side job control method provided in Embodiment 3 of the present disclosure. This embodiment refines the embodiments above and gives one implementation of determining, through the control terminal, the surveying parameters matching the surveying area. As shown in FIG. 3a, the method of this embodiment may include:
Step 310: take the work area as a surveying area and determine surveying parameters matching it, the parameters including a plurality of surveying sampling points surveyed by the surveying drone in the surveying area.
Accordingly, Step 310 may include the following operations:
Step 311: acquire a reference photographing position point corresponding to the surveying area, and establish a mapping relationship between one shooting point in a combined shooting point set and the reference photographing position point.
The reference photographing position point is a point in the surveying area with a matching geographic coordinate. It may be selected by the user in the surveying area (for example, by tapping, or by directly entering latitude and longitude), or determined automatically from the shape of the surveying area (for example, its center point or a corner point). The combined shooting point set may be a set of shooting points preset according to a distribution rule; it may contain multiple shooting points, each pair of which has a relative direction and a relative distance. For example, the set may contain five shooting points located at the center and the four vertices of a rectangle, with each vertex 100 m from the center, or with the vertices lying to the east, south, west, and north respectively.
In this embodiment, all surveying sampling points corresponding to the surveying area can be obtained with the assistance of the combined shooting point set. Optionally, one point in the surveying area is first chosen as the reference photographing position point, and a mapping relationship is then established between it and one shooting point in the combined shooting point set.
In other words, the relative positions among the shooting points in the set are fixed, but the set carries no actual geographic information and therefore cannot be mapped directly onto the real surveying area; once one shooting point in the set is given an actual geographic position, the geographic positions of all shooting points in the set can be determined.
Typically, the photographs taken at the multiple shooting points of the set overlap one another; accordingly, after shooting at those points, the photographs can be combined and/or stitched into one complete combined region. That region may cover the surveying area entirely or only partially; this embodiment places no restriction on that.
FIG. 3b is a schematic diagram of the position distribution of the shooting points in a combined shooting point set provided in Embodiment 3 of the present disclosure. In an optional embodiment, as shown in FIG. 3b, the shooting points in the set include a central shooting point and four surrounding shooting points, the latter being the four vertices of a rectangle centered on the former; the composite photograph obtained from the photographs taken at all shooting points of the set is rectangular in shape.
In this embodiment, optionally, as shown in FIG. 3b, the combined shooting point set may contain five shooting points: a central shooting point and four surrounding shooting points, the central point being the center of a rectangle and the surrounding points its four vertices. The positions of the shooting points are set so that, when the photographs taken at the photographing position points derived from them are combined, a complete rectangular photograph results; combining means overlaying the photographs on their mutually overlapping images. In other embodiments, after the default mapping is done, each auxiliary photographing point may be rotated about the reference photographing position point, or moved by sliding, according to user operations.
When forming surveying points for a surveying area, the related art traverses the area along parallel lines, so the photographs taken at any surveying point must reach a preset overlap with those taken at the horizontally and vertically adjacent points. As a result each photograph contains little information distinguishing it from the others, many photographs are needed, and later composition and stitching require much work and time. In this embodiment, the five points of the combined shooting point set are one central point and four surrounding points; each surrounding point need only meet the overlap requirement (for example, 60% or 70%) with the central point, and the surrounding points need not meet so high an overlap with one another. This greatly reduces the total number of photographs needed to survey an area of fixed size, and hence the time and hardware cost of subsequent composition or stitching. In particular, when the solution of this embodiment is applied to small plots — for example, when the photographs taken at one combined shooting point set can, once combined or stitched, completely cover a plot — it is clearly superior to the related art's parallel-line point selection in both the number of surveying points and the difficulty of later stitching.
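The overlap structure argued for above can be checked with a small sketch: in a five-point set, each surrounding photo only needs the preset overlap with the *central* photo, while surrounding photos barely overlap each other. This is an illustration under assumed figures (20 m photo footprint, 50% per-axis overlap), not the patent's code.

```python
# Sketch of the five-point overlap structure: centre photo at the origin,
# four surrounding photos offset by half a footprint on each axis.
def axis_overlap(c0: float, c1: float, size: float) -> float:
    """Linear overlap fraction of two equal-size photo footprints on one axis."""
    return max(0.0, size - abs(c0 - c1)) / size

size = 20.0  # metres covered by one photo edge (illustrative)
corners = [(-10.0, 10.0), (10.0, 10.0), (10.0, -10.0), (-10.0, -10.0)]

# each corner photo overlaps the centre photo by 50% on both axes
centre_overlaps = [(axis_overlap(0.0, x, size), axis_overlap(0.0, y, size))
                   for x, y in corners]
# diagonally opposite corner photos do not overlap at all
diagonal = (axis_overlap(corners[0][0], corners[2][0], size),
            axis_overlap(corners[0][1], corners[2][1], size))
```

Only four pairwise overlap constraints (centre vs. each corner) need to be met, instead of a full grid of row/column constraints as in parallel-line traversal, which is the photo-count saving the paragraph describes.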
In an optional embodiment of the present disclosure, acquiring the reference photographing position point corresponding to the surveying area may include: detecting a touch operation of the user in a human-machine interface and determining a screen position point from it; and acquiring, from the map data of the surveying area currently displayed in the interface, a geographic coordinate matching the screen position point as the reference photographing position point.
In this embodiment, the reference photographing position point may be determined from a point the user designates in the human-machine interface. Optionally, a touch operation of the user, such as a tap or swipe, is detected and used to determine one screen position point in the interface; then, from the map data of the surveying area currently displayed, a geographic coordinate matching that screen point is determined as the reference point. The map data may be latitude and longitude information, for example.
In an optional embodiment, detecting the user's touch operation in the human-machine interface and determining a screen position point from it may include at least one of the following:
if the detected touch operation is a single-point touch, determining the user's touch point as the screen position point;
if the detected touch operation is a line-drawing touch, selecting a point on the line segment generated by the user's touch as the screen position point; and
if the detected touch operation is a box-drawing touch, selecting a point inside the box generated by the user's touch as the screen position point.
In this embodiment, there are multiple ways to determine a screen position point from the user's touch operation. Optionally, the touch point of a single-point touch may serve directly as the screen position point; a point on the line segment of a line-drawing touch, such as its midpoint, may serve as the screen position point; or a point inside the box of a box-drawing touch, such as the box's midpoint, may serve as the screen position point.
In an optional embodiment, acquiring the reference photographing position point corresponding to the surveying area may include: acquiring the center point of the surveying area as the reference photographing position point.
In addition, in this embodiment the reference photographing position point may also be generated automatically by the control terminal controlling the surveying drone, for example by directly taking the center point of the surveying area in which the surveying drone is located as the reference photographing position point.
In an optional embodiment, acquiring the reference photographing position point corresponding to the surveying area may also include: acquiring a geographic coordinate entered by the user as the reference photographing position point.
In this embodiment, the geographic coordinate entered by the user may also serve directly as the reference photographing position point. Optionally, the user may enter the coordinate via the soft keyboard of the human-machine interface, the numeric keypad of the control terminal, or voice input.
In an optional embodiment, acquiring the reference photographing position point corresponding to the surveying area may include: sending position query information to the surveying drone and taking the geographic coordinate it feeds back as the reference photographing position point, the surveying drone being preset at a position matching the surveying area.
In this embodiment, the reference photographing position point may also be determined from position information the user designates. Optionally, the user sends position query information to the surveying drone through the control terminal, for example by triggering a set identifier in the control terminal's interface to query the drone's current position. On receiving the query, the surveying drone obtains its current geographic coordinate through its own positioning device and feeds it back to the control terminal, which may take the point corresponding to the received coordinate directly as the reference photographing position point. Accordingly, when the surveying drone sends its geographic coordinate to the control terminal, its projection point on the ground lies inside the surveying area.
In an optional embodiment, before sending the position query information to the surveying drone, the method may further include: receiving at least one flight control instruction for the surveying drone entered by the user and sending it to the drone; and, on confirming receipt of a position confirmation response entered by the user, sending a hover instruction to the drone to control it to hover at its current position; the flight control instruction being configured to control the surveying drone to move in a set direction and/or by a set distance in the air.
Accordingly, if the reference photographing position point is determined from user-designated position information, the user may enter at least one flight control instruction for the surveying drone into the control terminal, which sends it to the drone so that the drone flies according to it. If, while the drone is flying, the user enters a position confirmation response into the control terminal — for example, a stop-flight instruction serving as the position confirmation response — the control terminal may send a hover instruction to make the drone hover at its current position.
In an optional embodiment of the present disclosure, establishing the mapping relationship between one shooting point in the combined shooting point set and the reference photographing position point may include: establishing it between a shooting point the user selects in the set and the reference photographing position point.
Accordingly, after the reference photographing position point is acquired, the user may select any one of the shooting points in the combined shooting point set, and the mapping relationship is established between the shooting point the user selects and the reference photographing position point.
In an optional embodiment, establishing the mapping relationship may also include: establishing it between the central shooting point of the set and the reference photographing position point.
In this embodiment, optionally, the central shooting point of the set may be mapped directly onto the reference photographing position point.
In an optional embodiment, establishing the mapping relationship may also include: calculating the distance between the reference photographing position point and each positioning key point of the surveying area, the key points including the area's corner points and its center point; taking the key point nearest to the reference photographing position point as a target reference point; and, according to the target reference point's position information within the surveying area, selecting from the set a shooting point matching that position information and establishing the mapping relationship between it and the reference photographing position point.
In this embodiment, optionally, the mapping relationship may also be determined from the distances between the reference photographing position point and the key points of the surveying area. Optionally, with the area's corner points and center point as positioning key points, the distance from the reference point to each key point is computed, and the key point nearest the reference point is taken as the target reference point; then, according to the target reference point's position within the area, a matching shooting point is chosen from the set and mapped onto the reference point. For example, if the target reference point lies at the upper left of the surveying area, the upper-left shooting point of the set may be chosen for the mapping.
Step 312: determine, according to the preset relative position relationships among the shooting points of the set and the mapping relationship, a plurality of auxiliary photographing position points corresponding to the reference photographing position point.
An auxiliary photographing position point may be any position point in the surveying area other than the reference photographing position point.
Further, once the mapping relationship between the reference photographing position point and one shooting point of the set has been determined, the other auxiliary photographing position points corresponding to the reference point can be determined from the preset relative positions among the set's shooting points together with the determined mapping.
For example, suppose the set contains five shooting points and its central shooting point has been mapped onto the reference photographing position point; then the other four auxiliary photographing position points can be determined from the positional relationships between the other four shooting points and the central one.
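The derivation in the example above is a simple translation: once the central shooting point is anchored at the reference point, each surrounding point's preset offset is added to the reference coordinates. A minimal sketch, with illustrative offsets and an arbitrary reference coordinate:

```python
# Minimal sketch of Step 312: derive the four auxiliary photographing position
# points by translating the set's preset relative offsets by the reference
# point (here the central shooting point is the one mapped onto it).
def auxiliary_points(ref, offsets):
    """Translate each relative offset (dx, dy) by the reference point."""
    rx, ry = ref
    return [(rx + dx, ry + dy) for dx, dy in offsets]

# preset relative positions of the surrounding points w.r.t. the centre
offsets = [(-10.0, 10.0), (10.0, 10.0), (10.0, -10.0), (-10.0, -10.0)]
reference = (100.0, 200.0)  # illustrative local coordinates of the reference

# reference point plus its four auxiliaries = five surveying sampling points
sampling_points = [reference] + auxiliary_points(reference, offsets)
```

Real coordinates would be geodetic (latitude/longitude), where the offset arithmetic needs a local projection; the flat translation here just shows the structure of the step.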
Step 313: take the reference photographing position point and the plurality of auxiliary photographing position points as the surveying sampling points surveyed by the surveying drone in the surveying area.
Accordingly, once the reference photographing position point and each auxiliary photographing position point have been obtained, they serve as the surveying sampling points surveyed by the surveying drone in the surveying area. The drone can shoot aerial photographs at each sampling point and send them to the corresponding control terminal or ground terminal, so that the final surveying image can be composed from the acquired photographs; alternatively, since the solution of this embodiment greatly reduces the number of photographs, the surveying drone may compose the photographs on board.
With the sampling-point planning method provided by embodiments of the present disclosure, the photographs acquired at the sampling points need not overlap between every two consecutive photographs, which greatly reduces image-data processing time.
Step 320: send the surveying parameters to the surveying drone.
With the above technical solution — acquiring a reference photographing position point for the surveying area, mapping one shooting point of the combined shooting point set onto it, determining the auxiliary photographing position points from the set's preset relative positions and the mapping, and taking the reference and auxiliary points as the surveying drone's sampling points — a new sampling-point planning method is proposed that replaces the existing parallel-line traversal planning with overall multi-point planning based on a combined shooting point set, solving the high cost and low surveying efficiency of existing drone aerial survey methods and achieving the technical effect of reducing surveying cost while improving surveying efficiency.
Embodiment 4
FIG. 4a is a flowchart of a control-terminal-side job control method provided in Embodiment 4 of the present disclosure. This embodiment refines the embodiments above and gives another implementation of determining, through the control terminal, the surveying parameters matching the surveying area. As shown in FIG. 4a, the method of this embodiment may include:
Step 410: take the work area as a surveying area and determine surveying parameters matching it, the parameters including a plurality of surveying sampling points surveyed by the surveying drone in the surveying area.
Accordingly, Step 410 may include the following operations:
Step 411: determine one or more combined surveying shooting areas within the surveying area according to the combined shooting area corresponding to the combined shooting point set and the surveying area information.
The combined shooting area may be the region covered by the composed photographs taken at the shooting points of the set — that is, the overall photographed region the combined shooting point set can capture. The surveying area information may be information about the surveying area, such as its shape or size. A combined surveying shooting area is a shooting area of the same size as the combined shooting area; each one corresponds to an actual shooting range within the plot, i.e., it carries two key pieces of information: the size of the region and the region's geographic position.
In this embodiment, before the surveying drone's sampling points are determined, the combined shooting area corresponding to the set is first obtained, and one or more combined surveying shooting areas are then determined within the surveying area according to the combined shooting area and the surveying area's size and other information. If there is one combined surveying shooting area, it can completely cover the surveying area; if there are several, their composition can completely cover it. For example, if the combined shooting area is a 10 m × 10 m square and the surveying area a 10 m × 20 m rectangle, at least two combined surveying shooting areas are needed to cover the surveying area completely.
In an optional embodiment of the present disclosure, the photographs taken at the shooting points of the set overlap one another, and/or
the combined surveying shooting areas determined within the surveying area overlap one another;
wherein a combined surveying shooting area is the shooting area formed by combining and/or stitching the photographs taken at the set's shooting points, and the combined surveying shooting areas are combined and/or stitched to form the surveying map of the surveying area.
That is, a combined surveying shooting area agrees with the combined shooting area, except that the latter has no correspondence with the surveying area, whereas the former may be one of the mutually independent shooting areas into which the surveying area is divided, with the same shape and size as the combined shooting area. The overlap between combined surveying shooting areas can be set as required, for example 30% or 50% of a combined surveying shooting area; embodiments of the present disclosure do not limit its value.
In this embodiment, so that the photographs the surveying drone acquires can be stitched into an image of the complete surveying area, optionally the photographs taken at the set's shooting points overlap one another; after shooting at those points, the photographs can be combined and/or stitched into one complete combined region, which may cover the surveying area entirely or only partially, which this embodiment does not restrict. The overlap among the photographs in embodiments of the present disclosure does not mean that every two consecutive photographs overlap. Likewise, so that each photograph the drone acquires can be composed along its overlapping parts into one complete image, the combined surveying shooting areas determined within the surveying area also overlap one another; optionally, every two adjacent combined surveying shooting areas overlap, so that combining and/or stitching the areas yields the surveying information of the surveying area.
In an optional embodiment, determining one or more combined surveying shooting areas within the surveying area according to the combined shooting area and the surveying area information may include: selecting one anchor point within the surveying area; determining one combined surveying shooting area within the surveying area from the anchor point and the combined shooting area; and, if that combined surveying shooting area does not completely cover the surveying area, selecting a new anchor point within the surveying area and returning to the operation of determining one combined surveying shooting area from the anchor point and the combined shooting area, until all combined surveying shooting areas that together completely cover the surveying area have been determined.
An anchor point may be a position point within the surveying area, used to position a combined surveying shooting area within it.
In this embodiment, the anchor point may be a position point selected within the surveying area as required, such as a corner point or the center point. One combined surveying shooting area can first be fixed from one anchor point: for example, if the surveying area is rectangular, its top-left corner may be taken as the anchor point, and the top-left corner of the combined shooting area made to coincide with it, so that the combined shooting area forms a corresponding combined surveying shooting area within the surveying area. When fixing a combined surveying shooting area from the anchor point and the combined shooting area, the area should cover as much of the surveying area as possible. Accordingly, after one combined surveying shooting area has been fixed, it is checked whether it completely covers the surveying area; if so, no further areas need to be determined. If one area cannot completely cover the surveying area, a new anchor point is selected and the determination repeated until all combined surveying shooting areas that completely cover the surveying area have been found; when selecting a new anchor point, care should be taken that the newly determined combined surveying shooting area overlaps the adjacent ones.
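The covering loop described above can be sketched for the simple case of a rectangular surveying area and square, axis-aligned combined shooting areas. This is an illustrative simplification, not the patent's algorithm; in particular it anchors areas at their top-left corners and uses a fixed overlap stride.

```python
# Sketch of the covering loop: keep placing square combined shooting areas
# (anchored at their top-left corner) until the rectangular region is covered.
def cover(region_w: float, region_h: float, area: float, overlap: float = 0.0):
    """Return top-left anchors of areas covering a region_w x region_h rect."""
    step = area * (1.0 - overlap)  # stride between adjacent areas
    anchors, y = [], 0.0
    while True:
        x = 0.0
        while True:
            anchors.append((x, y))
            if x + area >= region_w:
                break
            x = min(x + step, region_w - area)  # clamp the last column inside
        if y + area >= region_h:
            break
        y = min(y + step, region_h - area)      # clamp the last row inside
    return anchors

# the text's example: a 10 m x 20 m region needs two 10 m combined areas
two = cover(10.0, 20.0, 10.0)
```

Clamping the final row/column to the region edge makes the last area overlap its neighbor rather than spill outside, mirroring the requirement that a newly placed area overlap the adjacent ones.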
In an optional embodiment of the present disclosure, before determining one or more combined surveying shooting areas within the surveying area according to the combined shooting area and the surveying area information, the method may further include: detecting the user's touch operation in the human-machine interface and acquiring the screen selection area matching it; and acquiring, from the map data currently displayed in the interface, the geographic area matching the screen selection area as the surveying area information.
The screen selection area may be the region formed by the user's touch operation in the human-machine interface of the surveying drone's control terminal; it may be of any shape and size (not exceeding the size of the screen), neither of which is limited by embodiments of the present disclosure.
In this embodiment, the surveying area may be designated and generated in real time by the user operating the surveying drone, for example by detecting the user's touch operation in the interface to obtain the matching screen selection area, determining the matching geographic area for it from the currently displayed map data, and taking the determined geographic area as the surveying area information.
In an optional embodiment, detecting the user's touch operation in the human-machine interface and acquiring the screen selection area matching it may include:
if the detected touch operation is a single-point touch, determining the closed region enclosed by the connecting lines of at least three of the user's touch points as the screen selection area; and/or
if the detected touch operation is a box-drawing touch, taking the box generated by the user's touch as the screen selection area.
Optionally, the closed region formed by the user's detected single-point touches may serve as the screen selection area matching the touch operation, for example the closed region enclosed by connecting at least three touch points; or the box generated by the user's detected box-drawing touch may serve as the screen selection area.
Step 412: determine a plurality of photographing position points in the combined surveying shooting areas according to the preset relative positions among the set's shooting points.
A photographing position point may be a point in the surveying area with a matching geographic coordinate.
In this embodiment, the photographing position points can be determined from the preset relative position relationships among the shooting points of the set.
In an optional embodiment, determining a plurality of photographing position points in a combined surveying shooting area according to the preset relative positions among the set's shooting points may include: mapping the set's central shooting point onto the area midpoint of the combined surveying shooting area and taking that midpoint as one photographing position point; and mapping each surrounding shooting point into the combined surveying shooting area according to its relative position to the central shooting point, taking the resulting mapped points as photographing position points.
In this embodiment, since one combined surveying shooting area corresponds to one combined shooting area, when determining the photographing position points each shooting point of the corresponding set can be mapped into the combined surveying shooting area as a photographing position point. Optionally, in the mapping, the central shooting point of the set is first mapped onto the area midpoint of the combined surveying shooting area, which thus becomes one photographing position point.
Further, once the area midpoint of the combined surveying shooting area has been fixed as a photographing position point, each surrounding shooting point can be mapped into the combined surveying shooting area according to its relative position to the central shooting point, the resulting mapped points serving as the remaining photographing position points.
FIG. 4b is a schematic diagram of the distribution of photographing position points provided in Embodiment 4 of the present disclosure. In one example, as shown in FIG. 4b, the two center points 40 and 50 are the area midpoints of two combined surveying shooting areas: area midpoint 40 with its four surrounding photographing position points 410 forms one combined surveying shooting area, and area midpoint 50 with its four surrounding photographing position points 510 forms another. In both, the relative positions between the area midpoint and the surrounding photographing position points are the same as the preset relative positions between the set's central shooting point and its surrounding shooting points.
Step 413: take the plurality of photographing position points as the surveying sampling points surveyed by the surveying drone in the surveying area.
Accordingly, once the photographing position points have been obtained, they serve as the surveying sampling points surveyed by the surveying drone in the surveying area. The drone can shoot aerial photographs at each sampling point and send them to the corresponding control terminal or ground terminal, so that the final surveying image can be composed from the acquired photographs; alternatively, since the solution of this embodiment greatly reduces the number of photographs, the surveying drone may compose the photographs on board.
Step 420: send the surveying parameters to the surveying drone.
In an optional embodiment of the present disclosure, before sending the surveying parameters to the surveying drone, the method may further include: acquiring shooting parameters of the photographing device carried by the surveying drone, the shooting parameters including the drone's single-photo shooting area at a set flight height, each shooting point corresponding to one single-photo shooting area; and determining the preset relative positions among the set's shooting points from a preset photo overlap index and the single-photo shooting area. The surveying parameters further include the flight height, which instructs the drone to perform flight shooting in the surveying area at that height. Before acquiring the shooting parameters of the photographing device carried by the surveying drone, the method may further include: calculating the set flight height from the device's pixel width, its lens focal length, and the ground pixel resolution.
The single-photo shooting area is the actual surveyed region a single photograph can capture. The preset photo overlap index may be an overlap index set as required, such as 50%, 60%, or 70%; embodiments of the present disclosure do not limit its value, but the index must be such that, when the photographs are composed along their overlapping parts, a complete rectangle is formed.
In this embodiment, before the photographs acquired by the surveying drone are composed into the final surveying image, the drone's single-photo shooting area at the set flight height must first be determined, so that the preset relative positions among the set's shooting points can be fixed from the size of the single-photo shooting area. Each shooting point corresponds to one single-photo shooting area, for example as its midpoint or as one of its vertices; the preset relative positions among the shooting points can then be determined from the preset photo overlap index and the single-photo shooting area.
The surveying parameters in embodiments of the present disclosure may also include the flight height, instructing the surveying drone to perform flight shooting in the surveying area at that height. Understandably, when the shooting parameters of the drone's photographing device (e.g., its camera) are fixed, the flight height directly determines the ground pixel resolution, which in turn directly determines the area a single photograph can cover; the set flight height must therefore be determined before the drone photographs the surveying area. It can be computed from the device's pixel width, lens focal length, and the ground pixel resolution: optionally, since ground pixel resolution = flight height × pixel width / lens focal length, we obtain flight height = ground pixel resolution × lens focal length / pixel width, where pixel width = sensor width of the photographing device / frame width.
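The flight-height relation stated above can be checked numerically. The formula is the document's own; the camera figures below (8.8 mm focal length, 2.4 µm pixel pitch, 0.05 m/pixel ground resolution) are illustrative assumptions, not values from the patent.

```python
# The relation above, rearranged:
#   ground_resolution = height * pixel_width / focal_length
#   => height = ground_resolution * focal_length / pixel_width
def flight_height(gsd_m: float, focal_length_m: float, pixel_width_m: float) -> float:
    """Set flight height (m) for a target ground pixel resolution (m/pixel)."""
    return gsd_m * focal_length_m / pixel_width_m

# e.g. 0.05 m/pixel GSD, 8.8 mm lens, 2.4 um pixel pitch -> ~183.3 m
h = flight_height(0.05, 8.8e-3, 2.4e-6)
```

Note all three inputs must share units (metres here); mixing millimetres and micrometres without conversion is the usual source of wildly wrong heights.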
在本公开的一个可选实施例中,获取测绘无人机所携带的拍照设备的拍摄参数,可以包括:根据所述拍照设备的像元宽度、所述拍摄设备的画幅区域以及地面像元分辨率,计算所述测绘无人机在所述设定飞行高度下的单照片拍摄区域。
在本公开实施例中,进一步的,可以根据拍照设备的像元宽度、拍摄设备的画幅大小以及地面像元分辨率,计算测绘无人机在设定飞行高度下的单照片拍摄区域。可选的,单照片拍摄区域=地面像元分辨率*画幅大小,而地面像元分辨率=飞行高度*像元宽度/镜头焦距。
也即:单照片拍摄长度=地面像元分辨率*画幅长度;单照片拍摄宽度=地面像元分辨率*画幅宽度。例如,画幅大小为3456*4608,地面像元分辨率为0.05m,则单照片拍摄区域为172.8m*230.4m。
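正文中单照片拍摄区域的计算可以用如下Python代码复现(画幅大小与地面像元分辨率取正文示例值,仅为示意):

```python
def single_photo_area(frame_length_px, frame_width_px, ground_resolution_m):
    """单照片拍摄长度 = 地面像元分辨率 * 画幅长度;
    单照片拍摄宽度 = 地面像元分辨率 * 画幅宽度。返回值单位为米。"""
    return (frame_length_px * ground_resolution_m,
            frame_width_px * ground_resolution_m)

# 正文示例: 画幅大小3456*4608, 地面像元分辨率0.05m → 172.8m*230.4m
length_m, width_m = single_photo_area(3456, 4608, 0.05)
print(length_m, width_m)
```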
在本公开的一个可选实施例中,根据预设的照片重叠度指标以及所述单照片拍摄区域,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,可以包括:根据所述拍照设备的画幅大小以及所述拍照设备的像元宽度,确定单照片尺寸;构建二维坐标系,并在所述二维坐标系中选择目标点作为中心拍摄点;根据所述中心拍摄点以及所述单照片尺寸,在所述二维坐标系中生成中心照片;在所述中心照片的左上角、左下角、右上角以及右下角,分别生成与所述中心照片满足所述照片重叠度指标的四张周围照片;根据所述单照片尺寸与所述单照片拍摄区域之间的映射关系,确定与每个所述周围照片对应的周围拍摄点在所述二维坐标系中的坐标值;根据所述中心拍摄点以及每个所述周围拍摄点在所述二维坐标系中的坐标值,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系。
其中,目标点可以是二维坐标系中的任意一点,例如,目标点可以为二维坐标系的原点。
可选的,确定组合拍摄点集内每个拍摄点之间预设的相对位置关系时,可以首先根据拍照设备的画幅大小以及拍照设备的像元宽度,确定单照片尺寸。其中,单照片尺寸=画幅大小*像元宽度(也即:单照片长度=画幅长度*像元宽度;单照片宽度=画幅宽度*像元宽度)。然后在二维坐标系中选取一个目标点作为组合拍摄点集中的中心拍摄点。进一步的,根据中心拍摄点以及单照片尺寸在二维坐标系中生成中心照片。例如,将中心拍摄点作为中心照片的中点并根据单照片尺寸生成对应的中心照片。然后可以在中心照片的左上角、左下角、右上角以及右下角四个方位,根据单照片尺寸以及照片重叠度指标分别生成与中心照片匹配的四张周围照片。中心照片和其匹配的四张周围照片均不是真正意义上拍摄获取的照片,而是与单张照片大小和形状相同的一个矩形区域。相应的,获取到中心照片以及匹配的四张周围照片后,可以根据单照片尺寸与单照片拍摄区域之间的映射关系,确定与每个周围照片对应的周围拍摄点在二维坐标系中的坐标值。例如,单照片尺寸为10cm*10cm,照片重叠度指标为50%,左上角、左下角、右上角以及右下角对应的周围照片分配对应左上角、左下角、右上角以及右下角的单照片拍摄区域,且单照片尺寸与单照片拍摄区域的映射关系为1∶200,则单照片拍摄区域相应为20m*20m。如果将周围照片的中点作为每个周围拍摄点,中心拍摄点采用坐标原点,则每个周围拍摄点的坐标值分别可以是(-10,10)、(-10,-10)、(10,10)以及(10,-10),单位为m。相应的,得到每个周围拍摄点对应的坐标值后,即可根据中心拍摄点以及每个周围拍摄点在二维坐标系中的坐标值,确定组合拍摄点集内每个拍摄点之间预设的相对位置关系。例如,上述例子中,组合拍摄点集中位于每个顶点处的周围拍摄点之间的相对距离为20m,中心点处的中心拍摄点与周围拍摄点之间的相对距离为10√2m(约14.14m)。
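上例中周围拍摄点坐标以及各拍摄点之间相对距离的计算,可以用如下Python代码验证(单照片尺寸10cm、重叠度50%、映射比例1:200取自正文示例;偏移量的计算方式为说明用的示意性草图):

```python
import math

def surrounding_shot_points(photo_size_m, overlap, scale):
    """以中心拍摄点为坐标原点, 计算四个周围拍摄点在二维坐标系中的坐标。

    photo_size_m: 单照片尺寸(边长, 米); overlap: 照片重叠度指标;
    scale: 单照片尺寸与单照片拍摄区域之间的映射比例(如1:200时取200)。
    """
    # 单照片拍摄区域边长 = 单照片尺寸 * 映射比例
    region = photo_size_m * scale
    # 满足重叠度指标时, 周围拍摄点相对中心拍摄点在两个坐标轴上的偏移(假设)
    d = region * (1 - overlap)
    # 左上、左下、右上、右下四个周围拍摄点
    return [(-d, d), (-d, -d), (d, d), (d, -d)]

points = surrounding_shot_points(0.10, 0.5, 200)
print(points)  # 约为(-10,10)、(-10,-10)、(10,10)以及(10,-10), 单位为m
# 顶点处相邻周围拍摄点之间的相对距离约为20m
print(points[2][0] - points[0][0])
# 中心拍摄点与周围拍摄点之间的相对距离为10*sqrt(2), 约14.14m
print(round(math.hypot(*points[0]), 2))
```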
采用上述技术方案,通过获取与组合拍摄点集对应的组合拍摄区域,以根据组合拍摄区域以及测绘区域信息,在测绘区域内确定一个或多个测绘组合拍摄区域,进而根据组合拍摄点集中每个拍摄点之间预设的相对位置关系,在测绘组合拍摄区域中确定多个拍照位置点,并将多个拍照位置点作为测绘无人机在测绘区域中测绘的测绘采样点,提出了一种新的测绘采样点的规划方法,使用基于组合拍摄点集的多测绘点整体规划方式代替现有的平行线移动规划方式,解决现有无人机航测方法中存在的成本高且测绘效率低的问题,实现了降低测绘成本,并提高测绘效率的技术效果。
实施例五
图5是本公开实施例五提供的一种作业无人机侧作业控制方法的流程图,本实施例可适用于作业无人机根据作业航线进行飞行作业的情况,该方法可以由作业无人机侧作业控制装置来执行,该装置可以由软件和/或硬件的方式来实现,并一般可集成在无人机设备中,与负责控制无人机的控制终端配合使用。相应的,如图5所示,该方法包括如下操作:
步骤510、接收控制终端发送的作业航线。
在本公开实施例中,作业无人机可以接收控制终端发送的作业航线。其中,控制终端获取与作业区域对应的地图瓦片数据,根据地图瓦片数据生成作业区域的区域地图进行显示,根据用户针对所述区域地图选择的至少一个区域定位点,在作业区域内确定至少一个作业地块,并生成与作业地块对应的作业航线。
步骤520、根据所述作业航线在所述至少一个作业地块中进行飞行作业。
在本公开实施例中,作业无人机接收到控制终端发送的作业航线后,即可根据作业航线在匹配的至少一个作业地块中进行飞行作业。
本公开实施例通过控制终端获取与作业区域对应的地图瓦片数据以生成作业区域的区域地图进行显示,根据用户针对区域地图选择的至少一个区域定位点,在作业区域内确定至少一个作业地块,并生成与作业地块对应的作业航线发送至作业无人机,以使作业无人机根据作业航线在至少一个作业地块中进行飞行作业,提出了一种作业控制系统和作业控制方法,使用控制终端针对用户选择的作业地块自动生成对应的作业航线,解决现有无人机作业成本高及航线生成效率低等问题,提高了无人机作业航线的生成效率及无人机作业控制的智能化。
实施例六
图6是本公开实施例六提供的一种控制终端侧作业控制装置的示意图,如图6所示,所述装置包括:地图瓦片数据获取模块610、地图显示模块620、作业地块确定模块630以及作业航线生成发送模块640,其中:
地图瓦片数据获取模块610,设置为获取与作业区域对应的地图瓦片数据;
地图显示模块620,设置为根据所述地图瓦片数据生成所述作业区域的区域地图进行显示;
作业地块确定模块630,设置为根据用户针对所述区域地图选择的至少一个区域定位点,在所述作业区域内确定至少一个作业地块;
作业航线生成发送模块640,设置为生成与所述作业地块对应的作业航线发送至作业无人机,以使所述作业无人机根据所述作业航线进行飞行作业。
本实施例的技术方案,通过利用控制终端获取与作业区域对应的地图瓦片数据以生成作业区域的区域地图进行显示,根据用户针对区域地图选择的至少一个区域定位点,在作业区域内确定至少一个作业地块,并生成与作业地块对应的作业航线发送至作业无人机,以使作业无人机根据作业航线在至少一个作业地块中进行飞行作业,提出了一种作业控制系统和作业控制方法,使用控制终端针对用户选择的作业地块自动生成对应的作业航线,解决现有无人机作业成本高及航线生成效率低等问题,提高了无人机作业航线的生成效率及无人机作业控制的智能化。
可选的,所述装置还包括:测绘参数确定模块,设置为将所述作业区域作为测绘区域,确定与测绘区域匹配的测绘参数,其中,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;测绘参数发送模块,设置为将所述测绘参数发送至所述测绘无人机,其中,所述测绘参数设置为指示所述测绘无人机在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合,以生成所述测绘区域的地图瓦片数据。
可选的,测绘参数确定模块,包括:拍照位置点获取单元,设置为获取与所述测绘区域对应的参考拍照位置点,并将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系;辅助拍照位置点确定单元,设置为根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系以及所述映射关系,确定与所述参考拍照位置点对应的多个辅助拍照位置点;第一测绘采样点确定单元,设置为将所述参考拍照位置点以及所述多个辅助拍照位置点作为所述测绘无人机在所述测绘区域中测绘的多个测绘采样点。
可选的,测绘参数确定模块,包括:测绘组合拍摄区域确定单元,设置为根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在所述测绘区域内确定一个或多个测绘组合拍摄区域;拍照位置点确定单元,设置为根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,在所述测绘组合拍摄区域中确定多个拍照位置点;第二测绘采样点确定单元,设置为将所述多个拍照位置点作为测绘无人机在所述测绘区域中测绘的多个测绘采样点。
可选的,按照所述组合拍摄点集内的多个拍摄点所拍摄的多张照片之间具有重叠区域,和/或
在所述测绘区域内确定的多个测绘组合拍摄区域之间具有重叠区域;其中,所述测绘组合拍摄区域为按照所述组合拍摄点集内的多个拍摄点拍摄多张照片后,将所述多张照片组合,和/或拼接形成的拍摄区域;将每个所述测绘组合拍摄区域进行组合,和/或拼接形成所述测绘区域的测绘地图。
可选的,所述组合拍摄点集内的拍摄点包括:中心拍摄点以及四个周围拍摄点,所述周围拍摄点为以所述中心拍摄点为中心的矩形的四个顶点;其中,根据所述组合拍摄点集内的每个拍摄点所拍摄得到的合成照片的形状为矩形。
可选的,拍照位置点获取单元,是设置为检测用户在人机交互界面中的触摸操作,并根据所述触摸操作确定一个屏幕位置点;在所述人机交互界面中当前显示的测绘区域的地图数据中获取与所述屏幕位置点匹配的一个地理位置坐标作为所述参考拍照位置点。
可选的,拍照位置点获取单元,是设置为如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的触摸点确定为所述屏幕位置点;
如果检测到所述用户的触摸操作为划线触摸操作,则在所述用户触摸生成的线段上选取一点作为所述屏幕位置点;以及
如果检测到所述用户的触摸操作为画框触摸操作,则在所述用户触摸生成的框体内部选取一点作为所述屏幕位置点。
可选的,拍照位置点获取单元,是设置为获取所述测绘区域的中心点作为所述参考拍照位置点。
可选的,拍照位置点获取单元,是设置为向所述测绘无人机发送位置查询信息,将所述测绘无人机反馈的地理位置坐标作为所述参考拍照位置点;其中,所述测绘无人机预先设置于与所述测绘区域匹配的位置处。
可选的,所述装置还包括:飞行控制指令发送模块,设置为接收用户输入的针对所述测绘无人机的至少一项飞行控制指令,并将所述飞行控制指令发送至所述测绘无人机;悬停指令发送模块,设置为在确认接收到所述用户输入的位置确认响应时,向所述测绘无人机发送悬停指令,以控制所述测绘无人机在当前位置悬停;其中,所述飞行控制指令设置为控制所述测绘无人机在空中进行设定方向,和/或设定距离的移动。
可选的,拍照位置点获取单元,是设置为获取用户输入的地理位置坐标作为所述参考拍照位置点。
可选的,拍照位置点获取单元,是设置为将用户在所述组合拍摄点集内选择的一个拍摄点,与所述参考拍照位置点建立映射关系。
可选的,拍照位置点获取单元,是设置为将所述组合拍摄点集内的所述中心拍摄点,与所述参考拍照位置点建立映射关系。
可选的,拍照位置点获取单元,是设置为计算所述参考拍照位置点与所述测绘区域每个定位关键点之间的距离,所述定位关键点包括:所述测绘区域的角点以及所述测绘区域的中心点;获取距离所述参考拍照位置点最近的一个定位关键点作为目标参考点;根据所述目标参考点在所述测绘区域内的位置信息,在所述组合拍摄点集内选择与所述位置信息匹配的一个拍摄点与所述参考拍照位置点建立映射关系。
可选的,测绘组合拍摄区域确定单元,是设置为在所述测绘区域内选择一个定位点;根据所述定位点以及所述组合拍摄区域,在所述测绘区域内确定一个测绘组合拍摄区域;如果所述测绘组合拍摄区域不能对所述测绘区域完整覆盖,则在所述测绘区域内选择新的定位点,并返回执行根据所述定位点以及所述组合拍摄区域,在所述测绘区域内确定一个测绘组合拍摄区域的操作,直至确定出能够完整覆盖所述测绘区域的全部测绘组合拍摄区域。
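该单元所描述的"选点—确定区域—检查覆盖—再选点"循环,可以用如下Python代码示意(将测绘区域简化为栅格单元集合,定位点选择策略与区域锚定方式均为说明用假设,并非本公开限定的实现):

```python
def plan_combined_areas(survey_region, combined_area_size, pick_point):
    """在测绘区域内迭代确定测绘组合拍摄区域, 直至完整覆盖测绘区域。

    survey_region: 待覆盖的测绘区域(此处简化为栅格单元坐标集合)
    combined_area_size: 组合拍摄区域尺寸(单元格数, (宽, 高))
    pick_point: 从尚未覆盖的单元中选择下一个定位点的策略函数(假设)
    """
    uncovered = set(survey_region)
    areas = []
    w, h = combined_area_size
    while uncovered:
        # 在测绘区域内选择一个(新的)定位点
        x0, y0 = pick_point(uncovered)
        # 以定位点为角点确定一个测绘组合拍摄区域
        area = {(x0 + dx, y0 + dy) for dx in range(w) for dy in range(h)}
        areas.append(area)
        # 若测绘组合拍摄区域尚不能完整覆盖测绘区域, 循环继续选择新的定位点
        uncovered -= area
    return areas

# 示例: 4x4的测绘区域用2x2的组合拍摄区域覆盖, 共需4个测绘组合拍摄区域
region = [(x, y) for x in range(4) for y in range(4)]
areas = plan_combined_areas(region, (2, 2), lambda u: min(u))
print(len(areas))  # 4
```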
可选的,拍照位置点确定单元,是设置为将所述组合拍摄点集内的中心拍摄点映射至所述测绘组合拍摄区域的区域中点,并将所述区域中点作为一个拍照位置点;根据所述组合拍摄点集内的每个周围拍摄点与所述中心拍摄点之间的相对位置关系,将每个所述周围拍摄点分别映射至所述测绘组合拍摄区域中,并将形成的多个映射点作为所述拍照位置点。
可选的,所述装置还包括:屏幕选择区域获取模块,设置为检测用户在人机交互界面中的触摸操作,并获取与所述触摸操作匹配的屏幕选择区域;测绘区域信息获取模块,设置为在所述人机交互界面中当前显示的地图数据中,获取与所述屏幕选择区域匹配的地理位置区域作为所述测绘区域信息。
可选的,屏幕选择区域获取模块,是设置为如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的至少三个触摸点的连线所围成的封闭区域确定为所述屏幕选择区域;和/或
如果检测到所述用户的触摸操作为画框触摸操作,则将所述用户触摸生成的框体作为所述屏幕选择区域。
可选的,所述装置还包括:拍摄参数获取模块,设置为获取所述测绘无人机所携带的拍照设备的拍摄参数,所述拍摄参数包括所述测绘无人机在设定飞行高度下的单照片拍摄区域,每个拍摄点都对应一个单照片拍摄区域;相对位置关系确定模块,设置为根据预设的照片重叠度指标以及所述单照片拍摄区域,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系;所述测绘参数还包括:所述飞行高度,所述飞行高度设置为指示所述测绘无人机以所述飞行高度在所述测绘区域中进行飞行拍摄。
可选的,相对位置关系确定模块,是设置为根据所述拍照设备的画幅大小以及所述拍照设备的像元宽度,确定单照片尺寸;构建二维坐标系,并在所述二维坐标系中选择目标点作为中心拍摄点;根据所述中心拍摄点以及所述单照片尺寸,在所述二维坐标系中生成中心照片;在所述中心照片的左上角、左下角、右上角以及右下角,分别生成与所述中心照片满足所述照片重叠度指标的四张周围照片;根据所述单照片尺寸与所述单照片拍摄区域之间的映射关系,确定与每个所述周围照片对应的周围拍摄点在所述二维坐标系中的坐标值;根据所述中心拍摄点以及每个所述周围拍摄点在所述二维坐标系中的坐标值,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系。
可选的,所述装置还包括:飞行高度计算模块,设置为根据所述拍照设备的像元宽度、所述拍照设备的镜头焦距以及地面像元分辨率,计算所述设定飞行高度。
可选的,拍摄参数获取模块,是设置为根据所述拍照设备的像元宽度、所述拍摄设备的画幅区域以及地面像元分辨率,计算所述测绘无人机在所述设定飞行高度下的单照片拍摄区域。
上述控制终端侧作业控制装置可执行本公开任意实施例所提供的控制终端侧作业控制方法,具备执行方法相应的功能模块和有益效果。未在本实施例中详尽描述的技术细节,可参见本公开任意实施例提供的控制终端侧作业控制方法。
实施例七
图7是本公开实施例七提供的一种作业无人机侧作业控制装置的示意图,如图7所示, 所述装置包括:作业航线接收模块710以及飞行作业模块720,其中:
作业航线接收模块710,设置为接收控制终端发送的作业航线;
飞行作业模块720,设置为根据所述作业航线在所述至少一个作业地块中进行飞行作业。
本公开实施例通过控制终端获取与作业区域对应的地图瓦片数据以生成作业区域的区域地图进行显示,根据用户针对区域地图选择的至少一个区域定位点,在作业区域内确定至少一个作业地块,并生成与作业地块对应的作业航线发送至作业无人机,以使作业无人机根据作业航线在至少一个作业地块中进行飞行作业,提出了一种作业控制系统和作业控制方法,使用控制终端针对用户选择的作业地块自动生成对应的作业航线,解决现有无人机作业成本高及航线生成效率低等问题,提高了无人机作业航线的生成效率及无人机作业控制的智能化。
上述作业无人机侧作业控制装置可执行本公开任意实施例所提供的作业无人机侧作业控制方法,具备执行方法相应的功能模块和有益效果。未在本实施例中详尽描述的技术细节,可参见本公开任意实施例提供的作业无人机侧作业控制方法。
实施例八
图8为本公开实施例八提供的一种控制终端的结构示意图。图8示出了适于用来实现本公开实施方式的控制终端812的框图。图8显示的控制终端812仅仅是一个示例,不应对本公开实施例的功能和使用范围带来任何限制。
如图8所示,控制终端812以通用计算设备的形式表现。控制终端812的组件可以包括但不限于:一个或者多个处理器816,存储装置828,连接不同系统组件(包括存储装置828和处理器816)的总线818。
总线818表示几类总线结构中的一种或多种,包括存储器总线或者存储器控制器,外围总线,图形加速端口,处理器或者使用多种总线结构中的任意总线结构的局域总线。举例来说,这些体系结构包括但不限于工业标准体系结构(Industry Standard Architecture,ISA)总线,微通道体系结构(Micro Channel Architecture,MCA)总线,增强型ISA总线、视频电子标准协会(Video Electronics Standards Association,VESA)局域总线以及外围组件互连(Peripheral Component Interconnect,PCI)总线。
控制终端812典型地包括多种计算机系统可读介质。这些介质可以是任何能够被控制终端812访问的可用介质,包括易失性和非易失性介质,可移动的和不可移动的介质。
存储装置828可以包括易失性存储器形式的计算机系统可读介质,例如随机存取存储器(Random Access Memory,RAM)830和/或高速缓存存储器832。控制终端812可以进一步包括其它可移动/不可移动的、易失性/非易失性计算机系统存储介质。仅作为举例,存储系统834可以设置为读写不可移动的、非易失性磁介质(图8未显示,通常称为“硬盘驱动器”)。尽管图8中未示出,可以提供设置为对可移动非易失性磁盘(例如“软盘”)读写的磁盘驱动器,以及对可移动非易失性光盘(例如只读光盘(Compact Disc-Read Only Memory,CD-ROM)、数字视盘(Digital Video Disc-Read Only Memory,DVD-ROM)或者其它光介质)读写的光盘驱动器。在这些情况下,每个驱动器可以通过一个或者多个数据介质接口与总线818相连。存储装置828可以包括至少一个程序产品,该程序产品具有一组(例如至少一个)程序模块,这些程序模块被配置以执行本公开每个实施例的功能。
具有一组(至少一个)程序模块826的程序836,可以存储在例如存储装置828中,这样的程序模块826包括但不限于操作系统、一个或者多个应用程序、其它程序模块以及程序数据,这些示例中的每一个或某种组合中可能包括网络环境的实现。程序模块826通常执行本公开所描述的实施例中的功能和/或方法。
控制终端812也可以与一个或多个外部设备814(例如键盘、指向设备、摄像头、显示器824等)通信,还可与一个或者多个使得用户能与该控制终端812交互的设备通信,和/或与使得该控制终端812能与一个或多个其它计算设备进行通信的任何设备(例如网卡,调制解调器等等)通信。这种通信可以通过输入/输出(I/O)接口822进行。并且,控制终端812还可以通过网络适配器820与一个或者多个网络(例如局域网(Local Area Network,LAN)、广域网(Wide Area Network,WAN)和/或公共网络,例如因特网)通信。如图所示,网络适配器820通过总线818与控制终端812的其它模块通信。应当明白,尽管图中未示出,可以结合控制终端812使用其它硬件和/或软件模块,包括但不限于:微代码、设备驱动器、冗余处理单元、外部磁盘驱动阵列、磁盘阵列(Redundant Arrays of Independent Disks,RAID)系统、磁带驱动器以及数据备份存储系统等。
处理器816通过运行存储在存储装置828中的程序,从而执行每种功能应用以及数据处理,例如实现本公开上述实施例所提供的控制终端侧作业控制方法。
也即,所述处理单元执行所述程序时实现:获取与作业区域对应的地图瓦片数据;根据所述地图瓦片数据生成所述作业区域的区域地图进行显示;根据用户针对所述区域地图选择的至少一个区域定位点,在所述作业区域内确定至少一个作业地块;生成与所述作业地块对应的作业航线发送至作业无人机,以使所述作业无人机根据所述作业航线进行飞行作业。
实施例九
本实施例九是本公开实施例提供的一种设置为执行本公开任一实施例所提供的作业无人机侧作业控制方法的作业无人机,该作业无人机包括:一个或多个处理器;存储装置,设置为存储一个或多个程序;当所述一个或多个程序被所述一个或多个处理器执行时,使得所述一个或多个处理器实现如本公开任一实施例所提供的作业无人机侧作业控制方法:接收控制终端发送的作业航线;根据所述作业航线在所述至少一个作业地块中进行飞行作业。其结构以及细节内容可参照图8和实施例八。
实施例十
本公开实施例十还提供一种存储计算机程序的计算机存储介质,所述计算机程序在由计算机处理器执行时设置为执行本公开上述实施例任一所述的控制终端侧作业控制方法:获取与作业区域对应的地图瓦片数据;根据所述地图瓦片数据生成所述作业区域的区域地图进行显示;根据用户针对所述区域地图选择的至少一个区域定位点,在所述作业区域内确定至少一个作业地块;生成与所述作业地块对应的作业航线发送至作业无人机,以使所述作业无人机根据所述作业航线进行飞行作业。或者所述计算机程序在由计算机处理器执行时设置为执行本公开上述实施例任一所述的作业无人机侧作业控制方法:接收控制终端发送的作业航线;根据所述作业航线在所述至少一个作业地块中进行飞行作业。
本公开实施例的计算机存储介质,可以采用一个或多个计算机可读的介质的任意组合。计算机可读介质可以是计算机可读信号介质或者计算机可读存储介质。计算机可读存储介质例如可以是——但不限于——电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的更细化的例子(非穷举的列表)包括:具有一个或多个导线的电连接、便携式计算机磁盘、硬盘、随机存取存储器、只读存储器(Read Only Memory,ROM)、可擦式可编程只读存储器(Erasable Programmable Read Only Memory,EPROM)或闪存、光纤、便携式紧凑磁盘只读存储器、光存储器件、磁存储器件、或者上述的任意合适的组合。在本文件中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。
计算机可读的信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读的信号介质还可以是计算机可读存储介质以外的任何计算机可读介质,该计算机可读介质可以发送、传播或者传输设置为由指令执行系统、装置或者器件使用或者与其结合使用的程序。
计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括——但不限于无线、电线、光缆、射频(Radio Frequency,RF)等等,或者上述的任意合适的组合。
可以以一种或多种程序设计语言或其组合来编写设置为执行本公开操作的计算机程序代码,所述程序设计语言包括面向对象的程序设计语言——诸如Java、Smalltalk、C++,还包括常规的过程式程序设计语言——诸如“C”语言或类似的程序设计语言。程序代码可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络——包括局域网或广域网——连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。
工业实用性
本公开实施例提供了一种作业控制系统、作业控制方法、装置、设备及介质,使用控制终端针对用户选择的作业地块自动生成对应的作业航线,解决现有无人机作业成本高及航线生成效率低等问题,提高了无人机作业航线的生成效率及无人机作业控制的智能化。

Claims (32)

  1. 一种作业控制系统,包括:控制终端以及作业无人机,其中:
    所述控制终端,设置为获取与作业区域对应的地图瓦片数据,并根据所述地图瓦片数据生成所述作业区域的区域地图进行显示,根据用户针对所述区域地图选择的至少一个区域定位点,在所述作业区域内确定至少一个作业地块,并生成与所述作业地块对应的作业航线发送至所述作业无人机;
    所述作业无人机,设置为接收所述作业航线,并根据所述作业航线在所述至少一个作业地块中进行飞行作业。
  2. 根据权利要求1所述的系统,还包括:测绘无人机;
    所述控制终端还设置为:将所述作业区域作为测绘区域,确定与测绘区域匹配的测绘参数,并将所述测绘参数发送至所述测绘无人机,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;
    所述测绘无人机设置为:接收所述测绘参数,并根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合,所述测绘照片集合设置为生成所述测绘区域的地图瓦片数据。
  3. 根据权利要求1或2所述的系统,还包括:地面终端;
    所述地面终端,设置为获取所述测绘照片集合,将所述测绘照片集合中的多张照片进行下述至少一项:照片组合、照片拼接,得到与所述测绘区域对应的测绘地图,并根据所述测绘地图生成与所述测绘区域对应的地图瓦片数据;
    所述控制终端是设置为,从所述地面终端获取与作业区域对应的地图瓦片数据。
  4. 一种控制终端侧作业控制方法,应用于如权利要求1-3任一项所述的作业控制系统中,包括:
    获取与作业区域对应的地图瓦片数据;
    根据所述地图瓦片数据生成所述作业区域的区域地图进行显示;
    根据用户针对所述区域地图选择的至少一个区域定位点,在所述作业区域内确定至少一个作业地块;
    生成与所述作业地块对应的作业航线发送至作业无人机,以使所述作业无人机根据所述作业航线进行飞行作业。
  5. 根据权利要求4所述的方法,在获取与作业区域对应的地图瓦片数据之前,还包括:
    将所述作业区域作为测绘区域,确定与测绘区域匹配的测绘参数,其中,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;
    将所述测绘参数发送至所述测绘无人机,其中,所述测绘参数设置为指示所述测绘无人机在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合,以生成所述测绘区域的地图瓦片数据。
  6. 根据权利要求5所述的方法,其中,确定与测绘区域匹配的测绘参数,包括:
    获取与所述测绘区域对应的参考拍照位置点,并将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系;
    根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系以及所述映射关系,确定与所述参考拍照位置点对应的多个辅助拍照位置点;
    将所述参考拍照位置点以及所述多个辅助拍照位置点作为所述测绘无人机在所述测绘区域中测绘的多个测绘采样点。
  7. 根据权利要求5所述的方法,其中,确定与测绘区域匹配的测绘参数,包括:
    根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在所述测绘区域内确定一个或多个测绘组合拍摄区域;
    根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,在所述测绘组合拍摄区域中确定多个拍照位置点;
    将所述多个拍照位置点作为测绘无人机在所述测绘区域中测绘的多个测绘采样点。
  8. 根据权利要求7所述的方法,其中:
    按照所述组合拍摄点集内的多个拍摄点所拍摄的多张照片之间具有重叠区域;
    在所述测绘区域内确定的多个测绘组合拍摄区域之间具有重叠区域;
    按照所述组合拍摄点集内的多个拍摄点所拍摄的多张照片之间具有重叠区域,以及,在所述测绘区域内确定的多个测绘组合拍摄区域之间具有重叠区域;
    其中,所述测绘组合拍摄区域为按照所述组合拍摄点集内的多个拍摄点拍摄多张照片后,将所述多张照片进行下述至少一项:照片组合、照片拼接,形成的拍摄区域;将每个所述测绘组合拍摄区域进行下述至少一项:照片组合、照片拼接,形成所述测绘区域的测绘地图。
  9. 根据权利要求6或7所述的方法,其中,所述组合拍摄点集内的拍摄点包括:中心拍摄点以及四个周围拍摄点,所述周围拍摄点为以所述中心拍摄点为中心的矩形的四个顶点;
    其中,根据所述组合拍摄点集内的每个拍摄点所拍摄得到的合成照片的形状为矩形。
  10. 根据权利要求6所述的方法,其中,获取与测绘区域对应的参考拍照位置点,包括:
    检测用户在人机交互界面中的触摸操作,并根据所述触摸操作确定一个屏幕位置点;
    在所述人机交互界面中当前显示的测绘区域的地图数据中获取与所述屏幕位置点匹配的一个地理位置坐标作为所述参考拍照位置点。
  11. 根据权利要求10所述的方法,其中,检测用户在人机交互界面中的触摸操作,并 根据所述触摸操作确定一个屏幕位置点,包括下述至少一项:
    如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的触摸点确定为所述屏幕位置点;
    如果检测到所述用户的触摸操作为划线触摸操作,则在所述用户触摸生成的线段上选取一点作为所述屏幕位置点;以及
    如果检测到所述用户的触摸操作为画框触摸操作,则在所述用户触摸生成的框体内部选取一点作为所述屏幕位置点。
  12. 根据权利要求6所述的方法,其中,获取与测绘区域对应的参考拍照位置点,包括:
    获取所述测绘区域的中心点作为所述参考拍照位置点。
  13. 根据权利要求6所述的方法,其中,获取与测绘区域对应的参考拍照位置点,包括:
    向所述测绘无人机发送位置查询信息,将所述测绘无人机反馈的地理位置坐标作为所述参考拍照位置点;
    其中,所述测绘无人机预先设置于与所述测绘区域匹配的位置处。
  14. 根据权利要求13所述的方法,在向所述测绘无人机发送位置查询信息之前,还包括:
    接收用户输入的针对所述测绘无人机的至少一项飞行控制指令,并将所述飞行控制指令发送至所述测绘无人机;
    在确认接收到所述用户输入的位置确认响应时,向所述测绘无人机发送悬停指令,以控制所述测绘无人机在当前位置悬停;
    其中,所述飞行控制指令设置为控制所述测绘无人机在空中进行下述至少一项:设定方向的移动、设定距离的移动。
  15. 根据权利要求6所述的方法,其中,获取与测绘区域对应的参考拍照位置点,包括:
    获取用户输入的地理位置坐标作为所述参考拍照位置点。
  16. 根据权利要求6所述的方法,其中,将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系,包括:
    将用户在所述组合拍摄点集内选择的一个拍摄点,与所述参考拍照位置点建立映射关系。
  17. 根据权利要求9所述的方法,其中,将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系,包括:
    将所述组合拍摄点集内的所述中心拍摄点,与所述参考拍照位置点建立映射关系。
  18. 根据权利要求9所述的方法,其中,将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系,包括:
    计算所述参考拍照位置点与所述测绘区域每个定位关键点之间的距离,所述定位关键点包括:所述测绘区域的角点以及所述测绘区域的中心点;
    获取距离所述参考拍照位置点最近的一个定位关键点作为目标参考点;
    根据所述目标参考点在所述测绘区域内的位置信息,在所述组合拍摄点集内选择与所述位置信息匹配的一个拍摄点与所述参考拍照位置点建立映射关系。
  19. 根据权利要求7所述的方法,其中,根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在测绘区域内确定一个或多个测绘组合拍摄区域,包括:
    在所述测绘区域内选择一个定位点;
    根据所述定位点以及所述组合拍摄区域,在所述测绘区域内确定一个测绘组合拍摄区域;
    如果所述测绘组合拍摄区域不能对所述测绘区域完整覆盖,则在所述测绘区域内选择新的定位点,并返回执行根据所述定位点以及所述组合拍摄区域,在所述测绘区域内确定一个测绘组合拍摄区域的操作,直至确定出能够完整覆盖所述测绘区域的全部测绘组合拍摄区域。
  20. 根据权利要求9所述的方法,其中,根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,在所述测绘组合拍摄区域中确定多个拍照位置点,包括:
    将所述组合拍摄点集内的中心拍摄点映射至所述测绘组合拍摄区域的区域中点,并将所述区域中点作为一个拍照位置点;
    根据所述组合拍摄点集内的每个周围拍摄点与所述中心拍摄点之间预设的相对位置关系,将每个所述周围拍摄点分别映射至所述测绘组合拍摄区域中,并将形成的多个映射点作为所述拍照位置点。
  21. 根据权利要求7所述的方法,在根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在测绘区域内确定一个或多个测绘组合拍摄区域之前,还包括:
    检测用户在人机交互界面中的触摸操作,并获取与所述触摸操作匹配的屏幕选择区域;
    在所述人机交互界面中当前显示的地图数据中,获取与所述屏幕选择区域匹配的地理位置区域作为所述测绘区域信息。
  22. 根据权利要求21所述的方法,其中,检测用户在人机交互界面中的触摸操作,并获取与所述触摸操作匹配的屏幕选择区域,包括下述至少一项:
    如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的至少三个触摸点的连线所围成的封闭区域确定为所述屏幕选择区域;
    如果检测到所述用户的触摸操作为画框触摸操作,则将所述用户触摸生成的框体作为所述屏幕选择区域;
    如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的至少三个触摸点的连线所围成的封闭区域确定为所述屏幕选择区域,以及,如果检测到所述用户的触摸操作为画框触摸操作,则将所述用户触摸生成的框体作为所述屏幕选择区域。
  23. 根据权利要求9所述的方法,在将所述测绘参数发送至所述测绘无人机之前,还包括:
    获取所述测绘无人机所携带的拍照设备的拍摄参数,所述拍摄参数包括所述测绘无人机在设定飞行高度下的单照片拍摄区域,每个拍摄点都对应一个单照片拍摄区域;
    根据预设的照片重叠度指标以及所述单照片拍摄区域,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系;
    所述测绘参数还包括:所述飞行高度,所述飞行高度设置为指示所述测绘无人机以所述飞行高度在所述测绘区域中进行飞行拍摄。
  24. 根据权利要求23所述的方法,其中,根据预设的照片重叠度指标以及所述单照片拍摄区域,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,包括:
    根据所述拍照设备的画幅大小以及所述拍照设备的像元宽度,确定单照片尺寸;
    构建二维坐标系,并在所述二维坐标系中选择目标点作为中心拍摄点;
    根据所述中心拍摄点以及所述单照片尺寸,在所述二维坐标系中生成中心照片;
    在所述中心照片的左上角、左下角、右上角以及右下角,分别生成与所述中心照片满足所述照片重叠度指标的四张周围照片;
    根据所述单照片尺寸与所述单照片拍摄区域之间的映射关系,确定与每个所述周围照片对应的周围拍摄点在所述二维坐标系中的坐标值;
    根据所述中心拍摄点以及每个所述周围拍摄点在所述二维坐标系中的坐标值,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系。
  25. 根据权利要求23所述的方法,在获取测绘无人机所携带的拍照设备的拍摄参数之前,还包括:
    根据所述拍照设备的像元宽度、所述拍照设备的镜头焦距以及地面像元分辨率,计算所述设定飞行高度。
  26. 根据权利要求23所述的方法,其中,获取测绘无人机所携带的拍照设备的拍摄参数,包括:
    根据所述拍照设备的像元宽度、所述拍摄设备的画幅区域以及地面像元分辨率,计算所述测绘无人机在所述设定飞行高度下的单照片拍摄区域。
  27. 一种作业无人机侧作业控制方法,应用于如权利要求1-3任一项所述的作业控制系统中,包括:
    接收控制终端发送的作业航线;
    根据所述作业航线在所述至少一个作业地块中进行飞行作业。
  28. 一种控制终端侧作业控制装置,应用于如权利要求1-3任一项所述的作业控制系统中,包括:
    地图瓦片数据获取模块,设置为获取与作业区域对应的地图瓦片数据;
    地图显示模块,设置为根据所述地图瓦片数据生成所述作业区域的区域地图进行显示;
    作业地块确定模块,设置为根据用户针对所述区域地图选择的至少一个区域定位点,在所述作业区域内确定至少一个作业地块;
    作业航线生成发送模块,设置为生成与所述作业地块对应的作业航线发送至作业无人机,以使所述作业无人机根据所述作业航线进行飞行作业。
  29. 一种作业无人机侧作业控制装置,应用于如权利要求1-3任一项所述的作业控制系统中,包括:
    作业航线接收模块,设置为接收控制终端发送的作业航线;
    飞行作业模块,设置为根据所述作业航线在所述至少一个作业地块中进行飞行作业。
  30. 一种控制终端,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时实现如权利要求4-26中任一所述的方法。
  31. 一种作业无人机,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时实现如权利要求27所述的方法。
  32. 一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现如权利要求4-26中任一所述的控制终端侧作业控制方法,或者实现如权利要求27所述的作业无人机侧作业控制方法。
PCT/CN2018/116661 2018-11-21 2018-11-21 一种作业控制系统、作业控制方法、装置、设备及介质 WO2020103024A1 (zh)

Priority Applications (7)

Application Number Priority Date Filing Date Title
EP18940965.9A EP3885940A4 (en) 2018-11-21 2018-11-21 TASK CONTROL SYSTEM, TASK CONTROL PROCESS, APPARATUS, DEVICE AND SUPPORT
AU2018450271A AU2018450271B2 (en) 2018-11-21 2018-11-21 Operation control system, and operation control method and device
CA3120732A CA3120732A1 (en) 2018-11-21 2018-11-21 Operation control system, operation control method and device and medium
PCT/CN2018/116661 WO2020103024A1 (zh) 2018-11-21 2018-11-21 一种作业控制系统、作业控制方法、装置、设备及介质
JP2021527156A JP2022509082A (ja) 2018-11-21 2018-11-21 作業制御システム、作業制御方法、装置及びデバイス
CN201880080716.3A CN111868656B (zh) 2018-11-21 2018-11-21 一种作业控制系统、作业控制方法、装置、设备及介质
KR1020217016659A KR20210106422A (ko) 2018-11-21 2018-11-21 작업 제어 시스템, 작업 제어 방법, 장치 및 기기

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/116661 WO2020103024A1 (zh) 2018-11-21 2018-11-21 一种作业控制系统、作业控制方法、装置、设备及介质

Publications (1)

Publication Number Publication Date
WO2020103024A1 true WO2020103024A1 (zh) 2020-05-28

Family

ID=70774286

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/116661 WO2020103024A1 (zh) 2018-11-21 2018-11-21 一种作业控制系统、作业控制方法、装置、设备及介质

Country Status (7)

Country Link
EP (1) EP3885940A4 (zh)
JP (1) JP2022509082A (zh)
KR (1) KR20210106422A (zh)
CN (1) CN111868656B (zh)
AU (1) AU2018450271B2 (zh)
CA (1) CA3120732A1 (zh)
WO (1) WO2020103024A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023000278A1 (zh) * 2021-07-22 2023-01-26 深圳市大疆创新科技有限公司 作业规划方法、控制终端及存储介质

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114038267B (zh) * 2021-11-30 2023-06-06 重庆电子工程职业学院 一种无人机航空摄影测量教学系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120141026A1 (en) * 2010-12-03 2012-06-07 Electronics & Telecommunications Research Institute Method and system for providing tile map service using image fusion
CN104573733A (zh) * 2014-12-26 2015-04-29 上海交通大学 一种基于高清正射影像图的高精细地图生成系统及方法
CN108536829A (zh) * 2018-04-11 2018-09-14 中国中医科学院中药研究所 一种提高无人机航测数据生成瓦片地图效率的方法
CN108536863A (zh) * 2018-04-20 2018-09-14 曜宇航空科技(上海)有限公司 一种基于无人机的地图中选定区域更新方法及系统
CN108647252A (zh) * 2018-04-20 2018-10-12 曜宇航空科技(上海)有限公司 一种基于无人机的地图中选定特征更新方法及系统
CN108846004A (zh) * 2018-04-20 2018-11-20 曜宇航空科技(上海)有限公司 一种基于无人机的地图中选定目标更新方法及系统

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105352481A (zh) * 2015-10-23 2016-02-24 武汉苍穹电子仪器有限公司 高精度无人机影像无控制点测绘成图方法及系统
US20170221241A1 (en) * 2016-01-28 2017-08-03 8681384 Canada Inc. System, method and apparatus for generating building maps
JP6700868B2 (ja) * 2016-03-03 2020-05-27 キヤノン株式会社 撮影制御装置及びその制御方法
CN106043694B (zh) * 2016-05-20 2019-09-17 腾讯科技(深圳)有限公司 一种控制飞行器飞行的方法、移动终端、飞行器及系统
CN106477038B (zh) * 2016-12-20 2018-12-25 北京小米移动软件有限公司 图像拍摄方法及装置、无人机
CN107797565A (zh) * 2017-10-13 2018-03-13 南京涵曦月自动化科技有限公司 一种实时监控无人机控制面板
CN108008735A (zh) * 2017-11-07 2018-05-08 深圳常锋信息技术有限公司 无人机的植保作业控制方法、系统及终端设备
CN108701373B (zh) * 2017-11-07 2022-05-17 深圳市大疆创新科技有限公司 基于无人机航拍的三维重建方法、系统及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3885940A4 *

Also Published As

Publication number Publication date
CN111868656A (zh) 2020-10-30
EP3885940A4 (en) 2021-10-27
AU2018450271A1 (en) 2021-06-24
KR20210106422A (ko) 2021-08-30
AU2018450271B2 (en) 2022-12-15
CA3120732A1 (en) 2020-05-28
JP2022509082A (ja) 2022-01-20
CN111868656B (zh) 2022-11-08
EP3885940A1 (en) 2021-09-29

Similar Documents

Publication Publication Date Title
WO2020103022A1 (zh) 一种测绘系统、测绘方法、装置、设备及介质
WO2020103020A1 (zh) 一种测绘采样点的规划方法、装置、控制终端及存储介质
WO2020103023A1 (zh) 一种测绘系统、测绘方法、装置、设备及介质
WO2020103019A1 (zh) 一种测绘采样点的规划方法、装置、控制终端及存储介质
WO2020103021A1 (zh) 一种测绘采样点的规划方法、装置、控制终端及存储介质
WO2020103024A1 (zh) 一种作业控制系统、作业控制方法、装置、设备及介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18940965

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021527156

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 3120732

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018450271

Country of ref document: AU

Date of ref document: 20181121

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2018940965

Country of ref document: EP

Effective date: 20210621