WO2020103023A1 - Surveying and mapping system, method, apparatus, device and medium - Google Patents

Surveying and mapping system, method, apparatus, device and medium

Info

Publication number
WO2020103023A1
WO2020103023A1 (application PCT/CN2018/116660; CN2018116660W)
Authority
WO
WIPO (PCT)
Prior art keywords
surveying
mapping
area
shooting
point
Prior art date
Application number
PCT/CN2018/116660
Other languages
English (en)
Chinese (zh)
Inventor
Liu Peng (刘鹏)
Jin Xiaohui (金晓会)
Original Assignee
Guangzhou Xaircraft Technology Co., Ltd. (广州极飞科技有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co., Ltd. (广州极飞科技有限公司)
Priority to AU2018449839A (AU2018449839B2)
Priority to EP18941071.5A (EP3885702A4)
Priority to CA3120727A (CA3120727A1)
Priority to KR1020217016658A (KR20210105345A)
Priority to PCT/CN2018/116660 (WO2020103023A1)
Priority to JP2021527154A (JP7182710B2)
Priority to CN201880091778.4A (CN112469967B)
Publication of WO2020103023A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B64C39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0004 Transmission of traffic-related information to or from an aircraft
    • G08G5/0013 Transmission of traffic-related information to or from an aircraft with a ground station
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/003 Flight plan management
    • G08G5/0034 Assembly of a flight plan
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 Navigation or guidance aids for a single aircraft
    • G08G5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073 Surveillance aids
    • G08G5/0086 Surveillance aids for monitoring terrain
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003 Maps
    • G09B29/004 Map manufacture or repair; Tear or ink or water resistant maps; Long-life maps
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/16 Image acquisition using multiple overlapping images; Image stitching

Definitions

  • The embodiments of the present disclosure relate to the field of surveying and mapping technology, for example, to a surveying and mapping system, a surveying and mapping method, an apparatus, a device, and a medium.
  • UAV aerial surveying and mapping (referred to as aerial survey) technology can greatly reduce the work cycle and the investment of human and financial resources required by traditional aerial surveying and mapping technology, and has practical significance in the field of surveying and mapping.
  • UAV aerial surveying and mapping technology observes the current conditions of the aerial photography area through remote video transmission from onboard video capture equipment, and uses aerial image stitching technology to stitch the captured photos into an overall image of the aerial photography area.
  • When taking photos, the traditional UAV aerial survey method generally traverses the surveying and mapping area along parallel lines, and, to ensure successful stitching, usually requires a certain degree of overlap between every two consecutive photos.
  • A photo is required to have a certain degree of overlap with other photos in both the horizontal and vertical directions.
  • The degree of overlap is generally required to be greater than 50%.
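The overlap requirement above is why the traditional parallel-line method produces so many photos. A minimal sketch of the relationship between required overlap and photo count along a single flight strip (the strip length and footprint values are illustrative assumptions, not figures from the patent):

```python
import math

def photos_per_strip(strip_length_m, footprint_m, overlap):
    """Photos needed to cover one flight strip when each pair of
    consecutive photos must overlap by `overlap` of the footprint."""
    if strip_length_m <= footprint_m:
        return 1
    step = footprint_m * (1.0 - overlap)  # forward distance between exposures
    return 1 + math.ceil((strip_length_m - footprint_m) / step)

# A 500 m strip covered by photos with a 100 m ground footprint:
low_overlap = photos_per_strip(500, 100, 0.10)   # modest overlap
high_overlap = photos_per_strip(500, 100, 0.50)  # >50% overlap for stitching
```

Raising the overlap from 10% to 50% noticeably increases the exposure count per strip, and the effect compounds across the many parallel strips of a large plot.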
  • The applicant found that the related technology has the following defects: when the traditional UAV aerial survey method is used to survey and map a large aerial photography area, a large number of high-overlap photos must be taken during the survey.
  • Stitching the photos taken by the drone takes a long time and the efficiency is low.
  • If the photos obtained by the drone are uploaded to a server for stitching, the data upload and processing take even longer.
  • When the traditional UAV aerial survey method is applied to surveying and mapping small plots, the operation is not only complicated, but the processing time is long and the hardware cost is high.
  • The embodiments of the present disclosure provide a surveying and mapping system, a surveying and mapping method, an apparatus, a device, and a medium, to reduce the cost of surveying and mapping and improve its efficiency.
  • An embodiment of the present disclosure provides a surveying and mapping system, including: a control terminal and a surveying and mapping drone, wherein:
  • the control terminal is configured to determine the surveying and mapping parameters matching the surveying and mapping area and send them to the surveying and mapping drone, where the surveying and mapping parameters include: a plurality of sampling points at which the surveying and mapping drone performs surveying and mapping in the surveying and mapping area;
  • the surveying and mapping drone is configured to receive the surveying and mapping parameters, perform flight shooting in the surveying and mapping area accordingly to obtain a set of surveying and mapping photos corresponding to the plurality of sampling points, and combine and/or stitch the photos in the set to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • the surveying and mapping drone is further configured to generate map tile data corresponding to the surveying and mapping area according to a surveying and mapping map corresponding to the surveying and mapping area.
  • The system further includes: an operation drone;
  • the control terminal is further configured to use the surveying and mapping area as a work area, acquire map tile data corresponding to the work area from the surveying and mapping drone, generate and display an area map of the work area based on the map tile data, determine at least one work plot within the work area based on at least one area anchor point selected by the user on the area map, and generate a work route corresponding to the work plot to send to the operation drone;
  • the operation drone is configured to receive the work route and perform flight operations in the at least one work plot according to the work route.
  • An embodiment of the present disclosure also provides a surveying and mapping method on the control terminal side, which is applied to the surveying and mapping system described in the embodiments of the present disclosure and includes:
  • the surveying and mapping parameters include: a plurality of surveying and sampling points for the surveying and mapping drone to survey and map in the surveying and mapping area;
  • determining the mapping parameters matching the mapping area includes:
  • the reference photographing position point and the plurality of auxiliary photographing position points are used as a plurality of surveying and sampling points for the surveying and mapping drone to survey and map in the surveying and mapping area.
  • determining the mapping parameters matching the mapping area includes:
  • according to each shooting point in the combined shooting point set, a plurality of photographing position points are determined in the combined surveying and mapping shooting area;
  • the plurality of photographing position points are used as a plurality of sampling points at which the surveying and mapping drone performs surveying and mapping in the surveying and mapping area.
  • multiple photos taken according to the multiple shooting points in the combined shooting point set have overlapping areas between them; and/or
  • the combined surveying and mapping shooting area is the shooting area formed by combining and/or stitching the multiple photos taken according to the multiple shooting points in the combined shooting point set;
  • the combined surveying and mapping shooting areas are combined and/or stitched to form the surveying and mapping map of the surveying and mapping area.
  • the shooting points in the combined shooting point set include: a center shooting point and four surrounding shooting points, where the surrounding shooting points are four vertices of a rectangle centered on the center shooting point;
  • the shape of the composite photo obtained by shooting according to each shooting point in the combined shooting point set is a rectangle.
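The five-point layout above (one centre shooting point plus four surrounding points at the vertices of a rectangle centred on it) can be sketched as follows. The choice of offsets so that each surrounding photo overlaps the centre photo by a fixed fraction is an illustrative assumption; the patent only requires the photos to meet a photo overlap index:

```python
def combined_shooting_points(cx, cy, photo_w, photo_h, overlap=0.5):
    """Return the centre shooting point and the four surrounding points,
    placed at the vertices of a rectangle centred on (cx, cy). With an
    offset of footprint * (1 - overlap), each surrounding photo overlaps
    the centre photo by `overlap` along each axis."""
    dx = photo_w * (1.0 - overlap)
    dy = photo_h * (1.0 - overlap)
    surrounding = [
        (cx - dx, cy + dy),  # upper-left vertex
        (cx + dx, cy + dy),  # upper-right vertex
        (cx + dx, cy - dy),  # lower-right vertex
        (cx - dx, cy - dy),  # lower-left vertex
    ]
    return (cx, cy), surrounding

# Hypothetical 40 m x 30 m single-photo footprint:
centre, around = combined_shooting_points(0.0, 0.0, 40.0, 30.0)
```

The five photos taken at these points can then be merged into one rectangular composite photo, as the bullet above describes.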
  • obtaining a reference photographing location point corresponding to the surveying and mapping area includes:
  • detecting a user's touch operation in the human-computer interaction interface and determining a screen position point according to the touch operation includes at least one of the following:
  • the user's touch point is determined as the screen position point;
  • a point within the frame generated by the user's touch is selected as the screen position point.
  • obtaining a reference photographing location point corresponding to the surveying and mapping area includes:
  • the surveying and mapping unmanned aerial vehicle is preset at a position matching the surveying and mapping area.
  • Before sending the location query information to the surveying and mapping drone, the method further includes:
  • the flight control instruction is set to: control the mapping drone to move in the air in a set direction and / or a set distance.
  • obtaining a reference photographing location point corresponding to the surveying and mapping area includes:
  • establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point includes:
  • the positioning key points include: a corner point of the surveying and mapping area and a center point of the surveying and mapping area;
  • a shooting point matching the position information is selected from the combined shooting point set to establish a mapping relationship with the reference photographing position point.
  • determining one or more combined surveying and mapping shooting areas within the surveying and mapping area includes:
  • if the combined surveying and mapping shooting areas cannot completely cover the surveying and mapping area, selecting a new positioning point in the surveying and mapping area and returning to the operation of determining one combined surveying and mapping shooting area according to the positioning point and the combined shooting area, until the determined combined surveying and mapping shooting areas completely cover the surveying and mapping area.
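The coverage loop just described (keep selecting new positioning points until the combined shooting areas cover the whole area) can be sketched for a rectangular mapping area. Using a regular grid of positioning points is an illustrative simplification of the iterative selection the patent describes:

```python
import math

def positioning_points(area_w, area_h, block_w, block_h):
    """Select positioning points so that combined shooting areas of size
    block_w x block_h, centred on the points, completely cover a
    rectangular mapping area of size area_w x area_h."""
    cols = math.ceil(area_w / block_w)  # blocks needed across
    rows = math.ceil(area_h / block_h)  # blocks needed down
    return [((c + 0.5) * block_w, (r + 0.5) * block_h)
            for r in range(rows) for c in range(cols)]

# A 250 m x 180 m area covered by 100 m x 100 m combined shooting areas:
points = positioning_points(250.0, 180.0, 100.0, 100.0)
```

Each returned point plays the role of one positioning point, and the loop terminates exactly when the grid of blocks covers the area.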
  • determining a plurality of shooting position points in the surveying and mapping combined shooting area includes:
  • each of the surrounding shooting points is mapped into the combined surveying and mapping shooting area;
  • the resulting plurality of mapping points serve as the photographing position points.
  • Before determining one or more combined surveying and mapping shooting areas in the surveying and mapping area according to the combined shooting area corresponding to the combined shooting point set and the surveying and mapping area information, the method further includes:
  • acquiring, from the map data currently displayed in the human-computer interaction interface, a geographic location area matching the screen selection area as the surveying and mapping area information.
  • detecting a user's touch operation in the human-computer interaction interface and acquiring a screen selection area matching the touch operation include:
  • the closed area enclosed by the connection line of at least three touch points of the user is determined as the screen selection area; and / or
  • the frame generated by the user's touch is used as the screen selection area.
  • Before sending the surveying and mapping parameters to the surveying and mapping drone, the method further includes:
  • the shooting parameters include the single-photo shooting area of the surveying and mapping drone at a set flight height, and each shooting point corresponds to one single-photo shooting area;
  • the surveying and mapping parameters further include: the flying height, and the flying height is set to instruct the surveying and mapping unmanned aerial vehicle to perform flight shooting in the surveying and mapping area at the flying height.
  • determining the preset relative position relationship between the shooting points in the combined shooting point set includes:
  • four surrounding photos that meet the photo overlap index with the center photo are generated at the upper left corner, the lower left corner, the upper right corner, and the lower right corner of the center photo, respectively;
  • the preset relative position relationship between the shooting points in the combined shooting point set is determined according to the coordinate values of the central shooting point and each surrounding shooting point in the two-dimensional coordinate system.
  • Before acquiring the shooting parameters of the camera device carried by the surveying and mapping drone, the method further includes:
  • the set flying height is calculated according to the pixel width of the camera device, the lens focal length of the camera device, and the ground pixel resolution.
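The flying-height calculation above follows from the pinhole-camera relation between ground pixel resolution (ground sampling distance), pixel width, and focal length. A minimal sketch, with illustrative sensor values that are assumptions rather than figures from the patent:

```python
def set_flying_height(pixel_width_m, focal_length_m, ground_resolution_m):
    """By similar triangles, ground_resolution = pixel_width * H / focal_length,
    so the flying height is H = ground_resolution * focal_length / pixel_width."""
    return ground_resolution_m * focal_length_m / pixel_width_m

# Hypothetical camera: 2.4 um pixels, 8 mm lens, 4.8 cm-per-pixel target:
height_m = set_flying_height(2.4e-6, 8.0e-3, 0.048)
```

Finer target resolutions or smaller pixels lower the required flying height proportionally.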
  • obtaining the shooting parameters of the camera device carried by the surveying and mapping drone includes:
  • An embodiment of the present disclosure also provides a surveying and mapping method on the UAV side, which is applied to the surveying and mapping system described in the embodiments of the present disclosure and includes:
  • receiving the surveying and mapping parameters sent by the control terminal, where the surveying and mapping parameters are determined by the control terminal according to the surveying and mapping area and include: a plurality of sampling points at which the surveying and mapping drone performs surveying and mapping in the surveying and mapping area;
  • performing flight shooting in the surveying area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to the plurality of surveying and sampling points including:
  • surveying and mapping photos corresponding to each of the sampling points are taken to form the set of surveying and mapping photos.
  • Before receiving the surveying and mapping parameters sent by the control terminal, the method further includes:
  • the control terminal determines a reference photographing location point.
  • mapping parameters further include: flying height;
  • Performing flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and photographing photos corresponding to the plurality of surveying and sampling points includes:
  • combining and/or stitching the multiple photos in the set of surveying and mapping photos to obtain a surveying and mapping map corresponding to the surveying and mapping area includes:
  • stitching each central surveying and mapping photo and the corresponding surrounding surveying and mapping photos into one combined photo;
  • the surveying map of the surveying area includes at least one of the following:
  • a digital surface model of the surveying and mapping area, a three-dimensional map of the surveying and mapping area, and a flat map of the surveying and mapping area.
  • the method further includes:
  • An embodiment of the present disclosure also provides a surveying and mapping device on the control terminal side, which is applied to the surveying and mapping system described in the embodiment of the present disclosure and includes:
  • the mapping parameter determination module is set to: determine the mapping parameters that match the mapping area, wherein the mapping parameters include: a plurality of mapping sampling points mapped by the mapping drone in the mapping area;
  • the mapping parameter sending module is configured to send the mapping parameter to the mapping drone.
  • An embodiment of the present disclosure also provides a surveying and mapping device on the UAV side, which is applied to the surveying and mapping system described in the embodiments of the present disclosure and includes:
  • the surveying and mapping parameter receiving module is configured to receive the surveying and mapping parameters sent by the control terminal, where the surveying and mapping parameters are determined by the control terminal according to the surveying and mapping area and include: a plurality of sampling points at which the surveying and mapping drone performs surveying and mapping in the surveying and mapping area;
  • the surveying and photographing photo collection shooting module is set to: perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a surveying and photographing photo collection corresponding to the plurality of surveying and sampling points;
  • the surveying and mapping map generation module is set to: combine and / or stitch the multiple photos in the surveying and photographing photo collection to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • An embodiment of the present disclosure also provides a control terminal.
  • the control terminal includes:
  • one or more processors;
  • the storage device is configured to store one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors implement any surveying and mapping method on the control terminal side described in the embodiments of the present disclosure.
  • An embodiment of the present disclosure also provides a computer storage medium on which a computer program is stored, and when the program is executed by a processor, the method for controlling the terminal-side surveying and mapping provided by the embodiment of the present disclosure is implemented.
  • An embodiment of the present disclosure also provides a surveying and mapping drone, which includes:
  • one or more processors;
  • the storage device is configured to store one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors implement any surveying and mapping method on the UAV side described in the embodiments of the present disclosure.
  • An embodiment of the present disclosure also provides a computer storage medium on which a computer program is stored.
  • the program is executed by a processor, the method for surveying and mapping the UAV side provided by the embodiment of the present disclosure is implemented.
  • FIG. 1 is a schematic diagram of a surveying and mapping system provided by Embodiment 1 of the present disclosure
  • FIG. 2 is a flowchart of a surveying and mapping method on the control terminal side provided by Embodiment 2 of the present disclosure;
  • FIG. 3a is a flowchart of a method for controlling terminal side surveying and mapping provided in Embodiment 3 of the present disclosure
  • FIG. 3b is a schematic diagram of the position distribution of each shooting point in a combined shooting point set according to Embodiment 3 of the present disclosure;
  • FIG. 4a is a flowchart of a method for controlling terminal side surveying and mapping provided by Embodiment 4 of the present disclosure
  • FIG. 4b is a schematic diagram of the distribution of each photographing position point provided by Embodiment 4 of the present disclosure;
  • FIG. 5 is a flowchart of a method for surveying and mapping a UAV side according to Embodiment 5 of the present disclosure
  • FIG. 6 is a schematic diagram of a mapping device for controlling a terminal side provided by Embodiment 6 of the present disclosure
  • FIG. 7 is a schematic diagram of a surveying and mapping UAV side surveying and mapping device provided by Embodiment 7 of the present disclosure.
  • FIG. 8 is a schematic structural diagram of a control terminal according to Embodiment 8 of the present disclosure.
  • FIG. 1 is a schematic diagram of a surveying and mapping system provided by Embodiment 1 of the present disclosure.
  • The structure of the surveying and mapping system includes: a control terminal 10 and a surveying and mapping drone 20, of which:
  • the control terminal 10 is set to: determine the mapping parameters that match the surveying and mapping area, and send the mapping parameters to the surveying and mapping UAV 20.
  • the mapping parameters include: a plurality of surveying and sampling points for the surveying and mapping of the drone 20 in the surveying and mapping area;
  • the surveying and mapping UAV 20 is configured to receive the surveying and mapping parameters, perform flight shooting in the surveying and mapping area accordingly to obtain a set of surveying and mapping photos corresponding to the multiple sampling points, and combine and/or stitch the multiple photos in the set to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • the control terminal 10 may be any device that controls the mapping UAV, such as a remote control of the UAV.
  • the embodiments of the present disclosure do not limit the device type of the control terminal.
  • The surveying and mapping unmanned aerial vehicle 20 may be a drone configured to survey and map the surveying and mapping area to obtain data related to it, such as multiple surveying and mapping photos of the area.
  • the surveying and mapping unmanned aerial vehicle 20 is provided with a photographing device, and is set to acquire multiple surveying and mapping photos corresponding to the surveying and mapping area.
  • the mapping system is composed of a control terminal 10 and a mapping drone 20.
  • The control terminal 10 is responsible for determining a plurality of sampling points at which the surveying and mapping drone surveys and maps in the surveying and mapping area, and for sending the surveying and mapping parameters formed from these sampling points to the surveying and mapping drone 20.
  • the surveying and mapping UAV 20 can receive the surveying and mapping parameters determined by the control terminal and perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to a plurality of surveying and sampling points included in the surveying and mapping parameters.
  • the surveying and mapping drone can also combine and / or stitch multiple photos in the surveying and photographing photo collection to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • the surveying and mapping system provided by the embodiments of the present disclosure can greatly reduce the processing time of image data, thereby improving the efficiency of surveying and mapping.
  • the surveying and mapping drone 20 is further configured to generate map tile data corresponding to the surveying and mapping area according to the surveying and mapping map corresponding to the surveying and mapping area.
  • The map tile data is the relevant data for generating a tile map, formed by slicing the map data with a slicing algorithm.
  • the surveying and mapping drone 20 can not only combine and / or stitch multiple photos in the surveying photo collection to obtain a surveying map corresponding to the surveying area, but also use a slicing algorithm according to the obtained surveying map And other technologies to generate map tile data corresponding to the surveyed area.
  • The map tile data can be used to generate a tile map. The pyramid model formed by the tile map is a multi-resolution hierarchical model: from the bottom layer to the top layer of the tile pyramid, the resolution becomes lower and lower, while the geographic range represented remains unchanged.
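The tile pyramid described above can be sketched numerically: every level covers the same geographic extent, but going up the pyramid the metres-per-pixel resolution doubles, so the tile count shrinks. The 256-pixel tile size and the sample extent and resolution are conventional illustrative values, not figures from the patent:

```python
import math

def tile_pyramid(extent_m, base_resolution_m, levels, tile_px=256):
    """Return (level, resolution in m/px, tile count) for each pyramid
    level, from the bottom (finest) layer upward. The geographic extent
    is the same at every level; only the resolution changes."""
    out = []
    res = base_resolution_m
    for level in range(levels):
        per_side = math.ceil(extent_m / (tile_px * res))  # tiles per side
        out.append((level, res, per_side * per_side))
        res *= 2.0  # next level up is half the resolution
    return out

# A 1280 m extent at 5 cm/px base resolution, three levels:
pyramid = tile_pyramid(extent_m=1280.0, base_resolution_m=0.05, levels=3)
```

The 4x drop in tile count per level is what lets the control terminal pick a resolution matching the operation drone's needs without transferring the full-resolution map.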
  • The map tile data generated by the surveying and mapping drone 20 may be used for positioning within the surveying and mapping area.
  • The surveying and mapping system may further include an operation drone. The control terminal 10 is further configured to use the surveying and mapping area as the work area, obtain map tile data corresponding to the work area from the surveying and mapping drone 20, and generate and display an area map of the work area according to the map tile data.
  • According to at least one area anchor point selected by the user on the area map, at least one work plot is determined in the work area, and a work route corresponding to the work plot is generated and sent to the operation drone;
  • the operation drone is configured to receive the work route and perform flight operations in the at least one work plot according to the work route.
  • The operation drone may be a drone that operates on the surveying and mapping area according to the operation requirements, such as detecting the crops, soil, vegetation, or water quality in the area, or spraying pesticides over it.
  • the control terminal 10 may also use the surveying area as a working area, and acquire map tile data corresponding to the working area from the surveying and mapping drone 20. Since the map tile data includes a variety of map data with different resolutions, the control terminal 10 can generate an area map corresponding to the work area according to the resolution requirements of the work drone according to the map tile data for display.
  • the user can select at least one area anchor point for the area map.
  • The area anchor point may be used to determine at least one work plot within the work area. For example, a square work plot of 10 m x 10 m is generated with the area anchor point as its center.
  • After the control terminal 10 determines the work plot, it can generate a work route corresponding to the plot and send it to the operation drone. For example, in a 10 m x 10 m square work plot, starting from the upper-left vertex, the drone advances 1 m in a clockwise direction every 5 seconds along the sides of the plot. Different work plots can generate different work routes, which is not limited in the embodiments of the present disclosure.
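The plot-and-route example above can be sketched as follows: a 10 m x 10 m square plot centred on the anchor point, and boundary waypoints walked clockwise from the upper-left vertex in 1 m steps (the timing of one step every 5 s is left to the flight controller). This is an illustrative sketch of the example, not the patent's route-generation algorithm:

```python
def square_plot(ax, ay, side=10.0):
    """Axis-aligned square work plot centred on the area anchor point,
    returned as (x_min, y_min, x_max, y_max)."""
    h = side / 2.0
    return (ax - h, ay - h, ax + h, ay + h)

def clockwise_route(plot, step=1.0):
    """Waypoints along the square plot boundary, clockwise from the
    upper-left vertex, one waypoint per `step` metres."""
    x0, y0, x1, y1 = plot
    n = int(round((x1 - x0) / step))  # steps per side (square plot assumed)
    top = [(x0 + i * step, y1) for i in range(n)]     # left to right
    right = [(x1, y1 - i * step) for i in range(n)]   # top to bottom
    bottom = [(x1 - i * step, y0) for i in range(n)]  # right to left
    left = [(x0, y0 + i * step) for i in range(n)]    # bottom to top
    return top + right + bottom + left

route = clockwise_route(square_plot(0.0, 0.0))
```

For the 10 m plot this yields one waypoint per metre of the 40 m perimeter; other plot shapes would need their own route generators, consistent with the bullet above.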
  • After the operation drone receives the work route, it can perform flight operations in the determined work plot according to the route.
  • The working principle of the surveying and mapping system in the embodiments of the present disclosure is to determine a plurality of sampling points in the surveying and mapping area through the control terminal and send them to the surveying and mapping unmanned aerial vehicle.
  • The drone can then photograph the surveying and mapping area in flight according to the determined sampling points to obtain a set of surveying and mapping photos corresponding to the multiple sampling points, and combine and/or stitch the photos in the set to finally obtain a complete surveying and mapping map of the area.
  • The embodiments of the present disclosure form a new surveying and mapping system from a control terminal and a surveying and mapping drone: the control terminal determines the surveying and mapping parameters matching the surveying and mapping area and sends them to the drone, and the drone receives the parameters, performs flight shooting in the area accordingly to obtain a set of surveying and mapping photos corresponding to the multiple sampling points, and combines and/or stitches the photos to obtain a surveying and mapping map of the area. This proposes a new surveying and mapping system and method in which the overall planning of multiple sampling points replaces the existing parallel-line traversal planning, solving the problems of high cost and low efficiency in existing UAV aerial survey methods and achieving the technical effects of reducing surveying and mapping costs and improving surveying and mapping efficiency.
  • FIG. 2 is a flowchart of a control-terminal-side surveying and mapping method provided by Embodiment 2 of the present disclosure.
  • This embodiment can be applied to the case of determining a plurality of surveying and sampling points in a surveying area.
  • the method can be executed by a control-terminal-side surveying and mapping device
  • the device can be implemented by software and / or hardware, and can generally be integrated in a control device (for example, a drone remote control) and used in conjunction with a surveying and mapping drone set up to be responsible for aerial photography.
  • the method includes the following operations:
  • Step 210 Determine the mapping parameters that match the mapping area, where the mapping parameters include: a plurality of mapping sampling points that the mapping drone surveys in the mapping area.
  • the mapping area is an area with a clear latitude and longitude range, which can be an area of any shape and any size.
  • the embodiments of the present disclosure do not limit the shape and size of the mapping area.
  • the mapping parameters matched with the mapping area, that is, the multiple sampling points at which the mapping drone performs surveying in the mapping area, can be determined by the control terminal. Determining multiple surveying sampling points through the control terminal can effectively improve the surveying and mapping efficiency of the entire system.
  • Step 220 Send the mapping parameters to the mapping drone.
  • after the control terminal determines the plurality of surveying sampling points, it can send them to the surveying and mapping drone so that the drone can obtain the corresponding set of surveying photos according to those sampling points.
  • the surveying photos that the drone obtains at the multiple surveying sampling points have a certain degree of overlap overall, but a certain degree of overlap is not required between every two consecutive photos, which can greatly reduce the processing time of image data and thereby improve surveying and mapping efficiency.
  • a new method for determining surveying sampling points is proposed, in which the control terminal determines the plurality of sampling points at which the surveying and mapping drone performs surveying in the surveying area and sends the mapping parameters to the drone.
  • FIG. 3a is a flowchart of a method for controlling a terminal-side surveying and mapping provided in Embodiment 3 of the present disclosure.
  • one implementation manner of determining surveying and mapping parameters matching a surveying and mapping area is provided.
  • the method of this embodiment may include:
  • Step 310 Determine the mapping parameters that match the mapping area, where the mapping parameters include: a plurality of mapping sampling points that the mapping drone surveys in the mapping area.
  • step 310 may include the following operations:
  • Step 311 Acquire a reference photographing position point corresponding to the surveying and mapping area, and establish a mapping relationship between a photographing point in the combined photographing point set and the reference photographing position point.
  • the reference photographing location point is a location point in the surveying area, which has matching geographic location coordinates.
  • the above-mentioned location point can be selected by the user in the surveying area (for example, by clicking, or by directly inputting latitude and longitude, etc.), or can be determined automatically according to the shape of the surveying area (for example, its center point or one of its corner points, etc.).
  • the combined shooting point set may be a set of shooting points preset according to a preset distribution rule, and the set may include a plurality of shooting points, and there may be a relative direction and a relative distance relationship between each two shooting points.
  • for example, the combined shooting point set includes 5 shooting points, located at the center and the four vertices of a rectangle, respectively, where the relative distance between each vertex and the center point is 100m, and the vertices lie in the four directions east, south, west, and north.
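  The five-point example above can be written out in relative coordinates. This is an illustrative sketch of the stated example only (names are not from the patent): one center shooting point plus four surrounding points, each 100m from the center in one of the four cardinal directions.

```python
import math

CENTER = (0.0, 0.0)
RADIUS = 100.0  # relative distance from each surrounding point to the center

# Offsets as (east, north) displacements in meters.
OFFSETS = {
    "east":  (RADIUS, 0.0),
    "south": (0.0, -RADIUS),
    "west":  (-RADIUS, 0.0),
    "north": (0.0, RADIUS),
}

combined_points = {"center": CENTER}
combined_points.update({d: (CENTER[0] + dx, CENTER[1] + dy)
                        for d, (dx, dy) in OFFSETS.items()})

for name, (x, y) in combined_points.items():
    print(f"{name:>6}: ({x:7.1f}, {y:7.1f}), "
          f"dist to center = {math.hypot(x, y):.1f} m")
```

  Since only relative positions are stored, the whole set can later be anchored to any geographic reference point, as described below.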
  • all the sampling points corresponding to the surveying area may be obtained according to the combined shooting point set.
  • one of the points in the surveying area may first be determined as the reference photographing position point, and a mapping relationship may then be established between it and one of the shooting points in the combined shooting point set.
  • only the relative positional relationship between the shooting points in the combined shooting point set is determined; they have no correspondence with actual geographic location information, so they cannot be directly mapped into the actual surveying area. However, as soon as one shooting point in the set is given actual geographic location information, the geographic positions of all the shooting points in the set can be determined.
  • multiple photos taken according to the multiple shooting points in the combined shooting point set have overlapping areas between them.
  • the multiple photos can therefore be combined and / or stitched to form a complete combined area.
  • the combined area may completely cover the surveying area, or may only cover a part of the surveying area, which is not limited in this embodiment.
  • FIG. 3b is a schematic diagram of the location distribution of each shooting point in a combined shooting point set provided in Embodiment 3 of the present disclosure.
  • the shooting points in the combined shooting point set include: a center shooting point and four surrounding shooting points, where the surrounding shooting points are the four vertices of a rectangle centered on the center shooting point; the photo synthesized from the photos taken at the shooting points in the combined shooting point set is rectangular in shape.
  • the combined shooting point set may include five shooting points, which are a center shooting point and four surrounding shooting points, respectively.
  • the center shooting point may be the center of a rectangle, and correspondingly, the surrounding shooting points may be four vertices of the rectangle corresponding to the center shooting point.
  • each shooting point has a certain positional relationship with the others, and this positional relationship satisfies the condition that, when the photos taken at the shooting position points determined by the shooting points are combined, a complete rectangular photo is formed.
  • the combination process is to overlay each photo according to the overlapping image between each other.
  • each auxiliary photographing point may rotate around the reference photographing position point according to the user's operation, or move according to the user's sliding operation or the like.
  • the five shooting points in the selected combined shooting point set are a central shooting point and four surrounding shooting points. Each surrounding shooting point only needs to ensure a sufficient degree of overlap with the central shooting point (for example, 60% or 70%, etc.); such a high degree of overlap is not needed between two surrounding shooting points. This greatly reduces the total number of surveying photos that must be taken for a surveying area of fixed size, and can greatly reduce the time and hardware cost of subsequent photo synthesis or stitching.
  • if the solution of the embodiment of the present disclosure is applied to a small plot, for example, one plot can be completely covered after combining or stitching the multiple photos taken at the shooting points in one combined shooting point set.
  • the solution of the disclosed embodiment can be significantly superior to the related art method of parallel line traversal for selecting points in terms of the number of points for surveying and mapping and the difficulty of splicing later.
  • obtaining the reference photographing location point corresponding to the surveying area may include: detecting a user's touch operation in the human-computer interaction interface and determining a screen location point according to the touch operation; and obtaining, from the map data of the surveying area currently displayed in the human-computer interaction interface, a geographic location coordinate matching the screen location point as the reference photographing location point.
  • the reference photographing position point may be determined according to the point specified by the user in the human-computer interaction interface.
  • the map data may be latitude and longitude information.
  • detecting a user's touch operation in the human-machine interaction interface and determining a screen position point according to the touch operation may include at least one of the following:
  • the user's touch point is determined as the screen position point
  • a point within the frame generated by the user's touch is selected as the screen position point.
  • determining a screen position point according to the user's touch operation in the human-computer interaction interface may have multiple implementation manners.
  • the touch point corresponding to the single touch operation of the user may be determined as the screen position point.
  • a point on the line segment generated by the user's stroke touch operation may also be used as the screen position point. For example, take the midpoint of the line segment as the screen position point. It is also possible to use a point inside the user's picture frame touch operation as a screen position point, for example, a middle point in the frame as a screen position point.
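  The three touch-to-screen-point rules above can be sketched minimally. Function names and the choice of segment midpoint and frame center are illustrative assumptions consistent with the examples given, not definitions from the patent.

```python
def screen_point_from_tap(p):
    # A single touch: the touch point itself is the screen position point.
    return p

def screen_point_from_stroke(p0, p1):
    # A stroke: use the midpoint of the traced line segment.
    return ((p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2)

def screen_point_from_frame(top_left, bottom_right):
    # A frame touch: use the middle point of the framed rectangle.
    return ((top_left[0] + bottom_right[0]) / 2,
            (top_left[1] + bottom_right[1]) / 2)

print(screen_point_from_stroke((0, 0), (200, 100)))   # (100.0, 50.0)
print(screen_point_from_frame((10, 10), (110, 60)))   # (60.0, 35.0)
```

  The resulting screen point is then looked up in the currently displayed map data to obtain the matching geographic coordinate.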
  • acquiring the reference photographing location point corresponding to the surveying and mapping area may include: acquiring the center point of the surveying and mapping region as the reference photographing location point.
  • the reference photographing location point may also be automatically generated by the control terminal that controls the surveying and mapping drone.
  • for example, the center point of the surveying area where the drone performs surveying is directly used as the reference photographing position point.
  • acquiring the reference photographing location point corresponding to the surveying area may further include: acquiring geographic location coordinates input by the user as the reference photographing location point.
  • the geographic location coordinates input by the user can also be directly used as the reference photographing location point.
  • the user can input the geographic location coordinates through a soft keyboard in the human-computer interaction interface, a numeric keyboard in the control terminal, or voice input.
  • obtaining the reference photographing location point corresponding to the surveying and mapping area may include: sending location query information to the surveying and mapping drone, and using the geographic location coordinates fed back by the surveying and mapping drone as The reference photographing position point; wherein, the surveying and mapping unmanned aerial vehicle is preset at a position matching the surveying and mapping area.
  • the reference photographing location point may also be determined through the location information specified by the user.
  • the user can send location query information to the surveying and mapping drone through the control terminal.
  • the user triggers a set identifier on the human-machine interaction interface of the control terminal to send location query information to the surveying and mapping drone to query the current position of the surveying and mapping drone.
  • the surveying and mapping unmanned aerial vehicle obtains the current geographic location coordinates through its own positioning device and feeds them back to the control terminal.
  • the control terminal may directly use the location point corresponding to the received geographic location coordinates as the reference photographing location point.
  • when the surveying and mapping drone sends geographic coordinates to the control terminal, its projection point on the ground should be inside the surveying area.
  • before sending the position query information to the surveying and mapping drone, the method may further include: receiving at least one flight control instruction for the surveying and mapping drone input by the user, and sending the flight control instruction to the drone; and, when a position confirmation response input by the user is received, sending a hovering instruction to the drone to control it to hover at its current position; wherein the flight control instruction is set to: control the surveying and mapping drone to move in the air in a set direction and / or by a set distance.
  • the user must input at least one flight control instruction for the mapping drone to the control terminal.
  • the control terminal sends the flight control instruction input by the user to the surveying and mapping drone, so that the surveying and mapping drone travels according to the flight control instruction.
  • the control terminal may send a hovering instruction to the surveying and mapping drone to control it to hover at its current position.
  • establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point may include: establishing a mapping relationship between a shooting point selected by the user in the combined shooting point set and the reference photographing position point.
  • the user can arbitrarily select one of the shooting points in the combined shooting point set, and a mapping relationship is established between the shooting point selected by the user and the reference photographing position point.
  • establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point may further include: establishing a mapping relationship between the central shooting point in the combined shooting point set and the reference photographing position point.
  • establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point may further include: calculating the distance between the reference photographing position point and each positioning key point of the surveying area, the positioning key points including the corner points of the surveying area and its center point; acquiring the positioning key point closest to the reference photographing position point as the target reference point; and, according to the position information of the target reference point within the surveying area, selecting a shooting point matching that position information in the combined shooting point set and establishing a mapping relationship between it and the reference photographing position point.
  • the mapping relationship may also be determined according to the distance relationship between the reference photographing position point and each key point in the surveying area.
  • the corner points of the surveying area and its center point are used as positioning key points; the distance between the reference photographing position point and each positioning key point is calculated, and the positioning key point closest to the reference photographing position point is obtained as the target reference point.
  • a shooting point matching the position information is then selected in the combined shooting point set to establish a mapping relationship with the reference photographing position point. For example, if the target reference point is located at the upper left of the surveying area, the shooting point in the upper left corner can be selected from the combined shooting point set to establish a mapping relationship with the reference photographing position point.
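  The key-point selection above can be sketched as follows. This is a hedged illustration: a rectangular surveying area is assumed, the positioning key points are its four corners plus its center, and the closest one to the reference photographing position point becomes the target reference point.

```python
import math

def target_reference_point(reference, corners):
    """Return the positioning key point closest to the reference point."""
    # Center of the (assumed rectangular) surveying area.
    cx = sum(x for x, _ in corners) / len(corners)
    cy = sum(y for _, y in corners) / len(corners)
    key_points = corners + [(cx, cy)]  # corners plus center point
    return min(key_points, key=lambda p: math.dist(p, reference))

corners = [(0, 0), (100, 0), (100, 100), (0, 100)]  # surveying-area corners
ref = (10, 90)  # reference photographing position point, near the upper left
print(target_reference_point(ref, corners))  # closest key point: (0, 100)
```

  The position of the returned key point within the area (e.g. "upper left") then decides which shooting point in the combined shooting point set is mapped to the reference photographing position point.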
  • Step 312 Determine a plurality of auxiliary photographing location points corresponding to the reference photographing location point according to the preset relative position relationship between each photographing point in the combined photographing point set and the mapping relationship.
  • the auxiliary photographing location point may be other location points in the surveying area that are different from the reference photographing location point.
  • according to the preset relative position relationship between the shooting points in the combined shooting point set and the determined mapping relationship, the other auxiliary photographing location points corresponding to the reference photographing location point may be further determined.
  • for example, the combined shooting point set includes a total of 5 shooting points, where the central shooting point in the set and the reference photographing position point establish a mapping relationship; the positional relationships between the other four shooting points in the set and the central shooting point then determine the four auxiliary photographing position points corresponding to the reference photographing position point.
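  Step 312 as described above amounts to adding the preset offsets to the anchored point. A minimal sketch, assuming the central shooting point is the one mapped to the reference photographing position point; the offset values (meters east / north) are illustrative, not from the patent.

```python
# Preset relative positions of the four surrounding shooting points
# with respect to the central shooting point (illustrative values).
REL_OFFSETS = [(-50.0, 50.0), (50.0, 50.0), (-50.0, -50.0), (50.0, -50.0)]

def auxiliary_points(reference, offsets=REL_OFFSETS):
    """Map each surrounding shooting point through the anchored center."""
    rx, ry = reference
    return [(rx + dx, ry + dy) for dx, dy in offsets]

reference = (500.0, 800.0)  # reference photographing position point
sampling_points = [reference] + auxiliary_points(reference)
print(sampling_points)  # five surveying sampling points in total
```

  The reference point together with the four derived points then forms the set of surveying sampling points sent to the drone (step 313).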
  • Step 313 Use the reference photographing location point and the plurality of auxiliary photographing location points as a plurality of surveying sampling points for the surveying and mapping drone to survey in the surveying area.
  • the reference photographing position point and the auxiliary photographing position point can be used as the surveying and mapping sampling points for the surveying and mapping of the drone in the surveying and mapping area.
  • the surveying and mapping drone can perform aerial photography at each surveying sampling point and send the photos obtained to the corresponding control terminal or ground terminal, so that the control terminal can synthesize the obtained photos into the final surveying image.
  • the mapping drone can realize the synthesis of multiple photos in the local machine.
  • the photographs obtained at each surveying sampling point by the sampling-point planning method provided in the embodiments of the present disclosure do not require a certain degree of overlap between every two consecutive photographs, so the processing time of image data can be greatly reduced.
  • Step 320 Send the mapping parameters to the mapping drone.
  • a mapping relationship is established between a shooting point in the combined shooting point set and the reference photographing position point; at the same time, a plurality of auxiliary photographing position points corresponding to the reference photographing position point are determined according to the preset relative position relationship between the shooting points in the combined shooting point set and the mapping relationship; the reference photographing position point and the plurality of auxiliary photographing position points are then used as the sampling points at which the drone performs surveying in the surveying area. A new method of planning surveying sampling points is thereby proposed.
  • the overall planning of multiple surveying points based on combined shooting point sets replaces the existing parallel-line traversal planning method, solving the problems of high cost and low efficiency in existing drone aerial survey methods and achieving the technical effect of reducing surveying and mapping costs and improving surveying and mapping efficiency.
  • FIG. 4a is a flowchart of a method for controlling terminal-side surveying and mapping provided in Embodiment 4 of the present disclosure.
  • the method of this embodiment may include:
  • Step 410 Determine the mapping parameters that match the mapping area, where the mapping parameters include: a plurality of mapping sampling points that the mapping drone surveys in the mapping area.
  • step 410 may include the following operations:
  • Step 411 Determine one or more surveying and mapping combined shooting areas in the surveying and mapping area according to the combined shooting area and the mapping area information corresponding to the combined shooting point set.
  • the combined shooting area may be an area where the obtained photos are synthesized after taking pictures according to each shooting point in the set of combined shooting points. That is, the combined shooting area may be an overall photographing area that can be captured by the combined shooting point set.
  • the mapping area information may be related information of the mapping area, such as the area shape or size of the mapping area.
  • a surveying and mapping combined shooting area is a shooting area of the same size as the combined shooting area; each one corresponds to an actual shooting range within the plot, that is, it carries two key pieces of information: the size of the area it covers and the geographic location of that area.
  • the combined shooting area corresponding to the combined shooting point set should be obtained first; the surveying and mapping combined shooting areas within the surveying area can then be determined according to the combined shooting area and the size of the surveying area.
  • if there is one surveying and mapping combined shooting area, it can completely cover the surveying area; if there are multiple surveying and mapping combined shooting areas, they can completely cover the surveying area after synthesis.
  • for example, if the combined shooting area is a square of 100m * 100m and the surveying area is a rectangle of 100m * 200m, at least two surveying and mapping combined shooting areas are needed to completely cover the surveying area.
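  The example count above can be checked with simple arithmetic. A back-of-the-envelope sketch, ignoring the overlap between adjacent areas (which only increases the count): the minimum number of combined shooting areas is the product of the ceilings of the side-length ratios.

```python
import math

def areas_needed(survey_w, survey_h, shot_w, shot_h):
    """Minimum number of shot_w x shot_h areas tiling a survey_w x survey_h area."""
    return math.ceil(survey_w / shot_w) * math.ceil(survey_h / shot_h)

print(areas_needed(100, 200, 100, 100))  # 2, matching the example above
print(areas_needed(250, 250, 100, 100))  # 9
```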
  • multiple photos taken according to multiple shooting points in the combined shooting point set have overlapping areas between them, and / or
  • the combined shooting area for surveying and mapping is a shooting area formed by combining multiple photos and / or stitching after taking multiple photos according to multiple shooting points in the combined shooting point set;
  • the multiple surveying and mapping combined shooting areas are combined and / or spliced to form a surveying map of the surveying area.
  • a surveying and mapping combined shooting area is essentially the same as the combined shooting area, except that the combined shooting area has no correspondence with the surveying area, whereas a surveying and mapping combined shooting area is a separate shooting area formed by division within the surveying area.
  • its shape and size are the same as those of the combined shooting area.
  • the overlapping area between surveying and mapping combined shooting areas can be set according to actual needs; for example, the overlapping area accounts for 30% or 50% of a surveying and mapping combined shooting area, which is not limited in the embodiments of the present disclosure.
  • in order to enable the photos acquired by the surveying and mapping drone to be stitched into an image of the complete surveying area, optionally, there should be overlapping areas between the multiple photos taken by the drone at the multiple shooting points in the combined shooting point set.
  • multiple photos can be combined and / or stitched to form a complete combined area.
  • the combined area may completely cover the surveying area, or may only cover a part of it, which is not limited in this embodiment. It should be noted that having overlapping areas between multiple photos in the embodiments of the present disclosure does not mean that every two consecutive photos overlap.
  • each photo obtained by the surveying and mapping UAV can be synthesized according to the overlapping part to form a complete image
  • determining one or more surveying and mapping combined shooting areas in the surveying area according to the combined shooting area corresponding to the combined shooting point set and the surveying area information may include: selecting a positioning point in the surveying area; determining one surveying and mapping combined shooting area in the surveying area based on the positioning point and the combined shooting area; and, if that surveying and mapping combined shooting area cannot completely cover the surveying area, selecting a new positioning point in the surveying area and returning to the operation of determining a surveying and mapping combined shooting area based on the positioning point and the combined shooting area, until all the surveying and mapping combined shooting areas that can completely cover the surveying area are determined.
  • the positioning point may be a position point in the surveying and mapping area, which is set as: positioning the surveying and mapping combined shooting area in the surveying and mapping area.
  • the positioning point may be a position point selected in the surveying area according to actual needs, such as a corner point or a center point of the surveying area.
  • a surveying and mapping combined shooting area can first be determined in the surveying area through a positioning point. For example, if the surveying area is rectangular, the top left corner vertex of the surveying area can be selected as the positioning point; when the top left corner vertex of the combined shooting area coincides with the positioning point, the combined shooting area forms a corresponding surveying and mapping combined shooting area in the surveying area.
  • if one surveying and mapping combined shooting area cannot completely cover the surveying area, a new positioning point should be selected in the surveying area, and the operation of determining a surveying and mapping combined shooting area based on the positioning point and the combined shooting area should be repeated until all the surveying and mapping combined shooting areas that completely cover the surveying area are determined. It should be noted that, when a new positioning point is selected, there should be an overlapping area between the surveying and mapping combined shooting area determined by the new positioning point and the adjacent surveying and mapping combined shooting areas.
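  The iterative positioning-point selection above can be sketched for a rectangular surveying area. This is a hedged illustration only: it starts at the top-left corner and advances each new positioning point by a stride shortened by the required overlap ratio (the 30% value is one of the examples mentioned above), until the combined shooting areas cover the whole surveying area.

```python
def positioning_points(survey_w, survey_h, shot_w, shot_h, overlap=0.3):
    """Top-left positioning points of overlapping combined shooting areas."""
    stride_x = shot_w * (1 - overlap)  # adjacent areas overlap by `overlap`
    stride_y = shot_h * (1 - overlap)
    points, y = [], 0.0
    while True:
        x = 0.0
        while True:
            points.append((x, y))            # positioning point of one area
            if x + shot_w >= survey_w:       # this row is fully covered
                break
            x += stride_x
        if y + shot_h >= survey_h:           # whole area is fully covered
            break
        y += stride_y
    return points

pts = positioning_points(100, 200, 100, 100)
print(len(pts), pts)  # with 30% overlap, three areas cover 100 m x 200 m
```

  Note that requiring overlap raises the count above the no-overlap minimum of two areas for this example.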
  • before determining one or more surveying and mapping combined shooting areas in the surveying area according to the combined shooting area corresponding to the combined shooting point set and the surveying area information, the method may further include: detecting a user's touch operation in the human-computer interaction interface and obtaining a screen selection area matching the touch operation; and obtaining, in the map data currently displayed in the human-computer interaction interface, a geographic location area matching the screen selection area as the surveying area information.
  • the screen selection area may be an area formed by the user's touch operation in the human-machine interaction interface of the control terminal of the surveying and mapping drone; it may be an area of any shape and size (not exceeding the size of the screen), and the embodiments of the present disclosure do not limit the shape and size of the screen selection area.
  • the mapping area may be designated in real time by the user who controls the mapping drone. For example, the user's touch operation in the human-computer interaction interface is detected to obtain a screen selection area matching the touch operation, the matching geographic location area for the screen selection area is determined according to the map data currently displayed in the interface, and the determined geographic area is used as the mapping area information.
  • detecting a user's touch operation in the human-computer interaction interface and acquiring a screen selection area matching the touch operation may include:
  • the closed area enclosed by the connection line of at least three touch points of the user is determined as the screen selection area; and / or
  • the frame generated by the user's touch is used as the screen selection area.
  • the closed area formed by the detected single touch operation of the user may be used as the screen selection area matching the touch operation.
  • the closed area surrounded by the connection line of at least three touch points of the user is determined as the screen selection area.
  • the frame generated by the detected frame touch operation of the user may also be used as the screen selection area.
  • Step 412 Determine multiple photographing location points in the surveying and mapping combined shooting area according to a preset relative position relationship between each shooting point in the combined shooting point set.
  • the photographing location point may be a location point in the surveying area, with matching geographic location coordinates.
  • the photographing position point may be determined according to a preset relative position relationship between each shooting point in the combined shooting point set.
  • determining a plurality of photographing position points in the surveying and mapping combined shooting area may include: mapping the central shooting point in the combined shooting point set to the midpoint of the surveying and mapping combined shooting area, and using that midpoint as one photographing location point; and, according to the preset relative position relationship between each surrounding shooting point in the combined shooting point set and the central shooting point, mapping each surrounding shooting point into the surveying and mapping combined shooting area and using the resulting mapping points as photographing position points.
  • each shooting point in the combined shooting point set corresponding to the combined shooting area can be mapped into the surveying and mapping combined shooting area and taken as a photographing location point.
  • the central shooting point in the combined shooting point set may be first mapped to the midpoint of the area of the combined surveying and mapping shooting area, so that the midpoint of the area of the combined shooting area of surveying and mapping is used as a photographing location point.
  • each surrounding shooting point may be determined according to the relative positional relationship between each surrounding shooting point in the combined shooting point set and the central shooting point Map to the combined shooting area of surveying and mapping respectively, and use the formed multiple mapping points as the photographing location points.
  • FIG. 4b is a schematic diagram of the distribution of each photographing location point provided by Embodiment 2 of the present disclosure.
  • the two midpoints 40 and 50 are respectively the area midpoints of two surveying and mapping combined shooting areas;
  • the midpoint 40 and its surrounding photographing position points 410 lie within one surveying and mapping combined shooting area;
  • the midpoint 50 and the four surrounding photographing position points 510 form another surveying and mapping combined shooting area;
  • the relative position relationship between the midpoint of each surveying and mapping combined shooting area and its surrounding photographing position points is the same as the preset relative position relationship between each surrounding shooting point in the combined shooting point set and the central shooting point.
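The mapping described above can be sketched in code. This is a minimal illustration, assuming the surrounding shooting points are expressed as offsets (in meters) relative to the central shooting point; the function and variable names are hypothetical, not taken from the disclosure:

```python
# Sketch: map a combined shooting point set (center + surrounding offsets)
# onto one surveying-and-mapping combined shooting area.
def map_shooting_points(area_midpoint, surrounding_offsets):
    """area_midpoint: (x, y) of the combined shooting area's midpoint.
    surrounding_offsets: offsets of each surrounding shooting point
    relative to the central shooting point, same units as the midpoint."""
    cx, cy = area_midpoint
    points = [(cx, cy)]  # the central shooting point maps to the area midpoint
    points += [(cx + dx, cy + dy) for dx, dy in surrounding_offsets]
    return points  # the photographing position points

# Five-point layout: four surrounding points at the vertices of a rectangle.
offsets = [(-10, 10), (10, 10), (-10, -10), (10, -10)]
print(map_shooting_points((100, 200), offsets))
```

Applying the same offsets to each area midpoint reproduces the identical relative position relationship in every combined shooting area, as described for midpoints 40 and 50.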
  • Step 413 Use the plurality of photographing location points as a plurality of sampling points for surveying and mapping in the surveying and mapping area by the drone.
  • the photographing location point can be used as a surveying and mapping sampling point for surveying and mapping by the drone in the surveying and mapping area.
  • the surveying and mapping UAV can perform aerial photography according to each surveying and mapping sampling point, and send the photos obtained by the aerial photography to the corresponding control terminal or ground terminal, so that the control terminal can synthesize the obtained photos into the final surveying and mapping image.
  • the mapping drone can realize the synthesis of multiple photos in the local machine.
  • Step 420 Send the mapping parameters to the mapping drone.
  • before sending the surveying and mapping parameters to the surveying and mapping drone, the method may further include: acquiring the shooting parameters of the photographing device carried by the surveying and mapping drone, where the shooting parameters include a single photo shooting area of the surveying and mapping drone at a set flight altitude, and each shooting point corresponds to a single photo shooting area; and determining the preset relative position relationship between each shooting point in the combined shooting point set according to the preset photo overlap index and the single photo shooting area. The surveying and mapping parameters further include: the flying height, and the flying height is set to: instruct the surveying and mapping drone to perform flight shooting in the surveying and mapping area at the flying height.
  • before acquiring the shooting parameters of the photographing device carried by the surveying and mapping drone, the method may further include: calculating the set flight altitude according to the pixel width of the photographing device, the lens focal length of the photographing device, and the ground pixel resolution.
  • the single photo shooting area is the actual surveying area that can be captured by a single photo.
  • the preset photo overlap index may be an overlap index set according to actual needs, such as 50%, 60%, or 70%. The embodiment of the present disclosure does not limit the value of the preset photo overlap index, but it should be noted that the preset photo overlap index should be such that, when the photos are synthesized according to their overlapping parts, a complete rectangle can be formed.
  • the single photo shooting area of the surveying and mapping drone at the set flight altitude must first be determined, and the preset relative position relationship between each shooting point in the combined shooting point set is then determined according to the size of the single photo shooting area.
  • Each shooting point corresponds to a single photo shooting area, for example, the shooting point is the midpoint or one of the vertices in the single photo shooting area.
  • the preset relative position relationship between each shooting point in the combined shooting point set can be determined according to the preset photo overlap index and the single photo shooting area.
  • the surveying and mapping parameters in the embodiments of the present disclosure may further include the flying height, which is set to: instruct the surveying and mapping unmanned aerial vehicle to perform flight shooting in the surveying and mapping area at the flying height.
  • the flying height of the mapping UAV directly affects the ground pixel resolution.
  • the ground pixel resolution directly determines the area of the surveying and mapping area that can be covered by a single photo. Therefore, before using the surveying and mapping drone to take aerial photographs of the surveying and mapping area, the set flying height of the surveying and mapping drone must first be determined.
  • the set flying height of the mapping UAV can be calculated according to the pixel width of the camera device, the lens focal length of the camera device, and the resolution of the ground pixel.
  • ground pixel resolution = flight height * pixel width / lens focal length
  • flight height = ground pixel resolution * lens focal length / pixel width
  • pixel width = sensor size width of the photographing device / frame width
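These two relations can be checked numerically. The sketch below uses assumed camera values (a 4.4 µm pixel width and an 8.8 mm lens focal length, all lengths in meters); they are illustrative only, not values given by the disclosure:

```python
def ground_pixel_resolution(flight_height_m, pixel_width_m, focal_length_m):
    # ground pixel resolution = flight height * pixel width / lens focal length
    return flight_height_m * pixel_width_m / focal_length_m

def required_flight_height(gsd_m, pixel_width_m, focal_length_m):
    # flight height = ground pixel resolution * lens focal length / pixel width
    return gsd_m * focal_length_m / pixel_width_m

# Assumed example: 4.4 um pixels, 8.8 mm focal length, target GSD 0.05 m.
h = required_flight_height(0.05, 4.4e-6, 8.8e-3)
print(round(h, 6))  # 100.0 (meters)
```

The two functions are inverses of each other: flying at the computed height reproduces the target ground pixel resolution.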
  • obtaining the shooting parameters of the photographing device carried by the surveying and mapping drone may include: calculating the single photo shooting area of the surveying and mapping drone at the set flight altitude according to the pixel width of the photographing device, the frame size of the photographing device, and the ground pixel resolution.
  • the single photo shooting area of the mapping drone at the set flight height may be calculated according to the pixel width of the camera device, the frame size of the camera device, and the resolution of the ground pixel.
  • single photo shooting area = ground pixel resolution * frame size, where:
  • ground pixel resolution = flight height * pixel width / lens focal length
  • single photo shooting length = ground pixel resolution * frame length
  • single photo shooting width = ground pixel resolution * frame width. For example, if the frame size is 3456 * 4608 and the ground pixel resolution is 0.05 m, the single photo shooting area is 172.8 m * 230.4 m.
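The example above follows directly from these formulas. A minimal sketch (frame size in pixels, ground pixel resolution in meters per pixel; the function name is illustrative):

```python
def single_photo_shooting_area(gsd_m, frame_length_px, frame_width_px):
    # single photo shooting length = ground pixel resolution * frame length
    # single photo shooting width  = ground pixel resolution * frame width
    return gsd_m * frame_length_px, gsd_m * frame_width_px

# Frame size 3456 * 4608 at a ground pixel resolution of 0.05 m:
length_m, width_m = single_photo_shooting_area(0.05, 3456, 4608)
print(round(length_m, 1), round(width_m, 1))  # 172.8 230.4
```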
  • determining the preset relative position relationship between each shooting point in the combined shooting point set according to the preset photo overlap index and the single photo shooting area may include: determining the size of a single photo according to the frame size of the photographing device and the pixel width of the photographing device; constructing a two-dimensional coordinate system, and selecting a target point in the two-dimensional coordinate system as the center shooting point; generating a center photo in the two-dimensional coordinate system according to the center shooting point and the size of the single photo; generating, at the upper left corner, the lower left corner, the upper right corner, and the lower right corner of the center photo, four surrounding photos that meet the photo overlap index with the center photo; determining, according to the mapping relationship between the size of the single photo and the single photo shooting area, the coordinate values of the surrounding shooting points corresponding to each of the surrounding photos in the two-dimensional coordinate system; and determining the preset relative position relationship between each shooting point in the combined shooting point set according to the coordinate values of the center shooting point and each of the surrounding shooting points in the two-dimensional coordinate system.
  • the target point may be any point in the two-dimensional coordinate system.
  • the target point may be the origin of the two-dimensional coordinate system.
  • the center photo and its four surrounding photos are not real photos, but a rectangular area with the same size and shape as a single photo.
  • the coordinate value of the surrounding shooting point corresponding to each surrounding photo in the two-dimensional coordinate system can be determined according to the mapping relationship between the size of the single photo and the single photo shooting area.
  • for example, assume the single photo size is 10 cm * 10 cm
  • and the photo overlap index is 50%.
  • the surrounding photos at the upper left corner, the lower left corner, the upper right corner, and the lower right corner are mapped to the single photo shooting areas at the upper left corner, the lower left corner, the upper right corner, and the lower right corner, respectively.
  • if the mapping relationship between the size of the single photo and the single photo shooting area is 1:200, the single photo shooting area is correspondingly 20 m * 20 m. Taking the midpoint of each surrounding photo as the surrounding shooting point, with the center shooting point at the coordinate origin, the coordinate values of the surrounding shooting points are (-10, 10), (-10, -10), (10, 10), and (10, -10), in meters.
  • the preset relative position relationship between each shooting point in the combined shooting point set can then be determined according to the coordinate values of the center shooting point and each surrounding shooting point in the two-dimensional coordinate system.
  • in this example, the relative distance between adjacent surrounding shooting points located at the vertices of the combined shooting point set is 20 m,
  • and the relative distance between the center shooting point at the center and each surrounding shooting point is 10√2 m (approximately 14.14 m).
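This worked example can be checked numerically. A brief sketch, assuming each surrounding shooting point is the midpoint of its photo and is therefore offset from the center shooting point by (1 - overlap index) * side length along each axis:

```python
import math

def surrounding_offsets(photo_side_m, overlap_index):
    # Each surrounding photo is shifted by (1 - overlap) * side length along
    # each axis, so its midpoint (the surrounding shooting point) is offset
    # by the same amount from the center shooting point at the origin.
    d = (1 - overlap_index) * photo_side_m
    return [(-d, d), (-d, -d), (d, d), (d, -d)]  # UL, LL, UR, LR

# Single photo of 10 cm * 10 cm at scale 1:200 -> 20 m * 20 m on the ground.
pts = surrounding_offsets(20.0, 0.5)
print(pts)                        # [(-10.0, 10.0), (-10.0, -10.0), (10.0, 10.0), (10.0, -10.0)]
print(math.dist(pts[0], pts[2]))  # 20.0 (adjacent vertices)
print(math.dist((0, 0), pts[0]))  # ~14.142 = 10 * sqrt(2) (center to vertex)
```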
  • a new method of planning surveying and mapping sampling points is proposed, which uses the overall planning method of multiple surveying and mapping points based on a combined shooting point set to replace the existing parallel line movement planning method, thereby solving the problems of high cost and low surveying efficiency in existing UAV aerial survey methods
  • and achieving the technical effect of reducing the cost of surveying and mapping and improving the efficiency of surveying and mapping.
  • FIG. 5 is a flowchart of a method for surveying and mapping a UAV side according to Embodiment 5 of the present disclosure.
  • This embodiment can be applied to a situation where a set of surveying and mapping photos corresponding to multiple surveying and mapping sampling points is acquired. The method is executed by a drone-side surveying and mapping apparatus, which can be implemented by software and/or hardware, can generally be integrated in the drone equipment, and is used in conjunction with a control terminal set to control the drone.
  • the method includes the following operations:
  • Step 510 Receive the surveying and mapping parameters sent by the control terminal, wherein the surveying and mapping parameters are determined by the control terminal according to the surveying and mapping area, and the surveying and mapping parameters include: multiple surveying and mapping sampling points that the surveying and mapping drone surveys in the surveying and mapping area.
  • after the control terminal determines the surveying and mapping parameters that match the surveying and mapping area, that is, the multiple surveying and mapping sampling points that the surveying and mapping drone surveys in the surveying and mapping area, it can send the determined multiple surveying and mapping sampling points to the surveying and mapping drone.
  • before receiving the surveying and mapping parameters sent by the control terminal, the method may further include: receiving at least one flight control instruction sent by the control terminal, and moving in the air in a set direction and/or a set distance according to the flight control instruction; hovering at the current location according to the hovering instruction sent by the control terminal; and feeding back the geographic location coordinates of the current location to the control terminal according to the location query information sent by the control terminal, wherein the geographic location coordinates are set to enable the control terminal to determine a reference photographing location point.
  • when the control terminal determines the reference photographing location point through position information specified by the user, the user may input at least one flight control instruction for the surveying and mapping drone into the control terminal.
  • the control terminal sends the flight control command input by the user to the surveying and mapping drone, and the surveying and mapping drone travels according to the received flight control command, that is, moves in the air in a set direction and / or a set distance.
  • the control terminal may send a hovering instruction to the surveying and mapping drone to control the surveying and mapping drone to hover at the current position.
  • control terminal sends location query information to the surveying and mapping drone, and the surveying and mapping drone can feed back the geographic location coordinates of the current location to the control terminal.
  • the control terminal may use the geographic location coordinates fed back by the surveying and mapping unmanned aerial vehicle as a reference photographing location point.
  • Step 520 Perform flight shooting in the surveying area according to the surveying and mapping parameters to obtain a set of surveying and photographing photos corresponding to the plurality of surveying and sampling points.
  • the surveying and mapping unmanned aerial vehicle may perform flight shooting in the surveying area according to the surveying parameters including a plurality of surveying sampling points sent by the control terminal, so as to obtain a set of surveying and photographing photos corresponding to the multiple surveying sampling points.
  • the surveying and mapping parameters may further include: the flying height. Performing flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to the multiple surveying and mapping sampling points
  • may include: performing flight shooting in the surveying and mapping area at the flying height according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to the multiple surveying and mapping sampling points.
  • the surveying and mapping parameters in the embodiments of the present disclosure may further include the flying height, which is set to instruct the surveying drone to take a flight shot in the surveying area at the flying height, thereby obtaining a set of surveying and photographing photos corresponding to multiple surveying and sampling points.
  • performing flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to the multiple surveying and mapping sampling points may include: determining, according to the geographic location information of each surveying and mapping sampling point, when the drone has flown to each surveying and mapping sampling point, and taking the surveying and mapping photo corresponding to that sampling point, so as to form the set of surveying and mapping photos.
  • the surveying and mapping drone can fly to each surveying and mapping sampling point according to the geographic location information of each sampling point, and every time it reaches a sampling point, the photographing device can be called to take a picture;
  • the surveying and mapping photos corresponding to each surveying and mapping sampling point then constitute the set of surveying and mapping photos.
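The per-sampling-point loop above can be sketched as follows. `StubDrone`, `fly_to`, and `take_photo` are hypothetical stand-ins for the drone's actual flight and camera interfaces, which the disclosure does not specify:

```python
# Sketch of the drone-side shooting loop (hypothetical API, stubbed out).
class StubDrone:
    def fly_to(self, point):
        self.position = point  # pretend we have flown to the sampling point
    def take_photo(self):
        return {"position": self.position}  # placeholder for real image data

def collect_survey_photos(drone, sampling_points):
    photos = []
    for point in sampling_points:          # geographic location of each sampling point
        drone.fly_to(point)                # fly to the surveying and mapping sampling point
        photos.append(drone.take_photo())  # photo taken on arrival
    return photos                          # the set of surveying and mapping photos

photos = collect_survey_photos(StubDrone(), [(0, 0), (20, 0), (0, 20)])
print(len(photos))  # 3
```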
  • Step 530 Combine and / or stitch multiple photos in the set of surveying photos to obtain a surveying map corresponding to the surveying area.
  • the surveying and mapping drone can combine and / or stitch multiple photos in the surveying and photographing photo collection, so as to obtain a complete surveying and mapping map corresponding to the surveying and mapping area.
  • combining and/or stitching multiple photos in the set of surveying and mapping photos to obtain a surveying and mapping map corresponding to the surveying and mapping area may include: obtaining the center surveying photo taken at at least one center shooting point, and the surrounding surveying photos taken at the multiple surrounding shooting points associated with each center shooting point; stitching each center surveying photo and the corresponding surrounding surveying photos into a combined shot photo according to the degree of photo overlap between each surrounding surveying photo and the corresponding center surveying photo; and obtaining the surveying and mapping map corresponding to the surveying and mapping area according to the combined shot photo corresponding to each center shooting point.
  • the surveying and mapping drone can obtain, from the set of surveying and mapping photos, the center surveying photo taken at at least one center shooting point, and the surrounding surveying photos taken at the multiple surrounding shooting points associated with each center shooting point, and then stitch each center surveying photo and the corresponding surrounding surveying photos into a combined shot photo according to the degree of photo overlap between each surrounding surveying photo and the corresponding center surveying photo. It can be seen that, in the embodiment of the present disclosure, when stitching the photos acquired by the surveying and mapping drone, stitching is not performed according to the degree of overlap between every two consecutive photos, so the processing time of the image data can be greatly reduced, thereby improving surveying and mapping efficiency.
  • if a single combined shot photo is obtained by stitching the center surveying photo and the corresponding surrounding surveying photos, that combined shot photo is the surveying and mapping map corresponding to the surveying and mapping area;
  • if multiple combined shot photos are obtained, the multiple combined shot photos are stitched according to a certain degree of overlap, and the resulting photo is the surveying and mapping map corresponding to the surveying and mapping area.
  • the survey map of the survey area may include at least one of the following: a digital surface model of the survey area, a three-dimensional map of the survey area, and a planar map of the survey area .
  • the survey map corresponding to the survey area obtained by combining and / or stitching photos according to multiple photos in the survey photo set may be a digital surface model corresponding to the survey area, a three-dimensional survey area A flat map of the map or survey area.
  • the method may further include: sending the surveying and mapping map corresponding to the surveying and mapping area to the control terminal and/or ground terminal.
  • the surveying and mapping drone may send the surveying map corresponding to the surveying area to the control terminal and / or the ground terminal.
  • the control terminal may use the surveying map as a work area and determine at least one work plot to generate a work route corresponding to the work plot and send it to the work drone.
  • the ground terminal can apply other aspects to the surveying and mapping map according to actual needs, for example, perform regional analysis on the surveying and mapping area according to the geographic information data of the surveying and mapping map and the climatic conditions corresponding to the surveying and mapping area.
  • by receiving the multiple surveying and mapping sampling points in the surveying and mapping area sent by the control terminal, performing flight shooting in the surveying and mapping area according to the sampling points to obtain a set of surveying and mapping photos corresponding to the multiple sampling points, and combining and/or stitching multiple photos in the photo set, a surveying and mapping map corresponding to the surveying and mapping area is obtained.
  • a new method for acquiring a surveying and mapping map is proposed.
  • the overall planning method of multiple surveying and mapping sampling points is used to replace the existing parallel line movement planning method, which solves the problems of high cost and low surveying efficiency in the existing UAV aerial survey method and achieves the technical effect of reducing surveying and mapping cost and improving surveying and mapping efficiency.
  • FIG. 6 is a schematic diagram of a device for controlling a terminal-side surveying and mapping provided by Embodiment 6 of the present disclosure. As shown in FIG. 6, the device includes: a mapping parameter determination module 610 and a mapping parameter transmission module 620, in which:
  • the mapping parameter determination module 610 is configured to determine the mapping parameters that match the mapping area, wherein the mapping parameters include: a plurality of mapping sampling points that the mapping drone surveys and maps in the mapping area;
  • the mapping parameter sending module 620 is configured to send the mapping parameter to the mapping drone.
  • a new method for determining surveying and mapping sampling points is proposed, in which the control terminal determines the multiple surveying and mapping sampling points that the surveying and mapping drone surveys in the surveying and mapping area and sends the surveying and mapping parameters to the surveying and mapping drone.
  • the surveying and mapping parameter determination module 610 includes: a photographing position point acquisition unit, configured to: acquire a reference photographing position point corresponding to the surveying and mapping area, and establish a mapping relationship between one shooting point in the combined shooting point set and the reference photographing position point; an auxiliary photographing position point determination unit, configured to: determine multiple auxiliary photographing position points corresponding to the reference photographing position point according to the preset relative position relationship between each shooting point in the combined shooting point set and the mapping relationship; and a first surveying and mapping sampling point determination unit, configured to: use the reference photographing position point and the multiple auxiliary photographing position points as the multiple surveying and mapping sampling points that the surveying and mapping drone surveys in the surveying and mapping area.
  • the surveying and mapping parameter determination module 610 includes: a surveying and mapping combined shooting area determination unit, configured to: determine one or more surveying and mapping combined shooting areas in the surveying and mapping area according to the combined shooting area corresponding to the combined shooting point set and the surveying and mapping area information; a photographing position point determination unit, configured to: determine multiple photographing position points in the surveying and mapping combined shooting area according to the preset relative position relationship between each shooting point in the combined shooting point set; and a second surveying and mapping sampling point determination unit, configured to: use the multiple photographing position points as the multiple surveying and mapping sampling points that the surveying and mapping drone surveys in the surveying and mapping area.
  • multiple pictures taken according to multiple shooting points in the combined shooting point set have overlapping areas between them, and / or
  • the surveying and mapping combined shooting area is the shooting area obtained after the multiple photos taken according to the multiple shooting points in the combined shooting point set are combined.
  • the shooting points in the combined shooting point set include: a center shooting point and four surrounding shooting points, the surrounding shooting points being the four vertices of a rectangle centered on the center shooting point, wherein the shape of the composite photo obtained by shooting according to each shooting point in the combined shooting point set is rectangular.
  • the photographing position point acquisition unit is configured to: detect a user's touch operation in the human-machine interaction interface, and determine a screen position point according to the touch operation; and acquire, from the map data of the surveying and mapping area currently displayed in the human-machine interaction interface, a geographic location coordinate matching the screen position point as the reference photographing position point.
  • the photographing position point acquiring unit is set to: if it is detected that the user's touch operation is a single-point touch operation, determine the user's touch point as the screen position point;
  • if it is detected that the user's touch operation is a frame touch operation, a point within the frame generated by the user's touch is selected as the screen position point.
  • the photographing position point acquiring unit is configured to: acquire the center point of the surveying and mapping area as the reference photographing position point.
  • the photographing position point acquisition unit is configured to: send location query information to the surveying and mapping drone, and use the geographic location coordinates fed back by the surveying and mapping drone as the reference photographing position point, wherein the surveying and mapping drone is preset at a position matching the surveying and mapping area.
  • the device further includes: a flight control instruction sending module, configured to: receive at least one flight control instruction for the surveying and mapping drone input by a user, and send the flight control instruction to the surveying and mapping drone; and a hovering instruction sending module, configured to: send a hovering instruction to the surveying and mapping drone when the position confirmation response input by the user is received, to control the surveying and mapping drone to hover at the current position; wherein, the flight control instruction is set to: control the surveying and mapping drone to move in the air in a set direction and/or a set distance.
  • the photographing location point acquiring unit is set to acquire the geographic location coordinates input by the user as the reference photographing location point.
  • the photographing position acquisition unit is configured to: establish a mapping relationship between a shooting point selected by the user in the combined shooting point set and the reference photographing position point.
  • the photographing position acquisition unit is configured to: establish a mapping relationship between the central shooting point in the combined shooting point set and the reference photographing position point.
  • the photographing position point acquisition unit is configured to: calculate the distance between the reference photographing position point and each positioning key point of the surveying and mapping area, where the positioning key points include: the corner points of the surveying and mapping area and the center point of the surveying and mapping area; obtain the positioning key point closest to the reference photographing position point as the target reference point; and select, according to the position information of the target reference point in the surveying and mapping area, a shooting point matching the position information from the combined shooting point set to establish a mapping relationship with the reference photographing position point.
  • the surveying and mapping combined shooting area determination unit is configured to: select a positioning point in the surveying and mapping area; determine one surveying and mapping combined shooting area in the surveying and mapping area according to the positioning point and the combined shooting area; and, if the determined surveying and mapping combined shooting areas cannot completely cover the surveying and mapping area, select a new positioning point in the surveying and mapping area and return to the operation of determining one surveying and mapping combined shooting area in the surveying and mapping area according to the positioning point and the combined shooting area, until it is determined that all the surveying and mapping combined shooting areas together completely cover the surveying and mapping area.
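One way to realize this select-and-check loop can be sketched for a rectangular surveying and mapping area tiled row by row; this is an illustrative positioning strategy and function name, not one mandated by the disclosure:

```python
def cover_area(area_w, area_h, cell_w, cell_h):
    """Return midpoints of combined shooting areas (cell_w x cell_h) chosen
    one by one until they completely cover a rectangular mapping area of
    size area_w x area_h, with the origin at its lower-left corner."""
    midpoints = []
    y = cell_h / 2
    while y - cell_h / 2 < area_h:      # area not yet fully covered vertically:
        x = cell_w / 2                  # pick new positioning points along a row
        while x - cell_w / 2 < area_w:  # and repeat until the row is covered
            midpoints.append((x, y))
            x += cell_w
        y += cell_h
    return midpoints

# A 100 m x 50 m area covered by 40 m x 40 m combined shooting areas.
print(cover_area(100, 50, 40, 40))  # six midpoints, two rows of three
```

Each returned midpoint plays the role of a positioning point whose combined shooting area is then populated with photographing position points as described above.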
  • the photographing position point determination unit is configured to: map the central shooting point in the combined shooting point set to the area midpoint of the surveying and mapping combined shooting area, and use the area midpoint as a photographing position point; and, according to the preset relative position relationship between each surrounding shooting point in the combined shooting point set and the central shooting point, map each surrounding shooting point into the surveying and mapping combined shooting area, and use the formed multiple mapping points as the photographing position points.
  • the device further includes: a screen selection area acquisition module, configured to: detect a user's touch operation in the human-computer interaction interface, and acquire a screen selection area matching the touch operation; and a surveying and mapping area information acquisition module, configured to: acquire, from the map data currently displayed in the human-computer interaction interface, a geographic location area matching the screen selection area as the surveying and mapping area information.
  • the screen selection area acquisition module is configured to: if it is detected that the user's touch operation is a single-point touch operation, determine the closed area surrounded by the connection line of at least three touch points of the user as the screen selection area; and/or
  • the frame generated by the user's touch is used as the screen selection area.
  • the device further includes: a shooting parameter acquisition module, configured to: acquire the shooting parameters of the photographing device carried by the surveying and mapping drone, where the shooting parameters include a single photo shooting area of the surveying and mapping drone at a set flight altitude, and each shooting point corresponds to a single photo shooting area; and a relative position relationship determination module, configured to: determine the preset relative position relationship between each shooting point in the combined shooting point set according to the preset photo overlap index and the single photo shooting area; the surveying and mapping parameters further include: the flying altitude, which is set to: instruct the surveying and mapping drone to perform flight shooting in the surveying and mapping area at the flying altitude.
  • a shooting parameter acquisition module configured to: acquire shooting parameters of the camera device carried by the surveying and mapping drone, where the shooting parameters include the single-photo shooting area of the surveying and mapping drone at a set flying height, and each shooting point corresponds to one single-photo shooting area
  • the relative position relationship determination module is configured to determine the preset relative position relationship between the shooting points in the combined shooting point set according to a preset photo overlap index and the single-photo shooting area
  • the relative position relationship determination module is configured to: determine the size of a single photo according to the frame size and the pixel width of the photographing device; construct a two-dimensional coordinate system and select a target point in it as the center shooting point; generate a center photo in the two-dimensional coordinate system according to the center shooting point and the size of the single photo; generate, at the upper left corner, lower left corner, upper right corner, and lower right corner of the center photo, four surrounding photos that each meet the photo overlap index with the center photo; determine, according to the mapping relationship between the size of the single photo and the single-photo shooting area, the coordinate values of the surrounding shooting points corresponding to each surrounding photo in the two-dimensional coordinate system; and determine the preset relative position relationship between the shooting points in the combined shooting point set according to the coordinate values of the central shooting point and of each surrounding shooting point in the two-dimensional coordinate system.
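As a rough sketch of that geometry (the symmetric four-corner layout and the per-axis handling of the overlap index are assumptions made here for illustration): with the centre photo at the origin of the two-dimensional coordinate system, each surrounding shooting point is offset by (1 − overlap) times the single-photo ground footprint in each axis.

```python
# Assumed geometry: centre shooting point at the origin; the four surrounding
# photos (upper-left, upper-right, lower-left, lower-right) each keep the
# given overlap fraction with the centre photo in both axes.

def combined_shooting_points(photo_w, photo_h, overlap):
    """Return (dx, dy) offsets for the centre point and 4 surrounding points."""
    dx = (1.0 - overlap) * photo_w   # horizontal shift preserving the overlap
    dy = (1.0 - overlap) * photo_h   # vertical shift preserving the overlap
    return [(0.0, 0.0),               # centre shooting point
            (-dx,  dy), ( dx,  dy),   # upper-left, upper-right
            (-dx, -dy), ( dx, -dy)]   # lower-left, lower-right

# 50 % overlap on a 40 m x 30 m single-photo footprint -> 20 m / 15 m offsets
points = combined_shooting_points(photo_w=40.0, photo_h=30.0, overlap=0.5)
```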
  • the apparatus further includes: a flying height calculation module, configured to calculate the set flying height according to the pixel width of the camera device, the lens focal length of the camera device, and the ground pixel resolution.
  • a flying height calculation module configured to calculate the set flying height according to the pixel width of the camera device, the lens focal length of the camera device, and the ground pixel resolution.
  • the shooting parameter acquisition module is configured to: calculate the single-photo shooting area of the surveying and mapping drone at the set flying height according to the pixel width of the camera device, the frame size of the camera device, and the ground pixel resolution.
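The two calculations above match the standard photogrammetric ground-sample-distance relation, which the sketch below assumes (the function names are hypothetical): ground pixel resolution = pixel width x height / focal length, so the set flying height is the resolution times the focal length divided by the pixel width, and the single-photo ground footprint is the frame size in pixels times the resolution.

```python
# Assumed relation: GSD = pixel_width * height / focal_length.

def flying_height(pixel_width_m, focal_length_m, gsd_m):
    """Set flying height that yields the requested ground pixel resolution."""
    return gsd_m * focal_length_m / pixel_width_m

def single_photo_area(gsd_m, frame_px):
    """Ground footprint (width, height) in metres of one photo."""
    w_px, h_px = frame_px
    return (w_px * gsd_m, h_px * gsd_m)

# Illustrative sensor values: 2.4 um pixels, 8.8 mm lens, 5 cm/px target GSD.
h = flying_height(pixel_width_m=2.4e-6, focal_length_m=8.8e-3, gsd_m=0.05)
footprint = single_photo_area(0.05, (5472, 3648))
```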
  • the control terminal side surveying and mapping device described above can execute the control terminal side surveying and mapping method provided by any embodiment of the present disclosure, and has the function modules corresponding to the executed method and its beneficial effects.
  • control terminal side surveying and mapping method provided by any embodiment of the present disclosure, and has corresponding function modules and beneficial effects of the execution method.
  • mapping method on the control terminal side provided by any embodiment of the present disclosure.
  • the device includes: a surveying and mapping parameter receiving module 710, a surveying and photographing photo collection shooting module 720, and a surveying and mapping map generating module 730, where:
  • the mapping parameter receiving module 710 is configured to: receive the surveying and mapping parameters sent by the control terminal, where the surveying and mapping parameters are determined by the control terminal according to the surveying and mapping area and include: a plurality of surveying and mapping sampling points surveyed by the surveying and mapping drone in the surveying and mapping area;
  • the surveying and photographing photo collection shooting module 720 is configured to: perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a surveying and photographing photo collection corresponding to the plurality of surveying and mapping sampling points;
  • the mapping map generation module 730 is configured to: combine and / or stitch the multiple photos in the mapping photo set to obtain a mapping map corresponding to the mapping area.
  • by receiving the plurality of surveying and mapping sampling points in the surveying and mapping area sent by the control terminal, performing flight shooting in the surveying and mapping area according to those sampling points to obtain a set of surveying photos corresponding to them, and combining and/or stitching the multiple photos in the photo set, a surveying and mapping map corresponding to the surveying and mapping area is obtained.
  • a new method for acquiring a surveying and mapping map is thus proposed: the overall planning of multiple surveying and mapping sampling points replaces the existing parallel-line movement planning method, which solves the problems of high cost and low surveying efficiency in existing UAV aerial survey methods and achieves the technical effect of reducing surveying and mapping cost and improving surveying and mapping efficiency.
  • the surveying and photographing photo collection shooting module 720 is configured to: when it is determined, according to the geographic location information of each surveying and mapping sampling point, that the drone has flown to a surveying and mapping sampling point, take the surveying and mapping photo corresponding to that sampling point; the photos so taken constitute the surveying and photographing photo collection.
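A minimal sketch of this shooting behaviour, with hypothetical `fly_to` and `take_photo` callbacks standing in for the UAV's navigation and camera interfaces:

```python
# One photo is taken per surveying sampling point once the UAV reaches it;
# the returned list plays the role of the surveying photo collection.

def fly_and_shoot(sampling_points, fly_to, take_photo):
    photos = []
    for point in sampling_points:
        fly_to(point)                 # navigate using the point's geo location
        photos.append(take_photo())   # one mapping photo per sampling point
    return photos

# Illustration with stub callbacks that just record what happened.
log = []
photos = fly_and_shoot([(0, 0), (1, 1)],
                       fly_to=lambda p: log.append(p),
                       take_photo=lambda: "photo")
```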
  • the device further includes: an instruction movement module, configured to: receive at least one flight control instruction sent by the control terminal, and move in the air in a set direction and/or by a set distance according to the flight control instruction; an instruction hovering module, configured to: hover at the current location according to the hovering instruction sent by the control terminal; and a geographic position coordinate feedback module, configured to: feed back the geographic location coordinates of the current location to the control terminal according to the position query information sent by the control terminal, where the geographic location coordinates are used by the control terminal to determine a reference photographing location point.
  • an instruction movement module configured to: receive at least one flight control instruction sent by the control terminal, and move in the air in a set direction and/or by a set distance according to the flight control instruction
  • the instruction hovering module is configured to: hover at the current location according to the hovering instruction sent by the control terminal
  • the geographic position coordinate feedback module is configured to: feed back the geographic location coordinates of the current location to the control terminal according to the position query information sent by the control terminal, where the geographic location coordinates are used by the control terminal to determine a reference photographing location point.
  • the surveying and mapping parameters further include: a flying height; the surveying and photographing photo collection shooting module 720 is configured to: perform flight shooting in the surveying and mapping area at the flying height according to the surveying and mapping parameters, to obtain a surveying and photographing photo collection corresponding to the plurality of surveying and mapping sampling points.
  • the surveying and mapping map generation module 730 is configured to: acquire, from the surveying and photographing photo collection, a central surveying photo taken at at least one central shooting point and the surrounding surveying photos taken at the multiple surrounding shooting points associated with each central shooting point; stitch each central surveying photo with its corresponding surrounding surveying photos into a combined photo according to the degree of photo overlap between each surrounding surveying photo and the corresponding central surveying photo; and obtain a surveying and mapping map corresponding to the surveying and mapping area according to the combined photo corresponding to each central shooting point.
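A simplified sketch of the combined-photo layout described above (the pixel-offset placement and the centre-plus-four-corners arrangement are assumptions for illustration; real stitching would also blend the overlapping regions):

```python
# Place one centre photo and its four surrounding photos on a shared canvas
# at the offsets implied by the photo overlap degree; the canvas covers the
# union of all five placements.

def combined_canvas(photo_w, photo_h, overlap):
    """Pixel size of the combined photo and per-photo top-left placements."""
    dx = int((1 - overlap) * photo_w)
    dy = int((1 - overlap) * photo_h)
    placements = {"centre": (dx, dy),
                  "upper_left": (0, 0), "upper_right": (2 * dx, 0),
                  "lower_left": (0, 2 * dy), "lower_right": (2 * dx, 2 * dy)}
    return (photo_w + 2 * dx, photo_h + 2 * dy), placements

# 4000 x 3000 px photos at 50 % overlap -> an 8000 x 6000 px combined canvas
size, layout = combined_canvas(4000, 3000, 0.5)
```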
  • the surveying map of the surveying area includes at least one of the following: a digital surface model of the surveying area, a three-dimensional map of the surveying area, and a flat map of the surveying area.
  • the device further includes a surveying and mapping map sending module, configured to: send a surveying and mapping map corresponding to the surveying and mapping area to the control terminal and / or ground terminal.
  • a surveying and mapping map sending module configured to: send a surveying and mapping map corresponding to the surveying and mapping area to the control terminal and / or ground terminal.
  • the above surveying and mapping UAV side surveying and mapping device can execute the surveying and mapping UAV side surveying and mapping method provided by any embodiment of the present disclosure, and has the function modules corresponding to the executed method and its beneficial effects.
  • the method for surveying and mapping the UAV side provided by any embodiment of the present disclosure.
  • FIG. 8 is a schematic structural diagram of a control terminal according to Embodiment 8 of the present disclosure.
  • FIG. 8 shows a block diagram of a control terminal 612 suitable for implementing embodiments of the present disclosure.
  • the control terminal 612 shown in FIG. 8 is only an example, and should not bring any limitation to the functions and use scope of the embodiments of the present disclosure.
  • control terminal 612 is expressed in the form of a general-purpose computing device.
  • the components of the control terminal 612 may include, but are not limited to, one or more processors 616, a storage device 628, and a bus 618 connecting different system components (including the storage device 628 and the processor 616).
  • the bus 618 represents one or more of several types of bus structures, including a memory bus or a memory controller, a peripheral bus, a graphics acceleration port, a processor, or a local bus using any of a variety of bus structures.
  • these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
  • the control terminal 612 typically includes a variety of computer system readable media. These media may be any available media that can be accessed by the control terminal 612, including volatile and non-volatile media, removable and non-removable media.
  • the storage device 628 may include a computer system readable medium in the form of volatile memory, such as random access memory (Random Access Memory, RAM) 630 and / or cache memory 632.
  • the control terminal 612 may further include other removable / non-removable, volatile / nonvolatile computer system storage media.
  • the storage system 634 may be configured to read and write non-removable, non-volatile magnetic media (not shown in FIG. 8 and is generally referred to as a "hard disk drive").
  • each drive may be connected to the bus 618 through one or more data media interfaces.
  • the storage device 628 may include at least one program product having a set of (eg, at least one) program modules configured to perform the functions of each embodiment of the present disclosure.
  • a program 636 having a set of (at least one) program modules 626 may be stored in, for example, a storage device 628.
  • Such program modules 626 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data. Each of these examples, or some combination thereof, may include an implementation of a network environment.
  • the program module 626 generally performs the functions and / or methods in the embodiments described in the present disclosure.
  • the control terminal 612 may also communicate with one or more external devices 614 (e.g., keyboard, pointing device, camera, display 624, etc.), with one or more devices that enable users to interact with the control terminal 612, and/or with any device (such as a network card or modem) that enables the control terminal 612 to communicate with one or more other computing devices. Such communication can be performed through an input/output (I/O) interface 622.
  • the control terminal 612 may also communicate with one or more networks (such as a local area network (Local Area Network, LAN), a wide area network (Wide Area Network, WAN), and/or a public network such as the Internet) through the network adapter 620.
  • networks such as a local area network (Local Area Network, LAN), a wide area network (Wide Area Network, WAN), and/or a public network such as the Internet, accessed through the network adapter 620.
  • the network adapter 620 communicates with the other modules of the control terminal 612 through the bus 618. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the control terminal 612, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Arrays of Independent Disks (RAID) systems, tape drives, data backup storage systems, etc.
  • the processor 616 runs a program stored in the storage device 628 to execute various functional applications and data processing, for example, to implement the control terminal side surveying and mapping method provided by the above-described embodiments of the present disclosure.
  • when the processor executes the program, it implements: determining the surveying and mapping parameters matching the surveying and mapping area, where the surveying and mapping parameters include: a plurality of surveying and mapping sampling points surveyed by the surveying and mapping drone in the surveying and mapping area; and sending the surveying and mapping parameters to the surveying and mapping UAV.
  • the ninth embodiment is a surveying and mapping drone for performing the surveying and mapping UAV side surveying and mapping method provided by any embodiment of the present disclosure
  • the surveying and mapping drone includes: one or more processors; and a storage device, configured to store one or more programs; when the one or more programs are executed by the one or more processors, the one or more processors implement the surveying and mapping UAV side surveying and mapping method provided by any embodiment of the present disclosure: receiving the surveying and mapping parameters sent by the control terminal, where the surveying and mapping parameters are determined by the control terminal according to the surveying and mapping area and include: a plurality of surveying and mapping sampling points for the surveying and mapping drone to survey in the surveying and mapping area; performing flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a surveying photo collection corresponding to the plurality of sampling points; and combining and/or stitching the multiple photos in the collection to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • Embodiment 10 of the present disclosure also provides a computer storage medium storing a computer program which, when executed by a computer processor, performs the control terminal side surveying and mapping method described in any of the above embodiments of the present disclosure: determining the surveying and mapping parameters matching the surveying and mapping area, where the surveying and mapping parameters include: a plurality of surveying and mapping sampling points that the surveying and mapping drone surveys in the surveying and mapping area; and sending the surveying and mapping parameters to the surveying and mapping drone.
  • when the computer program is executed by a computer processor, it performs the surveying and mapping method for surveying and mapping unmanned aerial vehicles according to any of the above embodiments of the present disclosure: receiving the surveying and mapping parameters sent by the control terminal, where the surveying and mapping parameters are determined by the control terminal according to the surveying and mapping area and include: a plurality of surveying and mapping sampling points for the surveying and mapping drone to survey in the surveying and mapping area; performing flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a surveying photo collection corresponding to the plurality of sampling points; and combining and/or stitching the multiple photos in the collection to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • the computer storage media of the embodiments of the present disclosure may adopt any combination of one or more computer-readable media.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or device, or any combination of the above.
  • Examples of computer-readable storage media include: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (Read Only Memory, ROM), erasable programmable read-only memory (Erasable Programmable Read Only Memory, EPROM, or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • the computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • the computer-readable signal medium may include a data signal that is propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; the computer-readable medium may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
  • the program code contained on the computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire, optical cable, radio frequency (Radio Frequency, RF), etc., or any suitable combination of the foregoing.
  • any appropriate medium including but not limited to wireless, wire, optical cable, radio frequency (Radio Frequency, RF), etc., or any suitable combination of the foregoing.
  • the computer program code for performing the operations of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may be executed entirely on the user's computer, partly on the user's computer, as an independent software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through an Internet service provider Internet connection).
  • LAN local area network
  • WAN wide area network
  • an Internet connection through an Internet service provider (for example, AT&T, MCI, Sprint, EarthLink, MSN, GTE, etc.)
  • the embodiment of the present disclosure proposes a new surveying and mapping system and a surveying method.
  • the overall planning method of multiple surveying and mapping sampling points based on the new surveying and mapping system replaces the existing parallel-line movement planning method, solving the problems of high cost and low surveying efficiency in existing UAV aerial survey methods and achieving the technical effect of reducing surveying and mapping cost and improving surveying and mapping efficiency.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Mathematical Physics (AREA)
  • Computer Graphics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Processing Or Creating Images (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a surveying and mapping system. The system comprises a control terminal and a surveying and mapping unmanned aerial vehicle (UAV). The control terminal is configured to determine surveying and mapping parameters corresponding to a surveying and mapping area and send them to the surveying and mapping UAV, the parameters comprising a plurality of surveying and mapping sampling points surveyed and mapped by the surveying and mapping UAV in the surveying and mapping area. The surveying and mapping UAV is configured to receive the surveying and mapping parameters and perform aerial photography in the surveying and mapping area according to them, so as to obtain a collection of surveying and mapping photos corresponding to the plurality of sampling points, and to subject a plurality of photos from the collection to at least one of photo combination or photo stitching, thereby obtaining a map corresponding to the surveying and mapping area.
PCT/CN2018/116660 2018-11-21 2018-11-21 Système, procédé, appareil, dispositif et support d'arpentage et de cartographie WO2020103023A1 (fr)

Priority Applications (7)

Application Number Priority Date Filing Date Title
AU2018449839A AU2018449839B2 (en) 2018-11-21 2018-11-21 Surveying and mapping method and device
EP18941071.5A EP3885702A4 (fr) 2018-11-21 2018-11-21 Système, procédé, appareil, dispositif et support d'arpentage et de cartographie
CA3120727A CA3120727A1 (fr) 2018-11-21 2018-11-21 Systeme, procede, appareil, dispositif et support d'arpentage et de cartographie
KR1020217016658A KR20210105345A (ko) 2018-11-21 2018-11-21 측량 및 매핑 방법, 장치 및 기기
PCT/CN2018/116660 WO2020103023A1 (fr) 2018-11-21 2018-11-21 Système, procédé, appareil, dispositif et support d'arpentage et de cartographie
JP2021527154A JP7182710B2 (ja) 2018-11-21 2018-11-21 測量方法、装置及びデバイス
CN201880091778.4A CN112469967B (zh) 2018-11-21 2018-11-21 测绘系统、测绘方法、装置、设备及存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/116660 WO2020103023A1 (fr) 2018-11-21 2018-11-21 Système, procédé, appareil, dispositif et support d'arpentage et de cartographie

Publications (1)

Publication Number Publication Date
WO2020103023A1 true WO2020103023A1 (fr) 2020-05-28

Family

ID=70773259

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/116660 WO2020103023A1 (fr) 2018-11-21 2018-11-21 Système, procédé, appareil, dispositif et support d'arpentage et de cartographie

Country Status (7)

Country Link
EP (1) EP3885702A4 (fr)
JP (1) JP7182710B2 (fr)
KR (1) KR20210105345A (fr)
CN (1) CN112469967B (fr)
AU (1) AU2018449839B2 (fr)
CA (1) CA3120727A1 (fr)
WO (1) WO2020103023A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113865557A (zh) * 2021-09-08 2021-12-31 诚邦测绘信息科技(浙江)有限公司 测绘用山体环境检测方法、系统、存储介质及智能终端
CN114838710A (zh) * 2022-03-29 2022-08-02 中国一冶集团有限公司 基于无人机拍照的工程用快速的测绘方法及测绘系统
CN117782030A (zh) * 2023-11-24 2024-03-29 北京天数智芯半导体科技有限公司 距离测量方法及装置、存储介质及电子设备

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114877872B (zh) * 2022-07-01 2022-10-14 北京今日蓝天科技有限公司 无人机及其操作系统、生成测绘图的方法、介质、设备
WO2024096691A1 (fr) * 2022-11-04 2024-05-10 경북대학교 산학협력단 Procédé et dispositif d'estimation de coordonnées gps de multiples objets cibles et de suivi d'objets cibles sur la base d'informations d'image de caméra concernant un véhicule aérien sans pilote
CN116088584B (zh) * 2023-04-07 2023-07-18 山东省地质矿产勘查开发局第五地质大队(山东省第五地质矿产勘查院) 一种测绘协同作业方法、系统及一种电子设备

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101082488A (zh) * 2007-06-30 2007-12-05 徐春云 异地遥测之图像拼接方法
KR101236992B1 (ko) * 2012-11-05 2013-02-26 명화지리정보(주) 기준점 데이터가 적용된 수치 영상이미지의 실시간 업데이팅 영상처리시스템
EP2639547A2 (fr) * 2012-03-12 2013-09-18 Aisin Aw Co., Ltd. Système de fourniture de données d'image, procédé de fourniture de données d'image et produit de programme informatique
CN104567815A (zh) * 2014-12-26 2015-04-29 北京航天控制仪器研究所 一种基于图像匹配的无人机载光电稳定平台自动侦察系统
CN105225241A (zh) * 2015-09-25 2016-01-06 广州极飞电子科技有限公司 无人机深度图像的获取方法及无人机
CN105352481A (zh) * 2015-10-23 2016-02-24 武汉苍穹电子仪器有限公司 高精度无人机影像无控制点测绘成图方法及系统
CN108474657A (zh) * 2017-03-31 2018-08-31 深圳市大疆创新科技有限公司 一种环境信息采集方法、地面站及飞行器

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3364257B2 (ja) * 1993-01-27 2003-01-08 朝日航洋株式会社 空中写真撮影方法
JPH1054719A (ja) * 1996-08-09 1998-02-24 A Tec:Kk 空中測量写真の撮影方法およびそれに使用する撮影ポ イントマーク
JP4414524B2 (ja) * 1999-11-17 2010-02-10 アジア航測株式会社 航空写真撮影計画シミュレーション方法
CN102495522A (zh) * 2011-12-01 2012-06-13 天津曙光敬业科技有限公司 基于无人直升机航拍的360°空中全景互动漫游系统的制作方法
JP6055274B2 (ja) * 2012-10-31 2016-12-27 株式会社トプコン 航空写真測定方法及び航空写真測定システム
CN103338333B (zh) * 2013-07-17 2016-04-13 中测新图(北京)遥感技术有限责任公司 航摄仪方位元素优化配置方法
WO2017073310A1 (fr) * 2015-10-27 2017-05-04 三菱電機株式会社 Système de capture d'image pour une mesure de forme de structure, procédé de capture d'image de structure utilisée pour une mesure de forme de structure, dispositif de commande embarqué, dispositif de télécommande, programme et support d'enregistrement
US20170221241A1 (en) * 2016-01-28 2017-08-03 8681384 Canada Inc. System, method and apparatus for generating building maps
JP2018077626A (ja) * 2016-11-08 2018-05-17 Necソリューションイノベータ株式会社 飛行制御装置、飛行制御方法、及びプログラム
CN106444841B (zh) * 2016-11-15 2019-04-26 航天图景(北京)科技有限公司 一种基于多旋翼无人机倾斜摄影系统的航线规划方法
CN108846004A (zh) * 2018-04-20 2018-11-20 曜宇航空科技(上海)有限公司 一种基于无人机的地图中选定目标更新方法及系统

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101082488A (zh) * 2007-06-30 2007-12-05 徐春云 异地遥测之图像拼接方法
EP2639547A2 (fr) * 2012-03-12 2013-09-18 Aisin Aw Co., Ltd. Système de fourniture de données d'image, procédé de fourniture de données d'image et produit de programme informatique
KR101236992B1 (ko) * 2012-11-05 2013-02-26 명화지리정보(주) 기준점 데이터가 적용된 수치 영상이미지의 실시간 업데이팅 영상처리시스템
CN104567815A (zh) * 2014-12-26 2015-04-29 北京航天控制仪器研究所 一种基于图像匹配的无人机载光电稳定平台自动侦察系统
CN105225241A (zh) * 2015-09-25 2016-01-06 广州极飞电子科技有限公司 无人机深度图像的获取方法及无人机
CN105352481A (zh) * 2015-10-23 2016-02-24 武汉苍穹电子仪器有限公司 高精度无人机影像无控制点测绘成图方法及系统
CN108474657A (zh) * 2017-03-31 2018-08-31 深圳市大疆创新科技有限公司 一种环境信息采集方法、地面站及飞行器

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3885702A4 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113865557A (zh) * 2021-09-08 2021-12-31 诚邦测绘信息科技(浙江)有限公司 测绘用山体环境检测方法、系统、存储介质及智能终端
CN113865557B (zh) * 2021-09-08 2024-01-16 诚邦测绘信息科技(浙江)有限公司 测绘用山体环境检测方法、系统、存储介质及智能终端
CN114838710A (zh) * 2022-03-29 2022-08-02 中国一冶集团有限公司 基于无人机拍照的工程用快速的测绘方法及测绘系统
CN114838710B (zh) * 2022-03-29 2023-08-29 中国一冶集团有限公司 基于无人机拍照的工程用快速的测绘方法及测绘系统
CN117782030A (zh) * 2023-11-24 2024-03-29 北京天数智芯半导体科技有限公司 距离测量方法及装置、存储介质及电子设备

Also Published As

Publication number Publication date
JP2022507715A (ja) 2022-01-18
AU2018449839B2 (en) 2023-02-23
EP3885702A4 (fr) 2021-12-01
CN112469967A (zh) 2021-03-09
CN112469967B (zh) 2023-12-26
EP3885702A1 (fr) 2021-09-29
AU2018449839A1 (en) 2021-06-24
JP7182710B2 (ja) 2022-12-02
KR20210105345A (ko) 2021-08-26
CA3120727A1 (fr) 2020-05-28

Similar Documents

Publication Publication Date Title
WO2020103022A1 (fr) Système d'arpentage et de cartographie, procédé et appareil d'arpentage et de cartographie, dispositif et support
US11346665B2 (en) Method and apparatus for planning sample points for surveying and mapping, control terminal, and storage medium
WO2020103023A1 (fr) Système, procédé, appareil, dispositif et support d'arpentage et de cartographie
WO2020103019A1 (fr) Procédé et appareil de planification d'arpentage et de cartographie de points d'échantillonnage, terminal de commande et support d'informations
WO2020103021A1 (fr) Procédé et appareil de planification pour examiner et cartographier des points d'échantillonnage, terminal de commande et support de stockage
WO2020103024A1 (fr) Système de commande de tâche, procédé de commande de tâche, appareil, dispositif et support

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18941071

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021527154

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 3120727

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018449839

Country of ref document: AU

Date of ref document: 20181121

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2018941071

Country of ref document: EP

Effective date: 20210621