WO2020103023A1 - Surveying and mapping system, surveying and mapping method, apparatus, device and medium - Google Patents

Surveying and mapping system, surveying and mapping method, apparatus, device and medium

Info

Publication number
WO2020103023A1
Authority
WO
WIPO (PCT)
Prior art keywords
surveying
mapping
area
shooting
point
Prior art date
Application number
PCT/CN2018/116660
Other languages
English (en)
French (fr)
Inventor
刘鹏
金晓会
Original Assignee
广州极飞科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广州极飞科技有限公司
Priority to EP18941071.5A (published as EP3885702A4)
Priority to CN201880091778.4A (published as CN112469967B)
Priority to AU2018449839A (published as AU2018449839B2)
Priority to JP2021527154A (published as JP7182710B2)
Priority to KR1020217016658A (published as KR20210105345A)
Priority to CA3120727A (published as CA3120727A1)
Priority to PCT/CN2018/116660 (published as WO2020103023A1)
Publication of WO2020103023A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B64C39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0004 Transmission of traffic-related information to or from an aircraft
    • G08G5/0013 Transmission of traffic-related information to or from an aircraft with a ground station
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/003 Flight plan management
    • G08G5/0034 Assembly of a flight plan
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 Navigation or guidance aids for a single aircraft
    • G08G5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073 Surveillance aids
    • G08G5/0086 Surveillance aids for monitoring terrain
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003 Maps
    • G09B29/004 Map manufacture or repair; Tear or ink or water resistant maps; Long-life maps
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/16 Image acquisition using multiple overlapping images; Image stitching

Definitions

  • the embodiments of the present disclosure relate to the field of surveying and mapping technology, and in particular to a surveying and mapping system, a surveying and mapping method, an apparatus, a device, and a medium.
  • UAV aerial surveying and mapping (referred to as aerial surveying) technology can greatly reduce the work cycle and the investment of human and financial resources required by traditional aerial surveying and mapping technology, and has great practical significance in the field of surveying and mapping.
  • UAV aerial surveying and mapping technology observes the current situation of the aerial photography area through an on-board video capture device and remote video transmission technology, and at the same time uses aerial image stitching technology to stitch the captured photos to obtain an overall image of the aerial photography area.
  • when taking photos, the traditional UAV aerial survey method generally traverses the surveying and mapping area along parallel lines, and in order to ensure successful stitching it usually requires a certain degree of overlap between every two consecutive photos.
  • to stitch successfully, a photo is required to have a certain degree of overlap with other photos in both the horizontal and vertical directions.
  • the degree of overlap is generally required to be greater than 50%.
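To make the cost of the overlap requirement concrete, here is a rough back-of-the-envelope sketch (the area and footprint figures are illustrative assumptions, not values from the disclosure): with overlap ratio r in both directions, the useful advance per photo shrinks to footprint × (1 − r), so the photo count grows roughly as 1/(1 − r)².

```python
import math

def photo_count(area_w, area_h, footprint_w, footprint_h, overlap):
    """Rough photo count for a parallel-line survey of a rectangular area.

    With overlap ratio `overlap` in both directions, each photo only
    advances coverage by footprint * (1 - overlap), so the total count
    grows roughly by a factor of 1 / (1 - overlap)**2.
    """
    step_w = footprint_w * (1 - overlap)
    step_h = footprint_h * (1 - overlap)
    return math.ceil(area_w / step_w) * math.ceil(area_h / step_h)

# Illustrative 500 m x 500 m plot with a 100 m x 75 m photo footprint:
photo_count(500, 500, 100, 75, overlap=0.5)  # 10 * 14 = 140 photos
photo_count(500, 500, 100, 75, overlap=0.0)  # 5 * 7 = 35 photos
```

At 50% overlap the same plot needs roughly four times as many photos as it would with none, which is the stitching-time and data-volume burden the disclosure sets out to reduce.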
  • the inventors found that the related technology has the following defect: when traditional UAV aerial survey methods are used to survey and map a large aerial photography area, a large number of high-overlap photos have to be taken during the survey.
  • stitching the above photos taken by the drone takes a long time and is inefficient.
  • if the photos obtained by the drone are uploaded to a server for stitching, the data upload and processing take even longer.
  • when the traditional UAV aerial survey method is applied to small-plot surveying and mapping, the operation is complicated, the processing time is long, and the hardware cost is high.
  • the embodiments of the present disclosure provide a surveying and mapping system, a surveying and mapping method, an apparatus, a device and a medium, so as to reduce the cost of surveying and mapping and improve surveying and mapping efficiency.
  • An embodiment of the present disclosure provides a surveying and mapping system, including: a control terminal and a surveying and mapping drone, wherein:
  • the control terminal is configured to determine the surveying and mapping parameters that match the surveying and mapping area and send them to the surveying and mapping drone, where the surveying and mapping parameters include: a plurality of surveying and mapping sampling points at which the surveying and mapping drone performs surveying and mapping in the surveying and mapping area;
  • the surveying and mapping drone is configured to receive the surveying and mapping parameters, perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to the plurality of surveying and mapping sampling points, and combine and/or stitch a plurality of photos in the set to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • the surveying and mapping drone is further configured to generate map tile data corresponding to the surveying and mapping area according to a surveying and mapping map corresponding to the surveying and mapping area.
  • the system further includes: an operating drone;
  • the control terminal is further configured to use the surveying and mapping area as a work area, acquire map tile data corresponding to the work area from the surveying and mapping drone, generate and display an area map of the work area based on the map tile data, determine at least one work plot within the work area based on at least one area anchor point selected by the user on the area map, and generate a work route corresponding to the work plot and send it to the operating drone;
  • the operating drone is configured to receive the work route and perform flight operations in the at least one work plot according to the work route.
  • An embodiment of the present disclosure also provides a control-terminal-side surveying and mapping method, which is applied to the surveying and mapping system described in the embodiments of the present disclosure and includes:
  • the surveying and mapping parameters include: a plurality of surveying and sampling points for the surveying and mapping drone to survey and map in the surveying and mapping area;
  • determining the mapping parameters matching the mapping area includes:
  • the reference photographing position point and the plurality of auxiliary photographing position points are used as a plurality of surveying and sampling points for the surveying and mapping drone to survey and map in the surveying and mapping area.
  • determining the mapping parameters matching the mapping area includes:
  • according to each shooting point in the combined shooting point set, a plurality of photographing position points are determined in the combined surveying and mapping shooting area;
  • the plurality of photographing position points are used as the plurality of surveying and mapping sampling points at which the surveying and mapping drone performs surveying and mapping in the surveying and mapping area.
  • there are overlapping areas between the multiple photos taken according to the multiple shooting points in the combined shooting point set; and/or
  • the combined surveying and mapping shooting area is the shooting area formed by combining and/or stitching the multiple photos taken according to the multiple shooting points in the combined shooting point set;
  • the combined surveying and mapping shooting areas are combined and/or stitched to form the surveying and mapping map of the surveying and mapping area.
  • the shooting points in the combined shooting point set include: a center shooting point and four surrounding shooting points, where the surrounding shooting points are four vertices of a rectangle centered on the center shooting point;
  • the photo synthesized from the photos taken at each shooting point in the combined shooting point set is rectangular in shape.
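The five-point layout described above (one center shooting point plus the four vertices of a rectangle centered on it) can be sketched as follows; the planar coordinates and the half-width/half-height offsets `dx`, `dy` are illustrative assumptions, since in practice they would be derived from the photo footprint and the overlap index:

```python
def combined_shooting_points(center, dx, dy):
    """Five shooting points of one combined shooting point set:
    the center point plus the four vertices of a rectangle of
    half-width dx and half-height dy centered on it (x east, y north).
    """
    cx, cy = center
    surrounding = [
        (cx - dx, cy + dy),  # upper-left vertex
        (cx + dx, cy + dy),  # upper-right vertex
        (cx - dx, cy - dy),  # lower-left vertex
        (cx + dx, cy - dy),  # lower-right vertex
    ]
    return [center] + surrounding

combined_shooting_points((0.0, 0.0), dx=30.0, dy=20.0)
# [(0.0, 0.0), (-30.0, 20.0), (30.0, 20.0), (-30.0, -20.0), (30.0, -20.0)]
```

Because the four surrounding points sit symmetrically around the center point, the five photos naturally overlap near the center and synthesize into one rectangular combined photo, as the text states.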
  • obtaining a reference photographing location point corresponding to the surveying and mapping area includes:
  • detecting a user's touch operation in the human-computer interaction interface and determining a screen position point according to the touch operation includes at least one of the following:
  • the user's touch point is determined as the screen position point
  • a point within the frame generated by the user's touch is selected as the screen position point.
  • obtaining a reference photographing location point corresponding to the surveying and mapping area includes:
  • the surveying and mapping unmanned aerial vehicle is preset at a position matching the surveying and mapping area.
  • before sending location query information to the surveying and mapping drone, the method further includes:
  • the flight control instruction is configured to control the surveying and mapping drone to move in the air in a set direction and/or by a set distance.
  • obtaining a reference photographing location point corresponding to the surveying and mapping area includes:
  • establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point includes:
  • the positioning key point includes: a corner point of the surveying and mapping area and the center point of the surveying and mapping area;
  • a shooting point matching the position information is selected in the combined shooting point set to establish a mapping relationship with the reference shooting position point.
  • determining one or more combined surveying and mapping shooting areas within the surveying and mapping area includes:
  • if the combined surveying and mapping shooting area cannot completely cover the surveying and mapping area, a new positioning point is selected in the surveying and mapping area, and the operation of determining one combined surveying and mapping shooting area in the surveying and mapping area according to the positioning point and the combined shooting area is repeated until all of the determined combined surveying and mapping shooting areas completely cover the surveying and mapping area.
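One simple way to realize the covering loop described above, under the simplifying assumption of a rectangular surveying and mapping area tiled by axis-aligned combined shooting areas (the disclosure itself does not fix the selection strategy for new positioning points):

```python
import math

def cover_area(area_w, area_h, block_w, block_h):
    """Positioning points (combined-shooting-area centers), chosen row
    by row until the combined shooting areas fully cover a rectangular
    mapping area of size area_w x area_h (origin at the lower-left).
    """
    cols = math.ceil(area_w / block_w)
    rows = math.ceil(area_h / block_h)
    points = []
    for r in range(rows):
        for c in range(cols):
            # Center of the combined shooting area in row r, column c.
            points.append((c * block_w + block_w / 2,
                           r * block_h + block_h / 2))
    return points

len(cover_area(100, 100, 30, 40))  # 4 columns x 3 rows = 12 areas
```

The `ceil` calls mirror the "repeat until covered" loop: as long as part of the area is uncovered, another positioning point (another block center) is added.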
  • determining a plurality of shooting position points in the surveying and mapping combined shooting area includes:
  • each of the surrounding shooting points is mapped into the combined surveying and mapping shooting area, and the plurality of mapping points thus formed serve as the photographing position points.
  • before determining one or more combined surveying and mapping shooting areas in the surveying and mapping area according to the combined shooting area corresponding to the combined shooting point set and the surveying and mapping area information, the method further includes:
  • in the map data currently displayed in the human-computer interaction interface, a geographic location area matching the screen selection area is acquired as the surveying and mapping area information.
  • detecting a user's touch operation in the human-computer interaction interface and acquiring a screen selection area matching the touch operation include:
  • the closed area enclosed by the connection line of at least three touch points of the user is determined as the screen selection area; and / or
  • the frame generated by the user's touch is used as the screen selection area.
  • before sending the surveying and mapping parameters to the surveying and mapping drone, the method further includes:
  • the shooting parameters include a single-photo shooting area of the surveying and mapping drone at a set flight height, and each shooting point corresponds to a single-photo shooting area;
  • the surveying and mapping parameters further include: the flying height, and the flying height is set to instruct the surveying and mapping unmanned aerial vehicle to perform flight shooting in the surveying and mapping area at the flying height.
  • determining the preset relative position relationship between each shooting point in the combined shooting point set includes:
  • four surrounding photos that meet the photo overlap index with the center photo are respectively generated at the upper left corner, the lower left corner, the upper right corner, and the lower right corner of the center photo;
  • the preset relative position relationship among the shooting points in the combined shooting point set is determined according to the coordinate values of the central shooting point and each surrounding shooting point in the two-dimensional coordinate system.
  • before acquiring the shooting parameters of the camera device carried by the surveying and mapping drone, the method further includes:
  • the set flying height is calculated according to the pixel width of the camera device, the lens focal length of the camera device, and the ground pixel resolution.
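The relationship described here is the standard pinhole-camera ground-sample-distance formula; a minimal sketch, in which the 8.8 mm focal length and 2.4 µm pixel width are illustrative sensor values assumed for the example, not figures from the disclosure:

```python
def flight_height(ground_resolution_m, focal_length_mm, pixel_width_um):
    """Flying height (m) needed for a target ground pixel resolution.

    From the pinhole model, GSD = H * pixel_width / focal_length,
    hence H = GSD * focal_length / pixel_width (consistent units).
    """
    focal_length_m = focal_length_mm / 1000.0
    pixel_width_m = pixel_width_um / 1e6
    return ground_resolution_m * focal_length_m / pixel_width_m

# e.g. 5 cm/pixel with an 8.8 mm lens and 2.4 um pixels -> ~183.3 m
flight_height(0.05, 8.8, 2.4)
```

Flying higher than this height coarsens the ground pixel resolution; flying lower refines it but shrinks the single-photo shooting area, which in turn changes how many sampling points are needed.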
  • obtaining the shooting parameters of the camera device carried by the surveying and mapping drone includes:
  • the embodiment of the present disclosure also provides a method for surveying and mapping the UAV side, which is applied to the surveying and mapping system described in the embodiment of the present disclosure and includes:
  • Receiving surveying and mapping parameters sent by the control terminal, where the surveying and mapping parameters are determined by the control terminal according to the surveying and mapping area and include: a plurality of surveying and mapping sampling points at which the surveying and mapping drone performs surveying and mapping in the surveying and mapping area;
  • performing flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to the plurality of surveying and mapping sampling points includes:
  • the surveying photos corresponding to each of the mapping sampling points are taken to form the surveying photo set.
  • before receiving the surveying and mapping parameters sent by the control terminal, the method further includes:
  • the control terminal determines a reference photographing location point.
  • mapping parameters further include: flying height;
  • Performing flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and photographing photos corresponding to the plurality of surveying and sampling points includes:
  • combining and/or stitching multiple photos in the set of surveying and mapping photos to obtain a surveying and mapping map corresponding to the surveying and mapping area includes:
  • each central surveying photo and the corresponding surrounding surveying and mapping photo are stitched into a combined photograph
  • the surveying map of the surveying area includes at least one of the following:
  • a digital surface model of the surveying and mapping area, a three-dimensional map of the surveying and mapping area, and a flat map of the surveying and mapping area.
  • the method further includes:
  • An embodiment of the present disclosure also provides a surveying and mapping device on the control terminal side, which is applied to the surveying and mapping system described in the embodiment of the present disclosure and includes:
  • the mapping parameter determination module is set to: determine the mapping parameters that match the mapping area, wherein the mapping parameters include: a plurality of mapping sampling points mapped by the mapping drone in the mapping area;
  • the mapping parameter sending module is configured to send the mapping parameter to the mapping drone.
  • the embodiment of the present disclosure also provides a surveying and mapping UAV side surveying and mapping device, which is applied to the surveying and mapping system described in the embodiment of the present disclosure and includes:
  • the surveying and mapping parameter receiving module is set to receive the surveying and mapping parameters sent by the control terminal, wherein the surveying and mapping parameters are determined by the control terminal according to the surveying and mapping area, and the surveying and mapping parameters include: Multiple sampling points for surveying and mapping in the surveying and mapping area;
  • the surveying and photographing photo collection shooting module is set to: perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a surveying and photographing photo collection corresponding to the plurality of surveying and sampling points;
  • the surveying and mapping map generation module is set to: combine and/or stitch the multiple photos in the surveying and mapping photo collection to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • An embodiment of the present disclosure also provides a control terminal.
  • the control terminal includes:
  • one or more processors;
  • the storage device is configured to store one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors implement any control-terminal-side surveying and mapping method described in the embodiments of the present disclosure.
  • An embodiment of the present disclosure also provides a computer storage medium on which a computer program is stored, and when the program is executed by a processor, the method for controlling the terminal-side surveying and mapping provided by the embodiment of the present disclosure is implemented.
  • An embodiment of the present disclosure also provides a surveying and mapping drone, which includes:
  • one or more processors;
  • the storage device is configured to store one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors implement any UAV-side surveying and mapping method in the embodiments of the present disclosure.
  • An embodiment of the present disclosure also provides a computer storage medium on which a computer program is stored.
  • when the program is executed by a processor, the UAV-side surveying and mapping method provided by the embodiments of the present disclosure is implemented.
  • FIG. 1 is a schematic diagram of a surveying and mapping system provided by Embodiment 1 of the present disclosure
  • FIG. 2 is a flowchart of a control-terminal-side surveying and mapping method provided by Embodiment 2 of the present disclosure
  • FIG. 3a is a flowchart of a method for controlling terminal side surveying and mapping provided in Embodiment 3 of the present disclosure
  • FIG. 3b is a schematic diagram of the position distribution of each shooting point in a combined shooting point set according to Embodiment 3 of the present disclosure
  • FIG. 4a is a flowchart of a method for controlling terminal side surveying and mapping provided by Embodiment 4 of the present disclosure
  • FIG. 4b is a schematic diagram of the distribution of each photographing location point provided by Embodiment 2 of the present disclosure.
  • FIG. 5 is a flowchart of a method for surveying and mapping a UAV side according to Embodiment 5 of the present disclosure
  • FIG. 6 is a schematic diagram of a mapping device for controlling a terminal side provided by Embodiment 6 of the present disclosure
  • FIG. 7 is a schematic diagram of a surveying and mapping UAV side surveying and mapping device provided by Embodiment 7 of the present disclosure.
  • FIG. 8 is a schematic structural diagram of a control terminal according to Embodiment 8 of the present disclosure.
  • FIG. 1 is a schematic diagram of a surveying and mapping system provided by Embodiment 1 of the present disclosure.
  • the surveying and mapping system includes: a control terminal 10 and a surveying and mapping drone 20, where:
  • the control terminal 10 is set to: determine the mapping parameters that match the surveying and mapping area, and send the mapping parameters to the surveying and mapping UAV 20.
  • the mapping parameters include: a plurality of surveying and sampling points for the surveying and mapping of the drone 20 in the surveying and mapping area;
  • the surveying and mapping UAV 20 is configured to receive the surveying and mapping parameters, perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a collection of surveying and mapping photos corresponding to the multiple surveying and mapping sampling points, and combine and/or stitch the multiple photos in the collection to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • the control terminal 10 may be any device that controls the mapping UAV, such as a remote control of the UAV.
  • the embodiments of the present disclosure do not limit the device type of the control terminal.
  • the surveying and mapping unmanned aerial vehicle 20 may be a drone configured to survey and survey the surveying and mapping area to obtain data related to the surveying and mapping area, such as acquiring multiple surveying and mapping photos of the surveying and mapping area.
  • the surveying and mapping unmanned aerial vehicle 20 is provided with a photographing device, and is set to acquire multiple surveying and mapping photos corresponding to the surveying and mapping area.
  • the mapping system is composed of a control terminal 10 and a mapping drone 20.
  • the control terminal 10 is responsible for determining a plurality of sampling points for surveying and mapping in the surveying area by the mapping drone, and sending the mapping parameters formed by the sampling points to the mapping drone 20.
  • the surveying and mapping UAV 20 can receive the surveying and mapping parameters determined by the control terminal and perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to a plurality of surveying and sampling points included in the surveying and mapping parameters.
  • the surveying and mapping drone can also combine and/or stitch multiple photos in the surveying and photographing photo collection to obtain a surveying and mapping map corresponding to the surveying and mapping area.
  • the surveying and mapping system provided by the embodiments of the present disclosure can greatly reduce the processing time of image data, thereby improving the efficiency of surveying and mapping.
  • the surveying and mapping drone 20 is further configured to generate map tile data corresponding to the surveying and mapping area according to the surveying and mapping map corresponding to the surveying and mapping area.
  • the map tile data is the data used to generate a tile map, formed by slicing the map data with a slicing algorithm.
  • the surveying and mapping drone 20 can not only combine and/or stitch multiple photos in the surveying and mapping photo collection to obtain a surveying and mapping map corresponding to the surveying and mapping area, but can also use a slicing algorithm and related techniques on the obtained map to generate map tile data corresponding to the surveyed area.
  • the map tile data can be used to generate a tile map; the pyramid model formed by the tile map is a multi-resolution hierarchical model in which, from the bottom layer to the top layer of the tile pyramid, the resolution becomes lower and lower while the geographic range represented stays unchanged.
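The pyramid property described here (resolution halves at each level up, while the geographic extent stays constant) can be sketched as follows; the base resolution value is an illustrative assumption:

```python
def pyramid_resolutions(base_resolution, levels):
    """Ground resolution (units per pixel) at each tile-pyramid level.

    Level 0 is the bottom (finest) layer; each level up doubles the
    ground size of a pixel, so one tile at level n + 1 summarizes four
    tiles at level n, while the geographic extent covered is unchanged.
    """
    return [base_resolution * (2 ** level) for level in range(levels)]

pyramid_resolutions(1.0, 4)  # [1.0, 2.0, 4.0, 8.0]
```

This is why a control terminal can pick whichever level matches a drone's resolution requirement without re-processing the underlying surveying and mapping map.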
  • the map tile data generated by the surveying and mapping drone 20 may be used to locate positions in the surveying and mapping area.
  • the surveying and mapping system may further include a working drone; the control terminal 10 is further configured to: use the surveying and mapping area as the work area, obtain map tile data corresponding to the work area from the surveying and mapping drone 20, and generate and display an area map of the work area according to the map tile data.
  • according to at least one area anchor point selected by the user on the area map, at least one work plot is determined in the work area, and a work route corresponding to the work plot is generated and sent to the operating drone;
  • the operating drone is configured to: receive the work route, and perform flight operations in the at least one work plot according to the work route.
  • the operating drone may be a drone that performs operations in the surveying and mapping area according to operational requirements, such as detecting the crops, soil, vegetation or water quality in the surveying and mapping area, or spraying pesticides over it.
  • the control terminal 10 may also use the surveying area as a working area, and acquire map tile data corresponding to the working area from the surveying and mapping drone 20. Since the map tile data includes a variety of map data with different resolutions, the control terminal 10 can generate an area map corresponding to the work area according to the resolution requirements of the work drone according to the map tile data for display.
  • the user can select at least one area anchor point for the area map.
  • the area anchor point may be used to determine at least one work plot within the work area; for example, a 10 m x 10 m square work plot is generated centered on the area anchor point.
  • after the control terminal 10 determines the work plot, it can generate a work route corresponding to the work plot and send it to the operating drone. For example, in a 10 m x 10 m square work plot, starting from the upper-left vertex, the drone travels 1 m in a clockwise direction along the plot boundary every 5 seconds. Different work plots can generate different work routes, which is not limited in the embodiments of the present disclosure.
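The 10 m x 10 m example can be sketched as follows (planar coordinates in meters; the clockwise-from-upper-left ordering follows the example route in the text, and the coordinate convention is an assumption for illustration):

```python
def square_plot(anchor, side=10.0):
    """Four corners of a square work plot centered on an area anchor
    point, ordered clockwise starting from the upper-left vertex
    (the example route's starting point). x points east, y north.
    """
    ax, ay = anchor
    h = side / 2.0
    return [
        (ax - h, ay + h),  # upper-left (route start)
        (ax + h, ay + h),  # upper-right
        (ax + h, ay - h),  # lower-right
        (ax - h, ay - h),  # lower-left
    ]

square_plot((100.0, 200.0))  # 10 m x 10 m plot around (100, 200)
```

Visiting these corners in order and closing the loop back to the start yields the clockwise perimeter route described in the example.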
  • after the operating drone receives the work route, it can perform flight operations in the determined work plot according to the work route.
  • the working principle of the surveying and mapping system in the embodiment of the present disclosure is to determine a plurality of surveying and sampling points in the surveying area through the control terminal and send them to the surveying and mapping unmanned aerial vehicle.
  • the drone can take flight photos in the surveying and mapping area according to the determined surveying and mapping sampling points to obtain a collection of surveying and mapping photos corresponding to the multiple sampling points, then combine and/or stitch the multiple photos in the collection, and finally obtain a complete surveying and mapping map corresponding to the surveying and mapping area.
  • the embodiments of the present disclosure constitute a new surveying and mapping system from a control terminal and a surveying and mapping drone. The control terminal is configured to determine surveying and mapping parameters matching the surveying and mapping area and send them to the surveying and mapping drone; the drone is configured to receive the parameters, perform flight shooting in the area accordingly to obtain a collection of surveying and mapping photos corresponding to the multiple sampling points, and combine and/or stitch the multiple photos to obtain a surveying and mapping map of the area.
  • a new surveying and mapping system and a new surveying and mapping method are thus proposed, in which the overall planning of multiple surveying and mapping sampling points replaces the existing parallel-line movement planning, solving the problems of high cost and low efficiency in existing UAV aerial survey methods and achieving the technical effect of reducing surveying and mapping costs and improving surveying and mapping efficiency.
• FIG. 2 is a flowchart of a control-terminal-side surveying and mapping method provided by Embodiment 2 of the present disclosure.
  • This embodiment can be applied to the case of determining a plurality of surveying and sampling points in a surveying area.
  • the method can be executed by a control-terminal-side surveying and mapping device
  • the device can be implemented by software and / or hardware, and can generally be integrated in a control device (for example, a drone remote control) and used in conjunction with a surveying and mapping drone set up to be responsible for aerial photography.
  • the method includes the following operations:
  • Step 210 Determine the mapping parameters that match the mapping area, where the mapping parameters include: a plurality of mapping sampling points that the mapping drone surveys in the mapping area.
  • the mapping area is an area with a clear latitude and longitude range, which can be an area of any shape and any size.
  • the embodiments of the present disclosure do not limit the shape and size of the mapping area.
• The surveying and mapping parameters matching the surveying and mapping area, that is, the multiple sampling points at which the surveying and mapping drone surveys in the area, can be determined by the control terminal. Determining the multiple sampling points through the control terminal can effectively improve the surveying and mapping efficiency of the entire system.
  • Step 220 Send the mapping parameters to the mapping drone.
• After the control terminal determines the plurality of surveying and mapping sampling points in the surveying and mapping area, it can send them to the surveying and mapping UAV so that the UAV can obtain the corresponding set of surveying and mapping photos according to the sampling points.
• The surveying and mapping photos obtained by the drone at the multiple sampling points have a certain degree of overlap between them, but a certain degree of overlap is not required between every two consecutive photos, so the processing time of the image data can be greatly reduced, thereby improving surveying and mapping efficiency.
• In the embodiment of the present disclosure, the control terminal determines the plurality of sampling points at which the surveying and mapping UAV surveys in the surveying and mapping area and sends the surveying and mapping parameters to the UAV, whereby a new method for determining surveying and mapping sampling points is proposed.
• FIG. 3a is a flowchart of a control-terminal-side surveying and mapping method provided in Embodiment 3 of the present disclosure.
  • one implementation manner of determining surveying and mapping parameters matching a surveying and mapping area is provided.
  • the method of this embodiment may include:
  • Step 310 Determine the mapping parameters that match the mapping area, where the mapping parameters include: a plurality of mapping sampling points that the mapping drone surveys in the mapping area.
  • step 310 may include the following operations:
  • Step 311 Acquire a reference photographing position point corresponding to the surveying and mapping area, and establish a mapping relationship between a photographing point in the combined photographing point set and the reference photographing position point.
  • the reference photographing location point is a location point in the surveying area, which has matching geographic location coordinates.
• The above-mentioned location point can be selected by the user in the surveying and mapping area (for example, by clicking, or by directly inputting latitude and longitude, etc.), or can be determined automatically according to the shape of the surveying and mapping area (for example, the center point of the area or a corner of the area, etc.).
  • the combined shooting point set may be a set of shooting points preset according to a preset distribution rule, and the set may include a plurality of shooting points, and there may be a relative direction and a relative distance relationship between each two shooting points.
• For example, the combined shooting point set includes 5 shooting points, located respectively at the center and the four vertices of a rectangle, where the relative distance between each vertex and the center point is 100 m, and the four vertices lie to the east, south, west, and north of the center point.
  • all the sampling points corresponding to the surveying area may be obtained according to the combined shooting point set.
• One of the points in the surveying and mapping area may first be determined as the reference photographing position point, and then a mapping relationship may be established between the reference photographing position point and one of the shooting points in the combined shooting point set.
• The relative positional relationship between the shooting points in the combined shooting point set is predetermined, but no correspondence with actual geographic location information has been established, so the set cannot be mapped directly into the actual surveying and mapping area. However, as long as one shooting point in the set is given actual geographic location information, the geographic location information of all the shooting points in the set can be determined.
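• As a sketch of this anchoring step: if the set is stored as east/north offsets in metres relative to the central shooting point, fixing one point's real coordinates fixes all the others. The offset table, point names, and the small-area equirectangular metre-to-degree conversion below are illustrative assumptions, not the disclosure's method.

```python
import math

# Illustrative 100 m rectangle layout: centre plus four vertices,
# stored as (east, north) offsets in metres from the central point.
OFFSETS = {
    "center": (0.0, 0.0),
    "ne": (100.0, 100.0), "nw": (-100.0, 100.0),
    "se": (100.0, -100.0), "sw": (-100.0, -100.0),
}

def georeference(anchor_name, anchor_lat, anchor_lon):
    """Latitude/longitude of every shooting point, given one anchored
    point (equirectangular approximation, valid for small areas)."""
    ax, ay = OFFSETS[anchor_name]
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(anchor_lat))
    return {
        name: (anchor_lat + (y - ay) / m_per_deg_lat,
               anchor_lon + (x - ax) / m_per_deg_lon)
        for name, (x, y) in OFFSETS.items()
    }
```

Anchoring any one of the five points (not only the centre) determines the other four, which matches the text's claim that one georeferenced shooting point suffices.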
• Multiple photos taken at the multiple shooting points in the combined shooting point set have overlapping areas between them, and these photos can be combined and/or stitched to form a complete combined area.
  • the combined area may completely cover the surveying area, or may only cover a part of the surveying area, which is not limited in this embodiment.
  • FIG. 3b is a schematic diagram of the location distribution of each shooting point in a combined shooting point set provided in Embodiment 3 of the present disclosure.
• The shooting points in the combined shooting point set include a center shooting point and four surrounding shooting points, where the surrounding shooting points are the four vertices of a rectangle centered on the center shooting point; the photo synthesized from the photos taken at the shooting points in the combined shooting point set is rectangular in shape.
  • the combined shooting point set may include five shooting points, which are a center shooting point and four surrounding shooting points, respectively.
  • the center shooting point may be the center of a rectangle, and correspondingly, the surrounding shooting points may be four vertices of the rectangle corresponding to the center shooting point.
• The shooting points have a certain positional relationship with each other, and this positional relationship is set to satisfy the condition that, when the photos taken at the shooting position points determined by the shooting points are combined, a complete rectangular photo is formed.
• The combination process overlays the photos according to the overlapping images between them.
  • each auxiliary photographing point may rotate around the reference photographing position point according to the user's operation, or move according to the user's sliding operation or the like.
• Since the five shooting points in the selected combined shooting point set are a central shooting point and four surrounding shooting points, each surrounding shooting point only needs to guarantee a sufficient degree of overlap with the central shooting point (for example, 60% or 70%), and such a high degree of overlap is not required between any two surrounding shooting points. This greatly reduces the total number of surveying and mapping photos that need to be taken for a surveying and mapping area of a fixed size, and can therefore greatly reduce the time and hardware cost of subsequent photo synthesis or stitching.
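• The overlap requirement can be made concrete with a small calculation: for two equally sized, axis-aligned ground footprints, the overlap fraction depends only on the offset between their centres. The footprint size and offsets used here are assumed values for illustration.

```python
def overlap_ratio(w, h, dx, dy):
    """Fraction of one photo's area shared with an identical photo
    whose centre is offset by (dx, dy); all units in metres."""
    ox = max(0.0, w - abs(dx))   # overlap width
    oy = max(0.0, h - abs(dy))   # overlap height
    return (ox * oy) / (w * h)
```

For a 100 m × 100 m footprint, a surrounding point offset diagonally by (20, 20) keeps 64% overlap with the centre photo, while two surrounding points on opposite sides of the centre (offset 40 m apart or more from each other) share far less, which is exactly the asymmetry the paragraph above exploits.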
• When the solution of the embodiment of the present disclosure is applied to a small plot (for example, one plot can be completely covered after combining or stitching the multiple photos taken at the shooting points of a single combined shooting point set), the solution can be significantly superior to the related-art parallel-line traversal method in terms of both the number of surveying and mapping points and the difficulty of later stitching.
• Obtaining the reference photographing location point corresponding to the surveying and mapping area may include: detecting a user's touch operation in the human-computer interaction interface and determining a screen location point according to the touch operation; and obtaining, from the map data of the surveying and mapping area currently displayed in the human-computer interaction interface, a geographic location coordinate matching the screen location point as the reference photographing location point.
  • the reference photographing position point may be determined according to the point specified by the user in the human-computer interaction interface.
  • the map data may be latitude and longitude information.
  • detecting a user's touch operation in the human-machine interaction interface and determining a screen position point according to the touch operation may include at least one of the following:
  • the user's touch point is determined as the screen position point
  • a point within the frame generated by the user's touch is selected as the screen position point.
  • determining a screen position point according to the user's touch operation in the human-computer interaction interface may have multiple implementation manners.
  • the touch point corresponding to the single touch operation of the user may be determined as the screen position point.
• A point on the line segment generated by the user's stroke touch operation may also be used as the screen position point, for example, the midpoint of the line segment. It is also possible to use a point inside the frame drawn by the user's box-selection touch operation as the screen position point, for example, the center point of the frame.
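• The three screen-position rules just described (tap point, stroke midpoint, frame centre) can be sketched in a few lines; the function names and pixel-coordinate tuples are illustrative, not taken from the disclosure.

```python
def screen_point_from_tap(p):
    """Single tap: the touch point itself is the screen position point."""
    return p

def screen_point_from_stroke(p0, p1):
    """Stroke: use the midpoint of the drawn line segment."""
    return ((p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2)

def screen_point_from_frame(top_left, bottom_right):
    """Box selection: use the centre of the drawn frame."""
    return ((top_left[0] + bottom_right[0]) / 2,
            (top_left[1] + bottom_right[1]) / 2)
```

The resulting screen point would then be looked up in the currently displayed map data to obtain the matching geographic coordinate, as the surrounding text describes.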
  • acquiring the reference photographing location point corresponding to the surveying and mapping area may include: acquiring the center point of the surveying and mapping region as the reference photographing location point.
• The reference photographing location point may also be generated automatically by the control terminal that controls the surveying and mapping drone. For example, the center point of the surveying and mapping area is directly used as the reference photographing position point.
  • acquiring the reference photographing location point corresponding to the surveying area may further include: acquiring geographic location coordinates input by the user as the reference photographing location point.
  • the geographic location coordinates input by the user can also be directly used as the reference photographing location point.
  • the user can input the geographic location coordinates through a soft keyboard in the human-computer interaction interface, a numeric keyboard in the control terminal, or voice input.
  • obtaining the reference photographing location point corresponding to the surveying and mapping area may include: sending location query information to the surveying and mapping drone, and using the geographic location coordinates fed back by the surveying and mapping drone as The reference photographing position point; wherein, the surveying and mapping unmanned aerial vehicle is preset at a position matching the surveying and mapping area.
  • the reference photographing location point may also be determined through the location information specified by the user.
  • the user can send location query information to the surveying and mapping drone through the control terminal.
  • the user triggers a set identifier on the human-machine interaction interface of the control terminal to send location query information to the surveying and mapping drone to query the current position of the surveying and mapping drone.
  • the surveying and mapping unmanned aerial vehicle obtains the current geographic location coordinates through its own positioning device and feeds them back to the control terminal.
  • the control terminal may directly use the location point corresponding to the received geographic location coordinates as the reference photographing location point.
• When the surveying and mapping UAV sends its geographic coordinates to the control terminal, its projection point on the ground should be inside the surveying and mapping area.
• Optionally, before sending the position query information to the surveying and mapping drone, the method may further include: receiving at least one flight control instruction for the surveying and mapping drone input by the user, and sending the flight control instruction to the surveying and mapping drone; and, when a position confirmation response input by the user is received, sending a hovering instruction to the surveying and mapping drone to control it to hover at the current position; wherein the flight control instruction is set to: control the surveying and mapping drone to move in a set direction and/or by a set distance in the air.
  • the user must input at least one flight control instruction for the mapping drone to the control terminal.
  • the control terminal sends the flight control instruction input by the user to the surveying and mapping drone, so that the surveying and mapping drone travels according to the flight control instruction.
• The control terminal may then send a hovering instruction to the surveying and mapping drone to control it to hover at the current position.
• Establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point may include: establishing a mapping relationship between a shooting point selected by the user in the combined shooting point set and the reference photographing location point.
• That is, the user can arbitrarily select one of the shooting points in the combined shooting point set, and a mapping relationship is established between the shooting point selected by the user and the reference photographing position point.
• Establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point may further include: establishing a mapping relationship between the central shooting point in the combined shooting point set and the reference photographing location point.
• Establishing a mapping relationship between a shooting point in the combined shooting point set and the reference photographing position point may further include: calculating the distance between the reference photographing position point and each positioning key point of the surveying and mapping area, the positioning key points including the corner points of the surveying and mapping area and the center point of the area; acquiring the positioning key point closest to the reference photographing position point as the target reference point; and, according to the position information of the target reference point in the surveying and mapping area, selecting a shooting point matching that position information in the combined shooting point set and establishing a mapping relationship between it and the reference photographing position point.
  • the mapping relationship may also be determined according to the distance relationship between the reference photographing position point and each key point in the surveying area.
• That is, the corner points of the surveying and mapping area and the center point of the area are used as positioning key points, the distance between the reference photographing position point and each positioning key point is calculated, and the positioning key point closest to the reference photographing position point is obtained as the target reference point.
• Then, according to the position information of the target reference point in the surveying and mapping area, a shooting point matching that position information is selected in the combined shooting point set to establish a mapping relationship with the reference photographing position point. For example, if the target reference point is located at the upper left of the surveying and mapping area, the shooting point in the upper left corner can be selected from the combined shooting point set and mapped to the reference photographing position point.
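• A minimal sketch of this nearest-key-point selection, assuming an axis-aligned rectangular surveying and mapping area in a local metre frame (the corner naming, frame, and function signature are assumptions for illustration):

```python
import math

def nearest_key_point(ref, width, height):
    """Return (name, coords) of the positioning key point closest to
    the reference photographing position point `ref`.

    Key points: the four corners and the centre of a rectangle whose
    top-left corner is the origin (x east, y south/negative), metres.
    """
    keys = {
        "nw": (0.0, 0.0), "ne": (width, 0.0),
        "sw": (0.0, -height), "se": (width, -height),
        "center": (width / 2, -height / 2),
    }
    name = min(keys, key=lambda k: math.dist(ref, keys[k]))
    return name, keys[name]
```

The returned name ("nw", "center", etc.) then indicates which shooting point of the combined shooting point set to map to the reference photographing position point, as in the upper-left example above.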
  • Step 312 Determine a plurality of auxiliary photographing location points corresponding to the reference photographing location point according to the preset relative position relationship between each photographing point in the combined photographing point set and the mapping relationship.
  • the auxiliary photographing location point may be other location points in the surveying area that are different from the reference photographing location point.
• That is, after the mapping relationship is established, the multiple auxiliary photographing location points corresponding to the reference photographing location point can be further determined according to the preset relative position relationship between the shooting points in the combined shooting point set and the determined mapping relationship.
• For example, the combined shooting point set includes a total of 5 shooting points. If the central shooting point in the set establishes a mapping relationship with the reference photographing position point, the other four auxiliary photographing position points corresponding to the reference photographing position point are determined according to the positional relationship between the other four shooting points in the set and the central shooting point.
  • Step 313 Use the reference photographing location point and the plurality of auxiliary photographing location points as a plurality of surveying sampling points for the surveying and mapping drone to survey in the surveying area.
  • the reference photographing position point and the auxiliary photographing position point can be used as the surveying and mapping sampling points for the surveying and mapping of the drone in the surveying and mapping area.
• The surveying and mapping UAV can perform aerial photography at each surveying and mapping sampling point and send the photos obtained to the corresponding control terminal or ground control terminal, so that the control terminal can synthesize the obtained photos into the final surveying map.
  • the mapping drone can realize the synthesis of multiple photos in the local machine.
• The photographs taken at the surveying and mapping sampling points obtained by the sampling-point planning method provided in the embodiments of the present disclosure do not require a certain degree of overlap between every two consecutive photographs, so the processing time of the image data can be greatly reduced.
  • Step 320 Send the mapping parameters to the mapping drone.
• In the embodiment of the present disclosure, a shooting point in the combined shooting point set is mapped to the reference photographing position point, a plurality of auxiliary photographing position points corresponding to the reference photographing position point are determined according to the preset relative position relationship between the shooting points in the set and the mapping relationship, and the reference photographing position point and the plurality of auxiliary photographing position points are then used as the surveying and mapping sampling points of the drone in the surveying and mapping area. A new planning method for surveying and mapping sampling points is thus proposed: the overall planning of multiple sampling points based on combined shooting point sets replaces the existing parallel-line traversal planning method, solving the problems of high cost and low surveying efficiency in existing UAV aerial survey methods and achieving the technical effect of reducing surveying and mapping costs and improving surveying and mapping efficiency.
• FIG. 4a is a flowchart of a control-terminal-side surveying and mapping method provided in Embodiment 4 of the present disclosure.
  • the method of this embodiment may include:
  • Step 410 Determine the mapping parameters that match the mapping area, where the mapping parameters include: a plurality of mapping sampling points that the mapping drone surveys in the mapping area.
  • step 410 may include the following operations:
• Step 411 Determine one or more surveying and mapping combined shooting areas in the surveying and mapping area according to the combined shooting area corresponding to the combined shooting point set and the surveying and mapping area information.
• The combined shooting area may be the area formed by synthesizing the photos taken at the shooting points in the combined shooting point set; that is, the combined shooting area is the overall photographing area that the combined shooting point set can capture.
  • the mapping area information may be related information of the mapping area, such as the area shape or size of the mapping area.
• The surveying and mapping combined shooting area may be a shooting area of the same size as the combined shooting area, and each surveying and mapping combined shooting area corresponds to an actual shooting range within the parcel; that is, a surveying and mapping combined shooting area contains two key pieces of information: the size of the area and its geographic location.
• The combined shooting area corresponding to the combined shooting point set should therefore be obtained first, and then one or more surveying and mapping combined shooting areas can be determined according to the combined shooting area and the size information of the surveying and mapping area.
• If there is one surveying and mapping combined shooting area, it should completely cover the surveying and mapping area; if there are multiple surveying and mapping combined shooting areas, they should completely cover the surveying and mapping area after being combined.
• For example, if the combined shooting area is a square of 100 m × 100 m and the surveying and mapping area is a rectangle of 100 m × 200 m, at least two surveying and mapping combined shooting areas are needed to completely cover the surveying and mapping area.
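• The 100 m × 200 m example generalises to a simple count along each axis, assuming square, axis-aligned combined shooting areas and a given overlap fraction between adjacent areas (both assumptions made here for illustration):

```python
import math

def tiles_needed(span, tile, overlap=0.0):
    """Number of combined shooting areas of side `tile` needed to cover
    a span of length `span`, with adjacent areas overlapping by the
    fraction `overlap` of the tile side. All lengths in metres."""
    if span <= tile:
        return 1
    stride = tile * (1.0 - overlap)      # advance per additional area
    return 1 + math.ceil((span - tile) / stride)
```

With no overlap, covering 200 m with 100 m areas takes 2 areas, matching the example; requiring 50% overlap between neighbours raises the count to 3.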
• Optionally, multiple photos taken at the multiple shooting points in the combined shooting point set have overlapping areas between them, and/or there are overlapping areas between the surveying and mapping combined shooting areas; the surveying and mapping combined shooting area is a shooting area formed by combining and/or stitching the multiple photos taken at the multiple shooting points in the combined shooting point set; and the surveying and mapping combined shooting areas are combined and/or stitched to form the surveying map of the surveying and mapping area.
• The surveying and mapping combined shooting area is essentially the same as the combined shooting area, except that the combined shooting area has not established a correspondence with the surveying and mapping area, whereas the surveying and mapping combined shooting area is an individual shooting area formed by division within the surveying and mapping area, with the same shape and size as the combined shooting area.
• The overlapping area between the surveying and mapping combined shooting areas can be set according to actual needs; for example, the overlapping area accounts for 30% or 50% of a surveying and mapping combined shooting area, which is not limited in the embodiments of the present disclosure.
• In order to enable the photos acquired by the surveying and mapping drone to be stitched into an image of the complete surveying and mapping area, optionally, there should be overlapping areas between the multiple photos taken by the drone at the multiple shooting points in the combined shooting point set.
  • multiple photos can be combined and / or stitched to form a complete combined area.
• The combined area may completely cover the surveying and mapping area, or may cover only a part of it, which is not limited in this embodiment. It should be noted that, for the multiple photos in the embodiments of the present disclosure, overlapping areas are not required between every two consecutive photos.
  • each photo obtained by the surveying and mapping UAV can be synthesized according to the overlapping part to form a complete image
• Optionally, determining one or more surveying and mapping combined shooting areas in the surveying and mapping area according to the combined shooting area corresponding to the combined shooting point set and the surveying and mapping area information may include: selecting a positioning point in the surveying and mapping area; determining one surveying and mapping combined shooting area in the surveying and mapping area based on the positioning point and the combined shooting area; and, if that surveying and mapping combined shooting area cannot completely cover the surveying and mapping area, selecting a new positioning point in the surveying and mapping area and returning to perform the operation of determining a surveying and mapping combined shooting area based on the positioning point and the combined shooting area, until all the surveying and mapping combined shooting areas that can completely cover the surveying and mapping area are determined.
• The positioning point may be a position point in the surveying and mapping area that is used to position the surveying and mapping combined shooting area within the surveying and mapping area.
  • the positioning point may be a position point selected in the surveying area according to actual needs, such as a corner point or a center point of the surveying area.
• A surveying and mapping combined shooting area can first be determined in the surveying and mapping area through a positioning point. For example, if the surveying and mapping area is rectangular, the top-left vertex of the area can be selected as the positioning point; when the top-left vertex of the combined shooting area coincides with the positioning point, the combined shooting area forms a corresponding surveying and mapping combined shooting area in the surveying and mapping area.
• If one surveying and mapping combined shooting area cannot completely cover the surveying and mapping area, a new positioning point should be selected in the surveying and mapping area, and the operation of determining a surveying and mapping combined shooting area based on the positioning point and the combined shooting area should be performed again, until all the surveying and mapping combined shooting areas that can completely cover the surveying and mapping area are determined. It should be noted that, when selecting a new positioning point, there should be an overlapping area between the surveying and mapping combined shooting area determined by the new positioning point and the adjacent surveying and mapping combined shooting areas.
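• The selection loop described above can be sketched as a row-by-row placement over a rectangular surveying and mapping area. Clamping the last area to the boundary and the particular overlap fraction are illustrative choices of this sketch, not mandated by the disclosure.

```python
def plan_tiles(area_w, area_h, tile, overlap=0.3):
    """Top-left anchors (positioning points) of square combined
    shooting areas of side `tile` that together cover an area_w x
    area_h rectangle, with fraction `overlap` between neighbours.

    Local frame: x east, y south, origin at the area's top-left;
    lengths in metres.
    """
    stride = tile * (1.0 - overlap)
    anchors, y = [], 0.0
    while True:
        x = 0.0
        while True:
            # Clamp so the final area in a row/column stays inside.
            anchors.append((min(x, max(area_w - tile, 0.0)),
                            min(y, max(area_h - tile, 0.0))))
            if x + tile >= area_w:
                break
            x += stride
        if y + tile >= area_h:
            break
        y += stride
    return anchors
```

For the earlier 100 m × 200 m example with 50% overlap this yields three anchors along the long axis, and a 100 m × 100 m area needs a single anchor.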
• Optionally, before determining one or more surveying and mapping combined shooting areas in the surveying and mapping area according to the combined shooting area corresponding to the combined shooting point set and the surveying and mapping area information, the method may further include: detecting a user's touch operation in the human-computer interaction interface and obtaining a screen selection area matching the touch operation; and, in the map data currently displayed in the human-computer interaction interface, obtaining a geographic location area matching the screen selection area as the surveying and mapping area information.
• The screen selection area may be an area formed by the user's touch operation in the human-machine interaction interface of the control terminal of the surveying and mapping drone. It can be an area of any shape and size (not exceeding the size of the screen); the embodiments of the present disclosure do not limit the shape and size of the screen selection area.
• The surveying and mapping area may be designated in real time by the user who controls the surveying and mapping drone. For example, the user's touch operation in the human-computer interaction interface is detected to obtain a screen selection area matching the touch operation, the geographic location area matching the screen selection area is determined according to the map data currently displayed in the interface, and the determined geographic location area is used as the surveying and mapping area information.
  • detecting a user's touch operation in the human-computer interaction interface and acquiring a screen selection area matching the touch operation may include:
  • the closed area enclosed by the connection line of at least three touch points of the user is determined as the screen selection area; and / or
  • the frame generated by the user's touch is used as the screen selection area.
  • the closed area formed by the detected single touch operation of the user may be used as the screen selection area matching the touch operation.
  • the closed area surrounded by the connection line of at least three touch points of the user is determined as the screen selection area.
  • the frame generated by the detected frame touch operation of the user may also be used as the screen selection area.
  • Step 412 Determine multiple photographing location points in the surveying and mapping combined shooting area according to a preset relative position relationship between each shooting point in the combined shooting point set.
  • the photographing location point may be a location point in the surveying area, with matching geographic location coordinates.
  • the photographing position point may be determined according to a preset relative position relationship between each shooting point in the combined shooting point set.
• Optionally, determining a plurality of photographing position points in the surveying and mapping combined shooting area may include: mapping the central shooting point in the combined shooting point set to the area midpoint of the surveying and mapping combined shooting area and using the area midpoint as a photographing location point; and, according to the preset relative position relationship between each surrounding shooting point in the combined shooting point set and the central shooting point, mapping each surrounding shooting point into the surveying and mapping combined shooting area and using the resulting multiple mapped points as photographing location points.
• That is, each shooting point in the combined shooting point set corresponding to the combined shooting area can be mapped into the surveying and mapping combined shooting area and used as a photographing location point.
• The central shooting point in the combined shooting point set may first be mapped to the area midpoint of the surveying and mapping combined shooting area, so that the area midpoint is used as a photographing location point. Then, according to the relative positional relationship between each surrounding shooting point in the combined shooting point set and the central shooting point, each surrounding shooting point is mapped into the surveying and mapping combined shooting area, and the resulting multiple mapped points are used as photographing location points.
• FIG. 4b is a schematic diagram of the distribution of photographing location points provided in Embodiment 4 of the present disclosure.
• As shown in FIG. 4b, the two center points 40 and 50 are the area midpoints of two surveying and mapping combined shooting areas, respectively. The midpoint 40 and the four surrounding photographing position points 410 belong to one surveying and mapping combined shooting area, and the midpoint 50 and the four surrounding photographing position points 510 belong to another surveying and mapping combined shooting area.
  • the relative positional relationship between the midpoint of the two shooting combined shooting areas and the surrounding photographing position points is the same as the preset relative positional relationship between each surrounding shooting point in the combined shooting point set and the central shooting point.
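The mapping described above (central shooting point to the area midpoint, surrounding shooting points by their preset offsets) can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the function name, coordinate representation, and offset values are assumptions.

```python
# Hypothetical sketch: map a combined shooting point set into one
# surveying-and-mapping combined shooting area.

def map_shooting_points(area_midpoint, relative_offsets):
    """Return the photographing position points for one combined shooting area.

    area_midpoint    -- (x, y) midpoint of the combined shooting area
    relative_offsets -- preset offsets of each surrounding shooting point
                        from the central shooting point
    """
    cx, cy = area_midpoint
    # The central shooting point maps to the area midpoint itself.
    points = [(cx, cy)]
    # Each surrounding shooting point keeps its preset offset from the center.
    points += [(cx + dx, cy + dy) for dx, dy in relative_offsets]
    return points

# Illustrative offsets: four surrounding points at the vertices of a
# 20 m x 20 m square centered on the central shooting point.
offsets = [(-10, 10), (-10, -10), (10, 10), (10, -10)]
print(map_shooting_points((100, 200), offsets))
```

Applying this once per combined shooting area yields the full distribution of photographing position points shown in FIG. 4b.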
• Step 413: Use the plurality of photographing position points as the plurality of surveying and mapping sampling points surveyed by the surveying and mapping drone in the surveying and mapping area.
• the photographing position points can be used as the surveying and mapping sampling points at which the surveying and mapping drone surveys the surveying and mapping area.
• the surveying and mapping drone can perform aerial photography at each surveying and mapping sampling point and send the captured photos to the corresponding control terminal or ground terminal, so that the control terminal can synthesize the photos into the final surveying and mapping map.
• alternatively, the surveying and mapping drone can synthesize the multiple photos locally.
• Step 420: Send the surveying and mapping parameters to the surveying and mapping drone.
• before sending the surveying and mapping parameters to the surveying and mapping drone, the method may further include: acquiring the shooting parameters of the photographing device carried by the surveying and mapping drone, where the shooting parameters include a single-photo shooting area of the surveying and mapping drone at a set flight height, and each shooting point corresponds to one single-photo shooting area; and determining the preset relative position relationship between the shooting points in the combined shooting point set according to a preset photo overlap index and the single-photo shooting area. The surveying and mapping parameters further include the flight height, which is set to instruct the surveying and mapping drone to perform flight shooting in the surveying and mapping area at the flight height.
• before acquiring the shooting parameters of the photographing device carried by the surveying and mapping drone, the method may further include: calculating the set flight height according to the pixel width of the photographing device, the lens focal length of the photographing device, and the ground pixel resolution.
• the single-photo shooting area is the actual surveying and mapping area that a single photo can capture.
• the preset photo overlap index may be an overlap index set according to actual needs, such as 50%, 60%, or 70%. Although the embodiments of the present disclosure do not limit the value of the preset photo overlap index, it should satisfy the requirement that, when the photos are synthesized according to their overlapping parts, a complete rectangle can be formed.
• the single-photo shooting area of the surveying and mapping drone at the set flight height must be determined first, and
• the preset relative position relationship between the shooting points in the combined shooting point set is then determined according to the size of the single-photo shooting area.
• each shooting point corresponds to one single-photo shooting area; for example, the shooting point is the midpoint, or one of the vertices, of the single-photo shooting area.
• the preset relative position relationship between the shooting points in the combined shooting point set can thus be determined according to the preset photo overlap index and the single-photo shooting area.
• the surveying and mapping parameters in the embodiments of the present disclosure may further include the flight height, which is set to instruct the surveying and mapping drone to perform flight shooting in the surveying and mapping area at the flight height.
• the flight height of the surveying and mapping drone directly affects the ground pixel resolution.
• the ground pixel resolution in turn directly determines the area of the surveying and mapping region that a single photo can cover. Therefore, before using the surveying and mapping drone for aerial photography of the surveying and mapping area, the set flight height of the drone must first be determined.
• the set flight height of the surveying and mapping drone can be calculated according to the pixel width of the photographing device, the lens focal length of the photographing device, and the ground pixel resolution.
• ground pixel resolution = flight height × pixel width / lens focal length
• flight height = ground pixel resolution × lens focal length / pixel width
• pixel width = sensor size width of the photographing device / frame width
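The flight-height relations above can be checked numerically. The sketch below is only an illustration of the arithmetic; the pixel width, focal length, and ground resolution values are assumed example numbers, not parameters from the disclosure.

```python
# Hedged sketch of the flight-height / ground-resolution relations.

def flight_height(ground_resolution_m, focal_length_mm, pixel_width_mm):
    # flight height = ground pixel resolution * lens focal length / pixel width
    return ground_resolution_m * focal_length_mm / pixel_width_mm

def ground_resolution(height_m, pixel_width_mm, focal_length_mm):
    # ground pixel resolution = flight height * pixel width / lens focal length
    return height_m * pixel_width_mm / focal_length_mm

# Assumed example: 0.05 m ground resolution, 35 mm lens, 0.0042 mm pixel width.
h = flight_height(0.05, 35.0, 0.0042)
print(round(h, 1))  # set flight height in metres, about 416.7
```

Note that the two formulas are inverses of each other: substituting the computed flight height back into `ground_resolution` recovers the original 0.05 m.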
• acquiring the shooting parameters of the photographing device carried by the surveying and mapping drone may include: calculating the single-photo shooting area of the surveying and mapping drone at the set flight height according to the pixel width of the photographing device, the frame size of the photographing device, and the ground pixel resolution.
• that is, the single-photo shooting area of the surveying and mapping drone at the set flight height may be calculated according to the pixel width of the photographing device, the frame size of the photographing device, and the ground pixel resolution.
• single-photo shooting area = ground pixel resolution × frame size, where
• ground pixel resolution = flight height × pixel width / lens focal length.
• single-photo shooting length = ground pixel resolution × frame length
• single-photo shooting width = ground pixel resolution × frame width. For example, if the frame size is 3456 × 4608 and the ground pixel resolution is 0.05 m, the single-photo shooting area is 172.8 m × 230.4 m.
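The single-photo area computation above, using the example numbers from the text (frame size 3456 × 4608 pixels, ground pixel resolution 0.05 m), can be sketched as:

```python
# Sketch of the single-photo shooting area formula; the function name is
# illustrative, the example numbers come from the text above.

def single_photo_area(frame_length_px, frame_width_px, ground_resolution_m):
    # single-photo shooting length = ground pixel resolution * frame length
    # single-photo shooting width  = ground pixel resolution * frame width
    return (frame_length_px * ground_resolution_m,
            frame_width_px * ground_resolution_m)

length_m, width_m = single_photo_area(3456, 4608, 0.05)
print(round(length_m, 1), round(width_m, 1))  # -> 172.8 230.4
```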
• determining the preset relative position relationship between the shooting points in the combined shooting point set according to the preset photo overlap index and the single-photo shooting area may include: determining the size of a single photo according to the frame size of the photographing device and the pixel width of the photographing device; constructing a two-dimensional coordinate system, and selecting a target point in the two-dimensional coordinate system as the central shooting point; generating a center photo in the two-dimensional coordinate system according to the central shooting point and the size of the single photo; generating, at the upper left corner, the lower left corner, the upper right corner, and the lower right corner of the center photo, four surrounding photos that each meet the preset photo overlap index with the center photo; determining, according to the mapping relationship between the size of the single photo and the single-photo shooting area, the coordinate values of the surrounding shooting point corresponding to each surrounding photo in the two-dimensional coordinate system; and determining, according to the coordinate values of the central shooting point and each surrounding shooting point in the two-dimensional coordinate system, the preset relative position relationship between the shooting points in the combined shooting point set.
• the target point may be any point in the two-dimensional coordinate system.
• for example, the target point may be the origin of the two-dimensional coordinate system.
• the center photo and its four surrounding photos are not real photos, but rectangular areas with the same size and shape as a single photo.
• the coordinate values of the surrounding shooting point corresponding to each surrounding photo in the two-dimensional coordinate system can be determined according to the mapping relationship between the size of the single photo and the single-photo shooting area.
• for example, suppose the single photo size is 10 cm × 10 cm,
• the photo overlap index is 50%,
• the surrounding photos at the upper left, lower left, upper right, and lower right corners are mapped to the single-photo shooting areas at the corresponding corners,
• and the mapping relationship between the size of the single photo and the single-photo shooting area is 1:200, so that each single-photo shooting area is correspondingly 20 m × 20 m. If the midpoint of each surrounding photo is taken as the corresponding surrounding shooting point and the central shooting point is at the coordinate origin, the coordinate values of the surrounding shooting points can be (-10, 10), (-10, -10), (10, 10), and (10, -10), in meters.
• the preset relative position relationship between the shooting points in the combined shooting point set can then be determined according to the coordinate values of the central shooting point and each surrounding shooting point in the two-dimensional coordinate system.
• in this example, the relative distance between the surrounding shooting points located at adjacent vertices of the combined shooting point set is 20 m,
• and the relative distance between the central shooting point at the center and each surrounding shooting point is 10√2 m, approximately 14.14 m.
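The construction of the surrounding shooting point coordinates for the worked example above (20 m × 20 m single-photo area, 50% overlap index, central shooting point at the origin) might be sketched as follows; the helper name and the offset rule of (1 − overlap) × photo size are illustrative assumptions consistent with the example's numbers.

```python
import math

def surrounding_points(photo_length_m, photo_width_m, overlap):
    """Corner surrounding shooting points for a center photo at the origin.

    Each surrounding photo is assumed to be shifted from the center photo by
    (1 - overlap) * photo size along each axis, so its midpoint (the
    surrounding shooting point) sits at that offset.
    """
    dx = (1 - overlap) * photo_length_m
    dy = (1 - overlap) * photo_width_m
    # Upper-left, lower-left, upper-right, lower-right.
    return [(-dx, dy), (-dx, -dy), (dx, dy), (dx, -dy)]

pts = surrounding_points(20.0, 20.0, 0.5)
print(pts)  # -> [(-10.0, 10.0), (-10.0, -10.0), (10.0, 10.0), (10.0, -10.0)]
# Center-to-surrounding distance in this example:
print(round(math.hypot(*pts[0]), 2))  # -> 14.14
```

These coordinate values, together with the origin as the central shooting point, fix the preset relative position relationship of the combined shooting point set.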
• the embodiments of the present disclosure propose a new method of planning surveying and mapping sampling points, which uses an overall planning method of multiple surveying and mapping points based on a combined shooting point set to replace the existing parallel-line movement planning method, thereby solving the problems of high cost and low surveying efficiency in existing UAV aerial survey methods
• and achieving the technical effect of reducing the cost of surveying and mapping and improving surveying and mapping efficiency.
  • FIG. 5 is a flowchart of a method for surveying and mapping a UAV side according to Embodiment 5 of the present disclosure.
• This embodiment can be applied to a situation where a set of surveying and mapping photos corresponding to multiple surveying and mapping sampling points is captured. The method may be executed by a drone-side surveying and mapping device, which can be implemented by software and/or hardware, can generally be integrated in the drone equipment, and is used in conjunction with a control terminal set to control the drone.
  • the method includes the following operations:
• Step 510: Receive the surveying and mapping parameters sent by the control terminal, where the surveying and mapping parameters are determined by the control terminal according to the surveying and mapping area and include a plurality of surveying and mapping sampling points surveyed by the surveying and mapping drone in the surveying and mapping area.
• after the control terminal determines the surveying and mapping parameters matching the surveying and mapping area, that is, the plurality of surveying and mapping sampling points surveyed by the surveying and mapping drone in the surveying and mapping area, the determined sampling points can be sent to the surveying and mapping drone.
• before receiving the surveying and mapping parameters sent by the control terminal, the method may further include: receiving at least one flight control instruction sent by the control terminal, and moving in the air in a set direction and/or a set distance according to the flight control instruction; hovering at the current position according to a hovering instruction sent by the control terminal; and feeding back the geographic position coordinates of the current position to the control terminal according to position query information sent by the control terminal, where the geographic position coordinates are set for the control terminal to determine a reference photographing position point.
• when the control terminal determines the reference photographing position point through position information specified by the user, the user must input at least one flight control instruction for the surveying and mapping drone into the control terminal.
• the control terminal sends the flight control instruction input by the user to the surveying and mapping drone, and the surveying and mapping drone travels according to the received flight control instruction, that is, moves in the air in a set direction and/or a set distance.
• the control terminal may send a hovering instruction to the surveying and mapping drone to control it to hover at the current position.
• the control terminal then sends position query information to the surveying and mapping drone, and the surveying and mapping drone can feed back the geographic position coordinates of the current position to the control terminal.
• the control terminal may use the geographic position coordinates fed back by the surveying and mapping drone as the reference photographing position point.
• Step 520: Perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain a set of surveying and mapping photos corresponding to the plurality of surveying and mapping sampling points.
• the surveying and mapping drone may perform flight shooting in the surveying and mapping area according to the surveying and mapping parameters, which include the plurality of surveying and mapping sampling points sent by the control terminal, so as to obtain the set of surveying and mapping photos corresponding to those sampling points.
• the surveying and mapping parameters may further include the flight height. In that case, performing flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain the set of surveying and mapping photos corresponding to the plurality of surveying and mapping sampling points
• may include: performing flight shooting in the surveying and mapping area at the flight height according to the surveying and mapping parameters to obtain the set of surveying and mapping photos corresponding to the plurality of surveying and mapping sampling points.
• that is, the surveying and mapping parameters in the embodiments of the present disclosure may further include the flight height, which is set to instruct the surveying and mapping drone to perform flight shooting in the surveying and mapping area at the flight height, thereby obtaining the set of surveying and mapping photos corresponding to the multiple surveying and mapping sampling points.
• performing flight shooting in the surveying and mapping area according to the surveying and mapping parameters to obtain the set of surveying and mapping photos corresponding to the plurality of surveying and mapping sampling points may include: determining, according to the geographic position information of each surveying and mapping sampling point, when the drone has flown to each sampling point, and taking the surveying and mapping photo corresponding to that sampling point, so as to form the set of surveying and mapping photos.
• the surveying and mapping drone can fly to each surveying and mapping sampling point according to its geographic position information, and each time it reaches a sampling point, the photographing device can be called to take a photo.
• the surveying and mapping photos corresponding to the sampling points constitute the set of surveying and mapping photos.
• Step 530: Combine and/or stitch the multiple photos in the set of surveying and mapping photos to obtain a surveying and mapping map corresponding to the surveying and mapping area.
• the surveying and mapping drone can combine and/or stitch the multiple photos in the set of surveying and mapping photos to obtain a complete surveying and mapping map corresponding to the surveying and mapping area.
• combining and/or stitching the multiple photos in the set of surveying and mapping photos to obtain the surveying and mapping map corresponding to the surveying and mapping area may include: obtaining the center surveying photo taken at each of at least one central shooting point, and the surrounding surveying photos taken at the multiple surrounding shooting points associated with each central shooting point; stitching each center surveying photo and its corresponding surrounding surveying photos into a combined photo according to the degree of photo overlap between each surrounding surveying photo and the corresponding center surveying photo; and obtaining the surveying and mapping map corresponding to the surveying and mapping area according to the combined photo corresponding to each central shooting point.
• the surveying and mapping drone can obtain, from the set of surveying and mapping photos, the center surveying photo taken at each of at least one central shooting point and the surrounding surveying photos taken at the multiple surrounding shooting points associated with each central shooting point, and then stitch each center surveying photo and its corresponding surrounding surveying photos into a combined photo according to the degree of photo overlap between each surrounding surveying photo and the corresponding center surveying photo. It can be seen that, in the embodiments of the present disclosure, when the photos acquired by the surveying and mapping drone are stitched, stitching is not performed according to the degree of overlap between every two consecutive photos, so the processing time of the image data can be greatly reduced, thereby improving surveying and mapping efficiency.
• if the center surveying photos and the corresponding surrounding surveying photos are stitched into a single combined photo, that combined photo is the surveying and mapping map corresponding to the surveying and mapping area; if multiple combined photos are formed,
• the multiple combined photos are stitched according to a certain degree of overlap, and the resulting photo is the surveying and mapping map corresponding to the surveying and mapping area.
• the surveying and mapping map of the surveying and mapping area may include at least one of the following: a digital surface model of the surveying and mapping area, a three-dimensional map of the surveying and mapping area, and a planar map of the surveying and mapping area.
• that is, the surveying and mapping map obtained by combining and/or stitching the multiple photos in the set of surveying and mapping photos may be a digital surface model, a three-dimensional map, or a planar map corresponding to the surveying and mapping area.
• after obtaining the surveying and mapping map corresponding to the surveying and mapping area, the method may further include: sending the surveying and mapping map corresponding to the surveying and mapping area to the control terminal and/or a ground terminal.
• that is, the surveying and mapping drone may send the surveying and mapping map corresponding to the surveying and mapping area to the control terminal and/or the ground terminal.
• the control terminal may use the surveying and mapping map as a work area, determine at least one work plot, generate a work route corresponding to the work plot, and send it to a work drone.
• the ground terminal can apply the surveying and mapping map in other ways according to actual needs, for example, performing regional analysis on the surveying and mapping area according to the geographic information data of the surveying and mapping map and the climatic conditions corresponding to the surveying and mapping area.
• by receiving the plurality of surveying and mapping sampling points surveyed in the surveying and mapping area sent by the control terminal, performing flight shooting in the surveying and mapping area according to those sampling points to obtain the set of surveying and mapping photos corresponding to them, and combining and/or stitching the multiple photos in the set to obtain the surveying and mapping map corresponding to the surveying and mapping area,
• the embodiments of the present disclosure propose a new method for acquiring a surveying and mapping map.
• the overall planning method of multiple surveying and mapping sampling points is used to replace the existing parallel-line
• movement planning method, which solves the problems of high cost and low surveying efficiency in existing UAV aerial survey methods and realizes the technical effect of reducing surveying and mapping cost and improving surveying and mapping efficiency.
• FIG. 6 is a schematic diagram of a control-terminal-side surveying and mapping device provided by Embodiment 6 of the present disclosure. As shown in FIG. 6, the device includes a mapping parameter determination module 610 and a mapping parameter sending module 620, in which:
• the mapping parameter determination module 610 is configured to determine the surveying and mapping parameters matching the surveying and mapping area, where the surveying and mapping parameters include a plurality of surveying and mapping sampling points surveyed by the surveying and mapping drone in the surveying and mapping area;
• the mapping parameter sending module 620 is configured to send the surveying and mapping parameters to the surveying and mapping drone.
• by using the control terminal to determine the plurality of surveying and mapping sampling points surveyed by the surveying and mapping drone in the surveying and mapping area and sending the surveying and mapping parameters to the surveying and mapping drone, the embodiments of the present disclosure propose a new method for determining surveying and mapping sampling points.
• the mapping parameter determination module 610 includes: a photographing position point acquisition unit, which is set to acquire a reference photographing position point corresponding to the surveying and mapping area and establish a mapping relationship between one shooting point in the combined shooting point set and the reference photographing position point; an auxiliary photographing position point determination unit, which is set to determine a plurality of auxiliary photographing position points corresponding to the reference photographing position point according to the preset relative position relationship between the shooting points in the combined shooting point set and the mapping relationship; and a first surveying and mapping sampling point determination unit, which is set to use the reference photographing position point and the plurality of auxiliary photographing position points as the plurality of surveying and mapping sampling points surveyed by the surveying and mapping drone in the surveying and mapping area.
• the mapping parameter determination module 610 includes: a surveying and mapping combined shooting area determination unit, which is configured to determine one or more surveying and mapping combined shooting areas in the surveying and mapping area according to the combined shooting area corresponding to the combined shooting point set and the surveying and mapping area information; a photographing position point determination unit, which is configured to determine a plurality of photographing position points in the surveying and mapping combined shooting areas according to the preset relative position relationship between the shooting points in the combined shooting point set; and a second surveying and mapping sampling point determination unit, which is set to use the plurality of photographing position points as the plurality of surveying and mapping sampling points surveyed by the surveying and mapping drone in the surveying and mapping area.
• there are overlapping areas between the multiple photos taken according to the multiple shooting points in the combined shooting point set, and/or
• the surveying and mapping combined shooting area is the shooting area obtained after the multiple photos taken according to the multiple shooting points in the combined shooting point set are combined and/or stitched.
• the shooting points in the combined shooting point set include a central shooting point and four surrounding shooting points, the surrounding shooting points being the four vertices of a rectangle centered on the central shooting point; the shape of the composite photo obtained by shooting according to each shooting point in the combined shooting point set is rectangular.
• the photographing position point acquisition unit is set to: detect a user's touch operation in a human-machine interaction interface and determine a screen position point according to the touch operation; and acquire, from the map data of the surveying and mapping area currently displayed in the human-machine interaction interface, a geographic position coordinate matching the screen position point as the reference photographing position point.
• the photographing position point acquisition unit is set to: if it is detected that the user's touch operation is a single-point touch operation, determine the user's touch point as the screen position point; and/or, if it is detected that the user's touch operation is a frame-drawing touch operation, select a point within the frame generated by the user's touch as the screen position point.
• the photographing position point acquisition unit is configured to acquire the center point of the surveying and mapping area as the reference photographing position point.
• the photographing position point acquisition unit is set to: send position query information to the surveying and mapping drone and use the geographic position coordinates fed back by the surveying and mapping drone as the reference photographing position point, where the surveying and mapping drone is preset at a position matching the surveying and mapping area.
• the device further includes: a flight control instruction sending module, configured to receive at least one flight control instruction for the surveying and mapping drone input by a user and send the flight control instruction to the surveying and mapping drone; and a hovering instruction sending module, set to send a hovering instruction to the surveying and mapping drone when a position confirmation response input by the user is received, to control the surveying and mapping drone to hover at the current position; where the flight control instruction is set to control the surveying and mapping drone to move in the air in a set direction and/or a set distance.
• the photographing position point acquisition unit is set to acquire the geographic position coordinates input by the user as the reference photographing position point.
• the photographing position point acquisition unit is configured to establish a mapping relationship between a shooting point selected by the user in the combined shooting point set and the reference photographing position point.
• the photographing position point acquisition unit is configured to establish a mapping relationship between the central shooting point in the combined shooting point set and the reference photographing position point.
• the photographing position point acquisition unit is set to: calculate the distance between the reference photographing position point and each positioning key point of the surveying and mapping area, where the positioning key points include the corner points of the surveying and mapping area and the center point of the surveying and mapping area; acquire the positioning key point closest to the reference photographing position point as a target reference point; and select, according to the position information of the target reference point in the surveying and mapping area, a shooting point matching the position information from the combined shooting point set to establish a mapping relationship with the reference photographing position point.
• the surveying and mapping combined shooting area determination unit is set to: select a positioning point in the surveying and mapping area; determine one surveying and mapping combined shooting area in the surveying and mapping area according to the positioning point and the combined shooting area; and, if the determined surveying and mapping combined shooting areas cannot completely cover the surveying and mapping area, select a new positioning point in the surveying and mapping area and return to the operation of determining one surveying and mapping combined shooting area in the surveying and mapping area according to the positioning point and the combined shooting area, until all of the surveying and mapping combined shooting areas together can completely cover the surveying and mapping area.
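The positioning-point selection loop described by this unit might be sketched as follows, much simplified for illustration: the surveying and mapping area is reduced to a 1-D strip, and each combined shooting area covers a fixed length. These simplifications and all names are assumptions, not the disclosed 2-D implementation.

```python
# Hypothetical 1-D sketch of the coverage loop: keep selecting positioning
# points until the combined shooting areas cover the whole surveying area.

def plan_positioning_points(area_length_m, combined_area_length_m):
    points = []
    covered = 0.0
    while covered < area_length_m:
        # Select the next positioning point so its combined shooting area
        # starts where the already-covered part ends.
        center = covered + combined_area_length_m / 2
        points.append(center)
        covered += combined_area_length_m
    return points

# A 100 m strip covered by 40 m combined shooting areas needs three points.
print(plan_positioning_points(100.0, 40.0))  # -> [20.0, 60.0, 100.0]
```

The real unit performs the same "select, cover, check, repeat" cycle over a 2-D region, terminating once the union of combined shooting areas covers the surveying and mapping area.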
• the photographing position point determination unit is set to: map the central shooting point in the combined shooting point set to the area midpoint of the surveying and mapping combined shooting area, and use the area midpoint as one photographing position point; and, according to the preset relative position relationship between each surrounding shooting point and the central shooting point in the combined shooting point set, map each surrounding shooting point into the surveying and mapping combined shooting area, and use the resulting plurality of mapped points as the photographing position points.
• the device further includes: a screen selection area acquisition module, configured to detect a user's touch operation in the human-machine interaction interface and acquire a screen selection area matching the touch operation; and a surveying area information acquisition module, set to acquire, from the map data currently displayed in the human-machine interaction interface, a geographic position area matching the screen selection area as the surveying and mapping area information.
• the screen selection area acquisition module is configured to: if it is detected that the user's touch operation is a single-point touch operation, determine the enclosed area bounded by the connecting lines of at least three of the user's touch points as the screen selection area; and/or, if it is detected that the user's touch operation is a frame-drawing touch operation, use the frame generated by the user's touch as the screen selection area.
• the device further includes: a shooting parameter acquisition module, configured to acquire the shooting parameters of the photographing device carried by the surveying and mapping drone, where the shooting parameters include a single-photo shooting area of the surveying and mapping drone at a set flight height and each shooting point corresponds to one single-photo shooting area; and a relative position relationship determination module, set to determine the preset relative position relationship between the shooting points in the combined shooting point set according to a preset photo overlap index and the single-photo shooting area. The surveying and mapping parameters also include the flight height, which is set to instruct the surveying and mapping drone to perform flight shooting in the surveying and mapping area at the flight height.
  • the relative position relationship determination module is configured to: determine the single-photo size according to the frame size and the pixel width of the photographing device; construct a two-dimensional coordinate system and select a target point in it as the center shooting point; generate a center photo in the coordinate system according to the center shooting point and the single-photo size; generate, at the upper left, lower left, upper right, and lower right corners of the center photo, four surrounding photos that each satisfy the photo overlap index with the center photo; determine, according to the mapping relationship between the single-photo size and the single-photo shooting area, the coordinate values of the surrounding shooting point corresponding to each surrounding photo in the coordinate system; and determine the preset relative position relationship between the shooting points in the combined shooting point set according to the coordinate values of the center shooting point and each surrounding shooting point in the coordinate system.
  • the apparatus further includes: a flying height calculation module, configured to calculate the set flying height according to the pixel width of the camera device, the lens focal length of the camera device, and the ground pixel resolution.
  • the shooting parameter acquisition module is configured to: calculate the single-photo shooting area of the surveying and mapping drone at the set flying height according to the pixel width of the photographing device, the frame area of the photographing device, and the ground pixel resolution.
  • the control terminal side surveying and mapping device described above can execute the control-terminal-side surveying and mapping method provided by any embodiment of the present disclosure, and has the corresponding function modules and beneficial effects for executing the method.
  • the device includes: a surveying and mapping parameter receiving module 710, a surveying photo collection shooting module 720, and a surveying map generation module 730, wherein:
  • the surveying parameter receiving module 710 is configured to receive the surveying parameters sent by the control terminal, wherein the parameters are determined by the control terminal according to the surveying area and include: a plurality of surveying sampling points at which the surveying drone surveys in the area;
  • the surveying photo collection shooting module 720 is configured to perform flight shooting in the surveying area according to the surveying parameters to obtain a set of surveying photos corresponding to the plurality of surveying sampling points;
  • the surveying map generation module 730 is configured to combine and/or stitch the multiple photos in the surveying photo set to obtain a surveying map corresponding to the surveying area.
  • by receiving the plurality of surveying sampling points in the surveying area sent by the control terminal, performing flight shooting in the area according to the sampling points to obtain a set of surveying photos corresponding to them, and combining and/or stitching the multiple photos in the set, a surveying map corresponding to the surveying area is obtained.
  • a new method for acquiring a survey map is proposed.
  • the overall planning method based on multiple surveying sampling points replaces the existing parallel-line traversal planning method, which solves the problems of high cost and low surveying efficiency in existing UAV aerial survey methods and achieves the technical effect of reducing surveying cost and improving surveying efficiency.
  • the surveying photo collection shooting module 720 is configured to: upon determining, according to the geographic location information of each surveying sampling point, that the drone has flown to each sampling point, take the surveying photo corresponding to each sampling point, the photos constituting the set of surveying photos.
  • the device further includes: an instruction movement module configured to receive at least one flight control instruction sent by the control terminal and move in the air in a set direction and/or by a set distance according to the instruction; an instruction hovering module configured to hover at the current location according to a hover instruction sent by the control terminal; and a geographic position coordinate feedback module configured to feed back the geographic coordinates of the current location to the control terminal according to location query information sent by the terminal, wherein the geographic coordinates are used by the control terminal to determine a reference photographing location point.
  • the surveying parameters further include: a flying height; the surveying photo collection shooting module 720 is configured to perform flight shooting in the surveying area at the flying height according to the surveying parameters, to obtain the set of surveying photos corresponding to the plurality of surveying sampling points.
  • the surveying map generation module 730 is configured to: acquire, from the set of surveying photos, the center surveying photo taken at at least one center shooting point and the surrounding surveying photos taken at the multiple surrounding shooting points associated with each center shooting point; stitch each center surveying photo with its corresponding surrounding surveying photos into a combined photo according to the photo overlap degree between each surrounding photo and the corresponding center photo; and obtain a surveying map corresponding to the surveying area according to the combined photo corresponding to each center shooting point.
  • the surveying map of the surveying area includes at least one of the following: a digital surface model of the surveying area, a three-dimensional map of the surveying area, and a flat map of the surveying area.
  • the device further includes a surveying and mapping map sending module, configured to: send a surveying and mapping map corresponding to the surveying and mapping area to the control terminal and / or ground terminal.
  • the above surveying and mapping UAV side surveying and mapping device can execute the surveying and mapping UAV side surveying and mapping method provided by any embodiment of the present disclosure, and has corresponding function modules and beneficial effects of the execution method.
  • FIG. 8 is a schematic structural diagram of a control terminal according to Embodiment 8 of the present disclosure.
  • FIG. 8 shows a block diagram of a control terminal 612 suitable for implementing embodiments of the present disclosure.
  • the control terminal 612 shown in FIG. 8 is only an example, and should not bring any limitation to the functions and use scope of the embodiments of the present disclosure.
  • the control terminal 612 takes the form of a general-purpose computing device.
  • the components of the control terminal 612 may include, but are not limited to, one or more processors 616, a storage device 628, and a bus 618 connecting different system components (including the storage device 628 and the processor 616).
  • the bus 618 represents one or more of several types of bus structures, including a memory bus or a memory controller, a peripheral bus, a graphics acceleration port, a processor, or a local bus using any of a variety of bus structures.
  • these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
  • the control terminal 612 typically includes a variety of computer system readable media. These media may be any available media that can be accessed by the control terminal 612, including volatile and non-volatile media, removable and non-removable media.
  • the storage device 628 may include a computer system readable medium in the form of volatile memory, such as random access memory (Random Access Memory, RAM) 630 and / or cache memory 632.
  • the control terminal 612 may further include other removable / non-removable, volatile / nonvolatile computer system storage media.
  • the storage system 634 may be configured to read and write non-removable, non-volatile magnetic media (not shown in FIG. 8 and is generally referred to as a "hard disk drive").
  • each drive may be connected to the bus 618 through one or more data media interfaces.
  • the storage device 628 may include at least one program product having a set of (eg, at least one) program modules configured to perform the functions of each embodiment of the present disclosure.
  • a program 636 having a set of (at least one) program modules 626 may be stored, for example, in the storage device 628. Such program modules 626 include but are not limited to an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
  • the program module 626 generally performs the functions and / or methods in the embodiments described in the present disclosure.
  • the control terminal 612 may also communicate with one or more external devices 614 (e.g., keyboard, pointing device, camera, display 624, etc.), with one or more devices that enable users to interact with the control terminal 612, and/or with any device (such as a network card or modem) that enables the control terminal 612 to communicate with one or more other computing devices. Such communication can be performed through an input/output (I/O) interface 622.
  • the control terminal 612 may also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through the network adapter 620.
  • the network adapter 620 communicates with the other modules of the control terminal 612 through the bus 618. It should be understood that although not shown in the figure, other hardware and/or software modules may be used in conjunction with the control terminal 612, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Arrays of Independent Disks (RAID) systems, tape drives, data backup storage systems, etc.
  • the processor 616 runs a program stored in the storage device 628 to execute various functional applications and data processing, for example, to implement the control terminal side surveying and mapping method provided by the above-described embodiments of the present disclosure.
  • when the processing unit executes the program, it implements: determining surveying and mapping parameters that match the surveying area, wherein the parameters include: a plurality of surveying sampling points at which the surveying drone surveys in the area; and sending the surveying parameters to the surveying drone.
  • the ninth embodiment is a surveying and mapping drone for performing the drone-side surveying and mapping method provided by any embodiment of the present disclosure.
  • the surveying drone includes: one or more processors; and a storage device configured to store one or more programs; when the one or more programs are executed by the one or more processors, the one or more processors implement the drone-side surveying and mapping method provided by any embodiment of the present disclosure:
  • receiving surveying parameters sent by a control terminal, wherein the parameters are determined by the control terminal according to the surveying area and include: a plurality of surveying sampling points at which the surveying drone surveys in the area; performing flight shooting in the surveying area according to the parameters to obtain a set of surveying photos corresponding to the plurality of sampling points; and combining and/or stitching the multiple photos in the set to obtain a surveying map corresponding to the area.
  • Embodiment 10 of the present disclosure also provides a computer storage medium storing a computer program which, when executed by a computer processor, performs any of the control-terminal-side surveying and mapping methods described in the above embodiments of the present disclosure: determining surveying parameters matching the surveying area, wherein the parameters include: a plurality of surveying sampling points at which the surveying drone surveys in the area; and sending the surveying parameters to the surveying drone.
  • when the computer program is executed by a computer processor, it may also perform the drone-side surveying and mapping method according to any of the above embodiments of the present disclosure: receiving surveying parameters sent by a control terminal, wherein the parameters are determined by the control terminal according to the surveying area and include: a plurality of surveying sampling points at which the surveying drone surveys in the area; performing flight shooting in the surveying area according to the parameters to obtain a set of surveying photos corresponding to the plurality of sampling points; and combining and/or stitching the multiple photos in the set to obtain a surveying map corresponding to the area.
  • the computer storage media of the embodiments of the present disclosure may adopt any combination of one or more computer-readable media.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or device, or any combination of the above.
  • Examples of computer-readable storage media include: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • the computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • the computer-readable signal medium may include a data signal that is propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and the computer-readable medium may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
  • the program code contained on the computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire, optical cable, radio frequency (Radio Frequency, RF), etc., or any suitable combination of the foregoing.
  • the computer program code for performing the operations of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may be executed entirely on the user's computer, partly on the user's computer, as an independent software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through an Internet connection provided by an Internet service provider).
  • the embodiment of the present disclosure proposes a new surveying and mapping system and a surveying method.
  • the overall planning method of multiple surveying sampling points based on the new surveying and mapping system replaces the existing parallel-line traversal planning method, which solves the problems of high cost and low surveying efficiency in existing UAV aerial survey methods and achieves the technical effect of reducing surveying cost and improving surveying efficiency.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Mathematical Physics (AREA)
  • Computer Graphics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Studio Devices (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A surveying and mapping system, including a control terminal and a surveying and mapping drone, wherein: the control terminal is configured to determine surveying and mapping parameters matching a surveying area and send the parameters to the surveying drone, the parameters including a plurality of surveying sampling points at which the drone surveys in the area; the surveying drone is configured to receive the parameters, perform flight shooting in the surveying area according to them to obtain a set of surveying photos corresponding to the plurality of sampling points, and perform at least one of photo combination and photo stitching on the multiple photos in the set to obtain a surveying map corresponding to the surveying area.

Description

A surveying and mapping system, surveying and mapping method, apparatus, device, and medium. Technical Field
Embodiments of the present disclosure relate to the technical field of surveying and mapping, and for example to a surveying and mapping system, a surveying and mapping method, an apparatus, a device, and a medium.
Background
In recent years, unmanned aerial vehicles (UAVs) have been widely used in surveying and mapping, emergency response, disaster relief, and other fields because of their efficiency, flexibility, and low cost. UAV aerial surveying (aerial survey for short) can greatly reduce the work cycle and the manpower and financial investment of traditional aerial surveying technology, and is of practical significance in fields such as surveying and mapping.
UAV aerial surveying observes the current state of the photographed area through an on-board video capture device and remote image transmission, and stitches the captured photos with aerial image stitching technology to obtain an overall image of the photographed area. When taking photos, traditional UAV aerial survey methods generally traverse the surveying area along parallel lines, and to ensure successful stitching, a certain degree of overlap is usually required between every two consecutive photos. To guarantee subsequent stitching, each photo must overlap other photos both horizontally and vertically, and the overlap degree is generally required to exceed 50%.
In the course of implementing the present disclosure, the inventors found the following defects in the related art: traditional UAV aerial survey methods are designed for photographing large areas of land, and many photos with a high degree of overlap are taken during surveying. Stitching these photos is time-consuming and inefficient. Moreover, if the photos acquired by the UAV are uploaded to a server for stitching, the data upload and processing take even longer. Meanwhile, when traditional UAV aerial survey methods are applied to surveying small plots, the operation is complicated, the processing time is long, and the hardware cost is high.
Summary
Embodiments of the present disclosure provide a surveying and mapping system, method, apparatus, device, and medium, so as to reduce surveying cost and improve surveying efficiency.
An embodiment of the present disclosure provides a surveying and mapping system, including a control terminal and a surveying and mapping drone, wherein:
the control terminal is configured to determine surveying parameters matching a surveying area and send the parameters to the surveying drone, the parameters including: a plurality of surveying sampling points at which the drone surveys in the surveying area;
the surveying drone is configured to receive the surveying parameters, perform flight shooting in the surveying area according to the parameters to obtain a set of surveying photos corresponding to the plurality of sampling points, and combine and/or stitch the multiple photos in the set to obtain a surveying map corresponding to the area.
Optionally, the surveying drone is further configured to generate map tile data corresponding to the surveying area according to the surveying map of the area.
Optionally, the system further includes: a work drone;
the control terminal is further configured to: take the surveying area as a work area, acquire map tile data corresponding to the work area from the surveying drone, generate and display a regional map of the work area from the tile data, determine at least one work plot within the work area according to at least one area locating point selected by the user on the regional map, and generate a work route corresponding to the work plot and send it to the work drone;
the work drone is configured to receive the work route and perform flight operations in the at least one work plot according to the route.
An embodiment of the present disclosure further provides a control-terminal-side surveying and mapping method, applied in the surveying and mapping system described in the embodiments of the present disclosure, including:
determining surveying parameters matching a surveying area, wherein the parameters include: a plurality of surveying sampling points at which the surveying drone surveys in the area;
sending the surveying parameters to the surveying drone.
Optionally, determining the surveying parameters matching the surveying area includes:
acquiring a reference photographing location point corresponding to the surveying area, and establishing a mapping relationship between one shooting point in a combined shooting point set and the reference photographing location point;
determining a plurality of auxiliary photographing location points corresponding to the reference photographing location point according to the preset relative position relationship between the shooting points in the combined shooting point set and the mapping relationship;
taking the reference photographing location point and the plurality of auxiliary photographing location points as the plurality of surveying sampling points at which the surveying drone surveys in the area.
Optionally, determining the surveying parameters matching the surveying area includes:
determining one or more combined surveying shooting areas within the surveying area according to the combined shooting area corresponding to the combined shooting point set and the surveying area information;
determining a plurality of photographing location points in the combined surveying shooting area according to the preset relative position relationship between the shooting points in the combined shooting point set;
taking the plurality of photographing location points as the plurality of surveying sampling points at which the surveying drone surveys in the area.
Optionally, the photos taken at the plurality of shooting points in the combined shooting point set have overlapping areas between them, and/or
the plurality of combined surveying shooting areas determined within the surveying area have overlapping areas between them;
wherein the combined surveying shooting area is the shooting area formed by combining and/or stitching the multiple photos taken at the plurality of shooting points in the combined shooting point set; combining and/or stitching each combined surveying shooting area forms the surveying map of the surveying area.
Optionally, the shooting points in the combined shooting point set include: a center shooting point and four surrounding shooting points, the surrounding shooting points being the four vertices of a rectangle centered on the center shooting point;
wherein the composite photo obtained from the photos taken at each shooting point in the combined shooting point set is rectangular in shape.
Optionally, acquiring the reference photographing location point corresponding to the surveying area includes:
detecting a user's touch operation in a human-computer interaction interface and determining a screen location point according to the touch operation;
acquiring, from the map data of the surveying area currently displayed in the human-computer interaction interface, a geographic coordinate matching the screen location point as the reference location point.
Optionally, detecting the user's touch operation in the human-computer interaction interface and determining a screen location point according to the touch operation includes at least one of the following:
if it is detected that the user's touch operation is a single-point touch operation, determining the user's touch point as the screen location point;
if it is detected that the user's touch operation is a line-drawing touch operation, selecting a point on the line segment generated by the user's touch as the screen location point; and
if it is detected that the user's touch operation is a frame-drawing touch operation, selecting a point inside the frame generated by the user's touch as the screen location point.
Optionally, acquiring the reference photographing location point corresponding to the surveying area includes:
acquiring the center point of the surveying area as the reference photographing location point.
Optionally, acquiring the reference photographing location point corresponding to the surveying area includes:
sending location query information to the surveying drone and taking the geographic coordinates fed back by the drone as the reference photographing location point;
wherein the surveying drone is preset at a location matching the surveying area.
Optionally, before sending the location query information to the surveying drone, the method further includes:
receiving at least one flight control instruction for the surveying drone input by the user, and sending the flight control instruction to the drone;
upon receiving a location confirmation response input by the user, sending a hover instruction to the drone to control it to hover at its current location;
wherein the flight control instruction is configured to control the drone to move in the air in a set direction and/or by a set distance.
Optionally, acquiring the reference photographing location point corresponding to the surveying area includes:
acquiring geographic coordinates input by the user as the reference photographing location point.
Optionally, establishing the mapping relationship between one shooting point in the combined shooting point set and the reference photographing location point includes:
establishing a mapping relationship between a shooting point selected by the user in the combined shooting point set and the reference photographing location point.
Optionally, establishing the mapping relationship between one shooting point in the combined shooting point set and the reference photographing location point includes:
establishing a mapping relationship between the center shooting point in the combined shooting point set and the reference photographing location point.
Optionally, establishing the mapping relationship between one shooting point in the combined shooting point set and the reference photographing location point includes:
calculating the distance between the reference photographing location point and each locating key point of the surveying area, the locating key points including: the corner points of the surveying area and the center point of the surveying area;
taking the locating key point closest to the reference photographing location point as a target reference point;
selecting, according to the position information of the target reference point within the surveying area, a shooting point in the combined shooting point set that matches the position information and establishing a mapping relationship between it and the reference photographing location point.
Optionally, determining one or more combined surveying shooting areas within the surveying area according to the combined shooting area corresponding to the combined shooting point set and the surveying area information includes:
selecting a locating point within the surveying area;
determining one combined surveying shooting area within the surveying area according to the locating point and the combined shooting area;
if the combined surveying shooting area cannot completely cover the surveying area, selecting a new locating point within the surveying area and returning to the operation of determining a combined surveying shooting area according to the locating point and the combined shooting area, until all the combined surveying shooting areas that together completely cover the surveying area have been determined.
Optionally, determining a plurality of photographing location points in the combined surveying shooting area according to the preset relative position relationship between the shooting points in the combined shooting point set includes:
mapping the center shooting point in the combined shooting point set to the midpoint of the combined surveying shooting area, and taking the midpoint as one photographing location point;
mapping each surrounding shooting point into the combined surveying shooting area according to the preset relative position relationship between each surrounding shooting point and the center shooting point in the set, and taking the resulting mapped points as photographing location points.
Optionally, before determining one or more combined surveying shooting areas within the surveying area according to the combined shooting area corresponding to the combined shooting point set and the surveying area information, the method further includes:
detecting the user's touch operation in the human-computer interaction interface and acquiring a screen selection area matching the touch operation;
acquiring, from the map data currently displayed in the human-computer interaction interface, a geographic area matching the screen selection area as the surveying area information.
Optionally, detecting the user's touch operation in the human-computer interaction interface and acquiring a screen selection area matching the touch operation includes:
if it is detected that the user's touch operation is a single-point touch operation, determining the closed area enclosed by the connecting lines of at least three of the user's touch points as the screen selection area; and/or
if it is detected that the user's touch operation is a frame-drawing touch operation, taking the frame generated by the user's touch as the screen selection area.
Optionally, before sending the surveying parameters to the surveying drone, the method further includes:
acquiring shooting parameters of the photographing device carried by the surveying drone, the shooting parameters including the single-photo shooting area of the drone at a set flying height, each shooting point corresponding to one single-photo shooting area;
determining the preset relative position relationship between the shooting points in the combined shooting point set according to a preset photo overlap index and the single-photo shooting area;
the surveying parameters further include: the flying height, which is configured to instruct the surveying drone to perform flight shooting in the surveying area at the flying height.
Optionally, determining the preset relative position relationship between the shooting points in the combined shooting point set according to the preset photo overlap index and the single-photo shooting area includes:
determining the single-photo size according to the frame size of the photographing device and the pixel width of the photographing device;
constructing a two-dimensional coordinate system and selecting a target point in the coordinate system as the center shooting point;
generating a center photo in the two-dimensional coordinate system according to the center shooting point and the single-photo size;
generating, at the upper left corner, lower left corner, upper right corner, and lower right corner of the center photo, four surrounding photos that each satisfy the photo overlap index with the center photo;
determining the coordinate values of the surrounding shooting point corresponding to each surrounding photo in the two-dimensional coordinate system according to the mapping relationship between the single-photo size and the single-photo shooting area;
determining the preset relative position relationship between the shooting points in the combined shooting point set according to the coordinate values of the center shooting point and each surrounding shooting point in the two-dimensional coordinate system.
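The corner-offset geometry described in the steps above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and parameters (ground footprint of a single photo and an overlap fraction per axis) are assumptions made for the example. Two photos of ground width W whose centers are offset by d overlap by a fraction 1 - d/W along that axis, so an overlap fraction r implies an offset of (1 - r)·W:

```python
def combined_shooting_points(ground_width, ground_height, overlap):
    """Relative positions (in metres) of the five shooting points: the
    center point plus four surrounding points at the corners of a
    rectangle, each surrounding photo keeping the given overlap
    fraction with the center photo along both axes."""
    dx = (1.0 - overlap) * ground_width   # horizontal offset of a corner photo
    dy = (1.0 - overlap) * ground_height  # vertical offset of a corner photo
    center = (0.0, 0.0)
    # upper-left, upper-right, lower-left, lower-right surrounding points
    surrounding = [(-dx, dy), (dx, dy), (-dx, -dy), (dx, -dy)]
    return center, surrounding

# Example: a 200 m x 150 m single-photo footprint with a 50% overlap index
center, around = combined_shooting_points(200.0, 150.0, 0.5)
```

With a 50% overlap the four surrounding points sit 100 m horizontally and 75 m vertically from the center point, so the five photos tile a single rectangular composite.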
Optionally, before acquiring the shooting parameters of the photographing device carried by the surveying drone, the method further includes:
calculating the set flying height according to the pixel width of the photographing device, the lens focal length of the photographing device, and the ground pixel resolution.
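The standard photogrammetric similar-triangles relation illustrates this calculation; the names below are assumptions for the sketch, with pixel width and focal length in metres and the ground pixel resolution (ground sample distance) in metres per pixel:

```python
def set_flying_height(pixel_width_m, focal_length_m, ground_resolution_m):
    """Flying height at which one sensor pixel covers ground_resolution_m
    of ground: H = ground_resolution * focal_length / pixel_width."""
    return ground_resolution_m * focal_length_m / pixel_width_m

# Example: 2.4 um pixels, 24 mm lens, 5 cm/px target resolution -> 500 m
height = set_flying_height(2.4e-6, 0.024, 0.05)
```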
Optionally, acquiring the shooting parameters of the photographing device carried by the surveying drone includes:
calculating the single-photo shooting area of the surveying drone at the set flying height according to the pixel width of the photographing device, the frame area of the photographing device, and the ground pixel resolution.
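As an illustration, with the frame given in pixels and the ground pixel resolution implied by the set flying height, the ground footprint of one photo follows directly; the names are assumptions for the sketch:

```python
def single_photo_shooting_area(frame_width_px, frame_height_px, ground_resolution_m):
    """Ground footprint (width, height, area) of one photo: at the set
    flying height each sensor pixel covers ground_resolution_m of ground."""
    ground_w = frame_width_px * ground_resolution_m
    ground_h = frame_height_px * ground_resolution_m
    return ground_w, ground_h, ground_w * ground_h

# Example: a 4000 x 3000 px frame at 5 cm/px -> a 200 m x 150 m footprint
w, h, area = single_photo_shooting_area(4000, 3000, 0.05)
```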
An embodiment of the present disclosure further provides a drone-side surveying and mapping method, applied in the surveying and mapping system described in the embodiments of the present disclosure, including:
receiving surveying parameters sent by the control terminal, wherein the parameters are determined by the control terminal according to the surveying area and include: a plurality of surveying sampling points at which the surveying drone surveys in the area;
performing flight shooting in the surveying area according to the surveying parameters to obtain a set of surveying photos corresponding to the plurality of sampling points;
combining and/or stitching the multiple photos in the set of surveying photos to obtain a surveying map corresponding to the surveying area.
Optionally, performing flight shooting in the surveying area according to the surveying parameters to obtain the set of surveying photos corresponding to the plurality of sampling points includes:
upon determining, according to the geographic location information of each surveying sampling point, that the drone has flown to each sampling point, taking the surveying photo corresponding to each sampling point, the photos constituting the set of surveying photos.
Optionally, before receiving the surveying parameters sent by the control terminal, the method further includes:
receiving at least one flight control instruction sent by the control terminal, and moving in the air in a set direction and/or by a set distance according to the instruction;
hovering at the current location according to a hover instruction sent by the control terminal;
feeding back the geographic coordinates of the current location to the control terminal according to location query information sent by the terminal, wherein the geographic coordinates are used by the control terminal to determine the reference photographing location point.
Optionally, the surveying parameters further include: a flying height;
performing flight shooting in the surveying area according to the surveying parameters to obtain the set of surveying photos corresponding to the plurality of sampling points includes:
performing flight shooting in the surveying area at the flying height according to the surveying parameters to obtain the set of surveying photos corresponding to the plurality of sampling points.
Optionally, combining and/or stitching the multiple photos in the set of surveying photos to obtain a surveying map corresponding to the surveying area includes:
acquiring, from the set of surveying photos, the center surveying photo taken at at least one center shooting point and the surrounding surveying photos taken at the multiple surrounding shooting points associated with each center shooting point;
stitching each center surveying photo with its corresponding surrounding surveying photos into a combined photo according to the photo overlap degree between each surrounding photo and the corresponding center photo;
obtaining the surveying map corresponding to the surveying area according to the combined photo corresponding to each center shooting point.
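A toy sketch of this composition step, assuming the relative offsets are known exactly from the overlap index. Real stitching would match image features; this merely pastes each photo (represented as a list of rows) at its nominal position, with the center photo last so it overwrites the overlap zones:

```python
def compose_combined_photo(center, surrounding, overlap):
    """Paste the center photo and its four corner photos (upper-left,
    upper-right, lower-left, lower-right) onto one canvas, using the
    nominal offsets implied by the overlap fraction. All photos are
    lists of rows and share the same (h, w) shape."""
    h, w = len(center), len(center[0])
    dy, dx = round((1 - overlap) * h), round((1 - overlap) * w)
    canvas = [[0] * (w + 2 * dx) for _ in range(h + 2 * dy)]
    # corner photos first, center photo last so it covers the overlaps
    offsets = [(0, 0), (0, 2 * dx), (2 * dy, 0), (2 * dy, 2 * dx), (dy, dx)]
    for img, (oy, ox) in zip(list(surrounding) + [center], offsets):
        for r in range(h):
            canvas[oy + r][ox:ox + w] = img[r]
    return canvas

# Example: 4x4 photos with 50% overlap -> one 8x8 combined photo
tile = lambda v: [[v] * 4 for _ in range(4)]
mosaic = compose_combined_photo(tile(9), [tile(1), tile(2), tile(3), tile(4)], 0.5)
```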
Optionally, the surveying map of the surveying area includes at least one of the following:
a digital surface model of the surveying area, a three-dimensional map of the surveying area, and a planar map of the surveying area.
Optionally, after combining and/or stitching the multiple photos in the set of surveying photos to obtain a surveying map corresponding to the surveying area, the method further includes:
sending the surveying map corresponding to the surveying area to the control terminal and/or a ground terminal.
An embodiment of the present disclosure further provides a control-terminal-side surveying and mapping apparatus, applied in the surveying and mapping system described in the embodiments of the present disclosure, including:
a surveying parameter determination module configured to determine surveying parameters matching the surveying area, wherein the parameters include: a plurality of surveying sampling points at which the surveying drone surveys in the area;
a surveying parameter sending module configured to send the surveying parameters to the surveying drone.
An embodiment of the present disclosure further provides a drone-side surveying and mapping apparatus, applied in the surveying and mapping system described in the embodiments of the present disclosure, including:
a surveying parameter receiving module configured to receive the surveying parameters sent by the control terminal, wherein the parameters are determined by the control terminal according to the surveying area and include: a plurality of surveying sampling points at which the surveying drone surveys in the area;
a surveying photo collection shooting module configured to perform flight shooting in the surveying area according to the surveying parameters to obtain a set of surveying photos corresponding to the plurality of sampling points;
a surveying map generation module configured to combine and/or stitch the multiple photos in the set of surveying photos to obtain a surveying map corresponding to the surveying area.
An embodiment of the present disclosure further provides a control terminal, including:
one or more processors;
a storage device configured to store one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement any of the control-terminal-side surveying and mapping methods described in the embodiments of the present disclosure.
An embodiment of the present disclosure further provides a computer storage medium on which a computer program is stored, which, when executed by a processor, implements the control-terminal-side surveying and mapping method provided by the embodiments of the present disclosure.
An embodiment of the present disclosure further provides a surveying and mapping drone, including:
one or more processors;
a storage device configured to store one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement any of the drone-side surveying and mapping methods described in the embodiments of the present disclosure.
An embodiment of the present disclosure further provides a computer storage medium on which a computer program is stored, which, when executed by a processor, implements the drone-side surveying and mapping method provided by the embodiments of the present disclosure.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a surveying and mapping system provided in Embodiment 1 of the present disclosure;
FIG. 2 is a flowchart of a control-terminal-side surveying and mapping method provided in Embodiment 2 of the present disclosure;
FIG. 3a is a flowchart of a control-terminal-side surveying and mapping method provided in Embodiment 3 of the present disclosure;
FIG. 3b is a schematic diagram of the position distribution of each shooting point in a combined shooting point set provided in Embodiment 3 of the present disclosure;
FIG. 4a is a flowchart of a control-terminal-side surveying and mapping method provided in Embodiment 4 of the present disclosure;
FIG. 4b is a schematic diagram of the distribution of each photographing location point provided in Embodiment 2 of the present disclosure;
FIG. 5 is a flowchart of a drone-side surveying and mapping method provided in Embodiment 5 of the present disclosure;
FIG. 6 is a schematic diagram of a control-terminal-side surveying and mapping apparatus provided in Embodiment 6 of the present disclosure;
FIG. 7 is a schematic diagram of a drone-side surveying and mapping apparatus provided in Embodiment 7 of the present disclosure;
FIG. 8 is a schematic structural diagram of a control terminal provided in Embodiment 8 of the present disclosure.
Detailed Description
The present disclosure is further described in detail below with reference to the drawings and embodiments. It should be understood that the embodiments described here are only intended to explain the present disclosure, not to limit it.
For ease of description, only the parts related to the present disclosure, rather than the full content, are shown in the drawings. Before discussing the exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted in flowcharts. Although the flowcharts describe the operations (or steps) as sequential processing, many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations can be rearranged. The processing may be terminated when its operations are completed, but may also have additional steps not included in the drawings. The processing may correspond to a method, function, procedure, subroutine, subprogram, and so on.
Embodiment 1
FIG. 1 is a schematic diagram of a surveying and mapping system provided in Embodiment 1 of the present disclosure. As shown in FIG. 1, the structure of the surveying and mapping system includes: a control terminal 10 and a surveying and mapping drone 20, wherein:
the control terminal 10 is configured to determine surveying parameters matching the surveying area and send them to the surveying drone 20, the parameters including: a plurality of surveying sampling points at which the drone 20 surveys in the area; the surveying drone 20 is configured to receive the surveying parameters, perform flight shooting in the surveying area according to them to obtain a set of surveying photos corresponding to the plurality of sampling points, and combine and/or stitch the multiple photos in the set to obtain a surveying map corresponding to the area.
The control terminal 10 may be any device that controls the surveying drone, such as a drone remote controller; the embodiments of the present disclosure do not limit the device type of the control terminal. The surveying drone 20 may be a drone configured to survey the surveying area to acquire data related to the area, for example multiple surveying photos of the area. The surveying drone 20 is equipped with a photographing device configured to acquire the multiple surveying photos corresponding to the area.
In the embodiment of the present disclosure, as shown in FIG. 1, the surveying and mapping system consists of the control terminal 10 and the surveying drone 20. The control terminal 10 is responsible for determining the plurality of surveying sampling points at which the drone surveys in the surveying area, and for sending the surveying parameters formed by the sampling points to the drone 20. The drone 20 can receive the surveying parameters determined by the control terminal, perform flight shooting in the surveying area according to them, and obtain a set of surveying photos corresponding to the plurality of sampling points included in the parameters. Meanwhile, the drone can also combine and/or stitch the multiple photos in the set to obtain a surveying map corresponding to the area. In order for the photos in the set to be combined and/or stitched into a complete image, the photos corresponding to the plurality of sampling points have a certain degree of overlap, but it is not required that every two consecutive photos overlap to a certain degree; therefore, the surveying system provided by the embodiments of the present disclosure can greatly reduce the time spent on image data processing and thus improve surveying efficiency.
Optionally, the surveying drone 20 is further configured to generate map tile data corresponding to the surveying area according to the surveying map of the area.
The map tile data is the data used to generate a tile map, formed by slicing the map data with a slicing algorithm.
In the embodiment of the present disclosure, besides combining and/or stitching the multiple photos in the surveying photo set to obtain a surveying map corresponding to the area, the drone 20 can also use techniques such as slicing algorithms to generate the map tile data corresponding to the area from the resulting map. The map tile data can be used to generate a tile map; the pyramid model formed by a tile map is a multi-resolution hierarchical model, in which the resolution decreases from the bottom to the top of the tile pyramid while the represented geographic range remains unchanged. The map tile data generated by the drone 20 can be used to locate places in the surveying area.
Optionally, the surveying system may further include a work drone; the control terminal 10 is further configured to: take the surveying area as a work area, acquire the map tile data corresponding to the work area from the surveying drone 20, generate and display a regional map of the work area from the tile data, determine at least one work plot within the work area according to at least one area locating point selected by the user on the regional map, and generate a work route corresponding to the work plot and send it to the work drone; the work drone is configured to receive the work route and perform flight operations in the at least one work plot according to the route.
The work drone may be a drone configured to operate over the surveying area according to the job requirements, for example inspecting the condition of crops, soil, vegetation, or water quality in the area, or spraying pesticides over the area.
In the embodiment of the present disclosure, the control terminal 10 may also take the surveying area as a work area and acquire the corresponding map tile data from the surveying drone 20. Since the map tile data includes map data at multiple different resolutions, the control terminal 10 can generate and display a regional map of the work area from the tile data according to the resolution required by the work drone. When operating the control terminal 10, the user can select at least one area locating point on the regional map. The area locating point can be used to determine at least one work plot within the work area; for example, a 10 m x 10 m square work plot may be generated centered on the locating point. Correspondingly, after determining the work plot, the control terminal 10 can generate a work route corresponding to the plot and send it to the work drone. For example, in a 10 m x 10 m square plot, taking the top-left vertex as the starting point, the drone travels 1 m every 5 seconds clockwise along the sides of the plot. Different work plots can yield different work routes, which the embodiments of the present disclosure do not limit. After receiving the work route, the work drone can perform flight operations in the determined work plot according to the route.
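A minimal sketch of the example above: a 10 m x 10 m square plot centered on the user's locating point, traversed clockwise from the top-left vertex at 1 m per tick. The function names and the flat x/y coordinate frame are assumptions made for illustration:

```python
def square_work_plot(cx, cy, side=10.0):
    """Corners of a square work plot centered on the locating point
    (cx, cy), ordered clockwise from the top-left vertex."""
    h = side / 2.0
    return [(cx - h, cy + h), (cx + h, cy + h), (cx + h, cy - h), (cx - h, cy - h)]

def perimeter_route(corners, step=1.0):
    """Waypoints every `step` metres clockwise along the plot's
    axis-aligned sides, starting from the first corner (one step per
    tick, e.g. 1 m every 5 s in the example)."""
    route = []
    n = len(corners)
    for i in range(n):
        (x0, y0), (x1, y1) = corners[i], corners[(i + 1) % n]
        length = abs(x1 - x0) + abs(y1 - y0)  # sides are axis-aligned
        for k in range(int(length / step)):
            t = k * step / length
            route.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
    return route

# Example: plot centered on the locating point (0, 0)
corners = square_work_plot(0.0, 0.0)  # top-left vertex is (-5.0, 5.0)
route = perimeter_route(corners)      # 40 waypoints spaced 1 m apart
```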
The working principle of the surveying system in the embodiment of the present disclosure is: the control terminal determines the plurality of surveying sampling points in the surveying area and sends them to the surveying drone. The drone can perform flight shooting in the area according to the determined sampling points to obtain a set of surveying photos corresponding to them, then combine and/or stitch the multiple photos in the set, and finally obtain a complete surveying map corresponding to the surveying area.
In the embodiment of the present disclosure, a new surveying and mapping system is formed by the control terminal and the surveying drone, wherein the control terminal is configured to determine surveying parameters matching the surveying area and send them to the drone, and the drone is configured to receive the parameters, perform flight shooting in the area according to them to obtain a set of surveying photos corresponding to the plurality of sampling points, and combine and/or stitch the multiple photos in the set to obtain a surveying map corresponding to the area. A new surveying system and method are thus proposed, in which an overall planning method based on multiple surveying sampling points replaces the existing parallel-line traversal planning method, solving the problems of high cost and low surveying efficiency in existing UAV aerial survey methods and achieving the technical effect of reducing surveying cost and improving surveying efficiency.
Embodiment 2
FIG. 2 is a flowchart of a control-terminal-side surveying and mapping method provided in Embodiment 2 of the present disclosure. This embodiment is applicable to determining a plurality of surveying sampling points in a surveying area. The method may be executed by a control-terminal-side surveying apparatus, which may be implemented in software and/or hardware and may generally be integrated in a control device (for example, a drone remote controller) used in conjunction with the surveying drone responsible for aerial photography. Correspondingly, as shown in FIG. 2, the method includes the following operations:
Step 210: determine surveying parameters matching the surveying area, wherein the parameters include: a plurality of surveying sampling points at which the surveying drone surveys in the area.
The surveying area is an area with a definite latitude and longitude range; it can be of any shape and any size, and the embodiments of the present disclosure do not limit its shape or size.
In the embodiment of the present disclosure, the surveying parameters matching the surveying area, i.e., the plurality of surveying sampling points at which the drone surveys in the area, can be determined by the control terminal. Determining the sampling points at the control terminal can effectively improve the surveying efficiency of the whole system.
Step 220: send the surveying parameters to the surveying drone.
Correspondingly, after determining the plurality of surveying sampling points at which the drone surveys in the area, the control terminal can send them to the surveying drone, so that the drone acquires the corresponding set of surveying photos according to the sampling points. The surveying photos acquired by the drone according to the plurality of sampling points have a certain degree of overlap, but not every two consecutive photos need to overlap to a certain degree, which greatly reduces the time spent on image data processing and thus improves surveying efficiency.
In the technical solution of this embodiment, the control terminal determines the plurality of surveying sampling points at which the drone surveys in the surveying area and sends the surveying parameters to the drone, providing a new method of determining surveying sampling points: an overall planning method based on multiple sampling points replaces the existing parallel-line traversal planning method, solving the problems of high cost and low surveying efficiency in existing UAV aerial survey methods and achieving the technical effect of reducing surveying cost and improving surveying efficiency.
实施例三
图3a是本公开实施例三提供的一种控制终端侧测绘方法的流程图,在本实施例中,给出了确定与测绘区域匹配的测绘参数的其中一种实现方式。相应的,如图3a所示,本实施例的方法可以包括:
步骤310、确定与测绘区域匹配的测绘参数,其中,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点。
相应的,步骤310可以包括下述操作:
步骤311、获取与所述测绘区域对应的参考拍照位置点,并将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系。
其中,参考拍照位置点是测绘区域中的一个位置点,具有匹配的地理位置坐标。上述位置点可以是用户在测绘区域中选取的(例如,点选,或者直接输入经纬度等),也可以是根据测绘区域的区域形状自动确定出的(例如,测绘区域的中心点或者测绘区域的角点等)。组合拍摄点集可以是根据预设分布规则预先设置的拍摄点的集合,在该集合中可以包括多个拍摄点,且每两个拍摄点之间可以具有相对方向和相对距离关系。例如,组合拍摄点集中包括5个拍摄点,分别位于矩形的中心和四个顶点处。其中,每个顶点与中心点之间的相对距离为100m。又如,每个顶点分别位于东、南、西、北四个方向。
在本公开实施例中,可以根据组合拍摄点集辅助获取测绘区域对应的所有测绘采样点。可选的,可以首先确定测绘区域中的其中一个点作为参考拍照位置点,然后将该参考拍照位置点与组合拍摄点集内的其中一个拍摄点建立彼此之间的映射关系。
换句话说,组合拍摄点集内的各个拍摄点之间的相对位置关系确定,但是却并未与实际的地理位置信息建立对应关系,因此无法直接映射至实际的测绘区域中去,只要将组合拍摄点集中的一个拍摄点赋予实际的地理位置信息,则该组合拍摄点集中的全部拍摄点的地理位置信息就全部可以确定得到。
典型的,按照所述组合拍摄点集中的多个拍摄点所拍摄的多张照片之间具有重叠区域,相应的,按照所述组合拍摄点集中的多个拍摄点拍摄多张照片后,可以将所述多张照片组合,和/或拼接形成的一个完整的组合区域。该组合区域可以完整覆盖测绘区域,也可以仅覆盖测绘区域的一部分,本实施例对此并不进行限制。
图3b是本公开实施例三提供的一种组合拍摄点集内每个拍摄点的位置分布示意图。在本公开的一个可选实施例中,如图3b所示,所述组合拍摄点集内的拍摄点包括:中心拍摄点以及四个周围拍摄点,所述周围拍摄点为以所述中心拍摄点为中心的矩形的四个顶点;其中, 根据所述组合拍摄点集内的每个拍摄点所拍摄得到的合成照片的形状为矩形。
在本公开实施例中,可选的,如图3b所示,组合拍摄点集内可以包括5个拍摄点,分别为中心拍摄点以及四个周围拍摄点。其中,中心拍摄点可以是一个矩形的中心,相对应的,周围拍摄点可以是中心拍摄点对应矩形的四个顶点。每个拍摄点之间具有一定的位置关系,该位置关系的设定满足一定的条件,即根据每个拍摄点所确定的每个拍照位置点拍摄得到的每个照片进行组合时,可以得到一个完整的矩形照片。其中,组合过程即为将每个照片按照彼此之间的重叠图像进行覆盖。在其他实施例中,在完成默认映射后,每个辅助拍照点可以根据用户的操作而以参考拍照位置点为中心转动,或者根据用户的滑动等操作移动。
相关技术在形成与测绘区域对应的测绘点时,由于采取平行线遍历的方式在测绘区域内进行移动测绘,因此必须保证一个测绘点下拍摄的照片,与其水平相邻位置以及竖直相邻位置的其他测绘点所拍摄的照片均具有预设的重叠度。这就会使得一张测绘照片中包含的、区别于其他测绘照片的信息量较少,因此,需要拍摄大量的照片才能完成对一个测绘区域的测绘,后期照片的合成以及拼接所需的工作量以及时间都较大。在本实施例中,所选取的组合拍摄点集内的5个拍摄点是一个中心拍摄点以及四个周围拍摄点,每个周围拍摄点只要能保证与中心拍摄点满足上述重叠度(例如,60%或者70%等)要求即可,两两周围拍摄点之间不必满足如此高的重叠度要求,这就大大降低了测绘一块固定大小的测绘区域所需拍摄的测绘照片的总数量,进而也就可以大大降低后续的照片合成或者拼接的时间和硬件成本。特别的,如果将本公开实施例的方案应用在小地块中,例如一个组合拍摄点集中的每个拍摄点所拍摄的多张照片进行组合或者拼接后,可以完整覆盖一个地块时,本公开实施例的方案在测绘点的数量以及后期的拼接难度上,可以明显优于相关技术的平行线遍历选点测绘的方式。
在本公开的一个可选实施例中,获取与测绘区域对应的参考拍照位置点,可以包括:检测用户在人机交互界面中的触摸操作,并根据所述触摸操作确定一个屏幕位置点;在所述人机交互界面中当前显示的测绘区域的地图数据中获取与所述屏幕位置点匹配的一个地理位置坐标作为所述参考位置点。
在本公开实施例中,可以根据用户在人机交互界面中指定的点确定参考拍照位置点。可选的,可以检测用户在人机交互界面中的触摸操作,如点击或划击等操作,并根据用户的触摸操作确定人机交互界面中的其中一个屏幕位置点。然后依据人机交互界面中当前显示的测绘区域的地图数据,确定与屏幕位置点匹配的一个地理位置坐标作为参考位置点。其中,地图数据可以是经纬度信息等。
在本公开的一个可选实施例中,检测用户在人机交互界面中的触摸操作,并根据所述触摸操作确定一个屏幕位置点,可以包括下述至少一项:
如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的触摸点确定为所述屏幕位置点;
如果检测到所述用户的触摸操作为划线触摸操作,则在所述用户触摸生成的线段上选取一点作为所述屏幕位置点;以及
如果检测到所述用户的触摸操作为画框触摸操作,则在所述用户触摸生成的框体内部选取一点作为所述屏幕位置点。
在本公开实施例中,根据用户在人机交互界面中的触摸操作确定一个屏幕位置点可以具有多种实现方式。可选的,可以将用户的单点触摸操作对应的触摸点确定为屏幕位置点。也可以将用户的划线触摸操作所生成的线段上的一点作为屏幕位置点。例如,将线段的中点作为屏幕位置点。也还可以将用户的画框触摸操作内部的一点作为屏幕位置点,例如,将框体内的中点作为屏幕位置点。
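作为对上述几种触摸操作取点方式的补充说明,下面给出一个示意性的Python片段(函数名与坐标约定均为便于说明而假设,并非本公开方案的必需实现),演示如何由线段取中点、由框体取中心得到屏幕位置点:

```python
# 示意性示例:由划线触摸的线段中点、画框触摸的框体中心确定屏幕位置点
# (函数名为便于说明而假设,坐标采用屏幕像素坐标)
def screen_point_from_line(p1, p2):
    """取用户划线触摸所生成线段的中点作为屏幕位置点。"""
    return ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)

def screen_point_from_box(left, top, right, bottom):
    """取用户画框触摸所生成框体的中心作为屏幕位置点。"""
    return ((left + right) / 2, (top + bottom) / 2)
```

实际实现中还需结合人机交互界面的坐标系与地图数据,将屏幕位置点换算为地理位置坐标。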
在本公开的一个可选实施例中,获取与测绘区域对应的参考拍照位置点,可以包括:获取所述测绘区域的中心点作为所述参考拍照位置点。
另外,在本公开实施例中,参考拍照位置点还可以由控制测绘无人机的控制终端自动生成。例如直接将测绘无人机所在的测绘区域的中心点作为参考拍照位置点。
在本公开的一个可选实施例中,获取与测绘区域对应的参考拍照位置点,还可以包括:获取用户输入的地理位置坐标作为所述参考拍照位置点。
在本公开实施例中,还可以直接将用户输入的地理位置坐标作为参考拍照位置点。可选的,用户可以通过人机交互界面中的软键盘、控制终端中的数字键盘或语音输入等方式输入地理位置坐标。
在本公开的一个可选实施例中,获取与测绘区域对应的参考拍照位置点,可以包括:向所述测绘无人机发送位置查询信息,将所述测绘无人机反馈的地理位置坐标作为所述参考拍照位置点;其中,所述测绘无人机预先设置于与所述测绘区域匹配的位置处。
在本公开实施例中,还可以通过用户指定的位置信息确定参考拍照位置点。可选的,用户可以通过控制终端向测绘无人机发送位置查询信息。例如,用户在控制终端的人机交互界面触发设定标识向测绘无人机发送位置查询信息以查询测绘无人机的当前位置。测绘无人机接收到位置查询信息后,通过自身的定位装置获取当前地理位置坐标并反馈给控制终端。控制终端可以将接收到的地理位置坐标对应的位置点直接作为参考拍照位置点。相应的,测绘无人机向控制终端发送地理位置坐标时,其在地面的投影点应该位于测绘区域内部。
在本公开的一个可选实施例中,在向所述测绘无人机发送位置查询信息之前,还可以包括:接收用户输入的针对所述测绘无人机的至少一项飞行控制指令,并将所述飞行控制指令发送至所述测绘无人机;在确认接收到所述用户输入的位置确认响应时,向所述测绘无人机发送悬停指令,以控制所述测绘无人机在当前位置悬停;其中,所述飞行控制指令设置为:控制所述测绘无人机在空中进行设定方向,和/或设定距离的移动。
相应的,如果通过用户指定的位置信息确定参考拍照位置点,则用户必须对控制终端输入针对测绘无人机的至少一项飞行控制指令。控制终端将用户输入的飞行控制指令发送至测绘无人机,以使测绘无人机根据飞行控制指令行驶。在测绘无人机行驶的过程中,如果用户向控制终端输入了位置确认响应,例如,用户输入了停止飞行指令作为位置确认响应,则控制终端可以向测绘无人机发送悬停指令,以控制测绘无人机在当前位置悬停。
在本公开的一个可选实施例中,将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系,可以包括:将用户在所述组合拍摄点集内选择的一个拍摄点,与所述参考拍照位置点建立映射关系。
相应的,在获取到参考拍照位置点后,用户可以在组合拍摄点集内的每个拍摄点中任意选择其中一个拍摄点,并将用户选择的组合拍摄点集中的拍摄点与参考拍照位置点建立映射关系。
在本公开的一个可选实施例中,将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系,还可以包括:将所述组合拍摄点集内的所述中心拍摄点,与所述参考拍照位置点建立映射关系。
在本公开实施例中,可选的,还可以直接将组合拍摄点集内的中心拍摄点与参考拍照位置点建立映射关系。
在本公开的一个可选实施例中,将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系,还可以包括:计算所述参考拍照位置点与所述测绘区域每个定位关键点之间的距离,所述定位关键点包括:所述测绘区域的角点以及所述测绘区域的中心点;获取距离所述参考拍照位置点最近的一个定位关键点作为目标参考点;根据所述目标参考点在所述测绘区域内的位置信息,在所述组合拍摄点集内选择与所述位置信息匹配的一个拍摄点与所述参考拍照位置点建立映射关系。
在本公开实施例中,可选的,还可以根据参考拍照位置点与测绘区域内每个关键点之间的距离关系来确定映射关系。可选的,将测绘区域的角点以及测绘区域的中心点作为定位关键点,计算参考拍照位置点与测绘区域每个定位关键点之间的距离,并获取距离参考拍照位置点最近的一个定位关键点作为目标参考点。然后根据目标参考点在测绘区域内的位置信息,在组合拍摄点集内选择与位置信息匹配的一个拍摄点与参考拍照位置点建立映射关系。例如,目标参考点位于测绘区域的左上方,则可以在组合拍摄点集内选择左上角的拍摄点与参考拍照位置点建立映射关系。
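上述"取最近定位关键点"的选取逻辑可以用如下示意性的Python片段说明(其中的测绘区域尺寸、定位关键点坐标与函数名均为假设,仅用于说明):

```python
# 示意性示例:计算参考拍照位置点与各定位关键点的距离,取最近者作为目标参考点
import math

def nearest_key_point(reference, key_points):
    """reference: 参考拍照位置点 (x, y);key_points: {名称: (x, y)};
    返回距离参考拍照位置点最近的定位关键点名称。"""
    return min(key_points, key=lambda name: math.dist(reference, key_points[name]))

# 假设测绘区域为 100m*100m 的正方形,定位关键点为四个角点与中心点
key_points = {
    "左上角": (0.0, 100.0),
    "右上角": (100.0, 100.0),
    "左下角": (0.0, 0.0),
    "右下角": (100.0, 0.0),
    "中心点": (50.0, 50.0),
}
target = nearest_key_point((10.0, 90.0), key_points)  # 参考点靠近左上角
```

得到目标参考点后,即可按其在测绘区域内的位置,在组合拍摄点集内选择对应方位的拍摄点建立映射关系。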
步骤312、根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系以及所述映射关系,确定与所述参考拍照位置点对应的多个辅助拍照位置点。
其中,辅助拍照位置点可以是测绘区域中区别于参考拍照位置点的其他位置点。
进一步的,当确定参考拍照位置点与组合拍摄点集内的其中一个拍摄点之间的映射关系后,可以根据组合拍摄点集内每个拍摄点之间预设的相对位置关系以及确定的映射关系,进一步确定参考拍照位置点对应的其他多个辅助拍照位置点。
示例性的,假设组合拍摄点集内共包括5个拍摄点,其中拍摄点集内的中心拍摄点与参考拍照位置点建立了映射关系,则可以根据组合拍摄点集内其他4个拍摄点与中心拍摄点之间的位置关系,确定与参考拍照位置点对应的其他4个辅助拍照位置点。
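上述由映射关系与预设相对位置关系推算辅助拍照位置点的过程,可以用如下示意性的Python片段说明(偏移量数值沿用前文矩形布局的假设,函数名为便于说明而设):

```python
# 示意性示例:中心拍摄点与参考拍照位置点建立映射后,
# 按组合拍摄点集内预设的相对位置关系推算其余辅助拍照位置点
def auxiliary_points(reference, relative_offsets):
    """reference: 参考拍照位置点坐标 (x, y);
    relative_offsets: 各周围拍摄点相对中心拍摄点的偏移列表 [(dx, dy), ...]。"""
    rx, ry = reference
    return [(rx + dx, ry + dy) for dx, dy in relative_offsets]

# 假设四个周围拍摄点位于以中心拍摄点为中心、边长 20m 的矩形四个顶点
offsets = [(-10, 10), (10, 10), (-10, -10), (10, -10)]
points = auxiliary_points((500.0, 800.0), offsets)
```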
步骤313、将所述参考拍照位置点以及所述多个辅助拍照位置点作为所述测绘无人机在所述测绘区域中测绘的多个测绘采样点。
相应的,得到参考拍照位置点以及每个辅助拍照位置点后,即可将参考拍照位置点以及辅助拍照位置点作为测绘无人机在测绘区域中测绘的测绘采样点。测绘无人机可以根据每个测绘采样点进行航拍,并将航拍得到的照片发送至对应的控制终端或者地面终端,以使控制终端能够根据获取的照片进行合成得到最终的测绘图像。或者,由于本公开实施例的方案可以大大降低测绘照片的拍摄数量,测绘无人机可以在本机中实现对多照片的合成。
本公开实施例所提供测绘采样点的规划方法所得到的每个测绘采样点对应获取的照片不要求每连续两张照片之间都具有一定的重叠度,因此能够大幅降低图像数据的处理耗时。
步骤320、将所述测绘参数发送至所述测绘无人机。
采用上述技术方案,通过获取与测绘区域对应的参考拍照位置点,将组合拍摄点集内的一个拍摄点与该参考拍照位置点建立映射关系,同时根据组合拍摄点集内每个拍摄点之间预设的相对位置关系以及映射关系,确定与参考拍照位置点对应的多个辅助拍照位置点,进而将参考拍照位置点以及多个辅助拍照位置点作为测绘无人机在测绘区域中测绘的测绘采样点,提出了一种新的测绘采样点的规划方法,使用基于组合拍摄点集的多测绘点整体规划方式代替现有的平行线移动规划方式,解决现有无人机航测方法中存在的成本高且测绘效率低的问题,实现了降低测绘成本,并提高测绘效率的技术效果。
实施例四
图4a是本公开实施例四提供的一种控制终端侧测绘方法的流程图,在本实施例中,给出了确定与测绘区域匹配的测绘参数的另外一种实现方式。相应的,如图4a所示,本实施例的方法可以包括:
步骤410、确定与测绘区域匹配的测绘参数,其中,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点。
相应的,步骤410可以包括下述操作:
步骤411、根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在所述测绘区域内确定一个或多个测绘组合拍摄区域。
其中,组合拍摄区域可以是按照组合拍摄点集内每个拍摄点进行拍照后,所得照片合成后的区域。也即,组合拍摄区域可以是组合拍摄点集所能捕捉到的整体拍照区域。测绘区域信息可以是测绘区域的相关信息,如测绘区域的区域形状或大小等。测绘组合拍摄区域可以是与组合拍摄区域大小相同的拍摄区域,一个测绘组合拍摄区域对应地块内的一个实际的拍摄范围,也即,测绘组合拍摄区域同时包括区域大小以及区域的地理位置信息这两项关键信息。
在本公开实施例中,在确定测绘无人机的测绘采样点之前,首先应该获取与组合拍摄点集对应的组合拍摄区域,然后可以根据组合拍摄区域以及测绘区域的大小等信息,在测绘区域内确定一个或多个测绘组合拍摄区域。如果测绘组合拍摄区域为一个,则测绘组合拍摄区域能够完全覆盖测绘区域;如果测绘组合拍摄区域为多个,则多个测绘组合拍摄区域合成后能够完全覆盖测绘区域。示例性的,假设组合拍摄区域为100m*100m的正方形,测绘区域为100m*200m的矩形,则最少需要两个测绘组合拍摄区域才能完全覆盖测绘区域。
在本公开的一个可选实施例中,按照所述组合拍摄点集内的多个拍摄点所拍摄的多张照片之间具有重叠区域,和/或
在所述测绘区域内确定的多个测绘组合拍摄区域之间具有重叠区域;
其中,所述测绘组合拍摄区域为按照所述组合拍摄点集内的多个拍摄点拍摄多张照片后,将所述多张照片组合,和/或拼接形成的拍摄区域;将每个所述测绘组合拍摄区域进行组合,和/或拼接形成所述测绘区域的测绘地图。
也即,测绘组合拍摄区域与组合拍摄区域是一致的,只不过组合拍摄区域没有与测绘区域建立对应关系,而测绘组合拍摄区域可以是测绘区域中分割形成的彼此之间相互独立的拍摄区域,其拍摄区域的形状和大小与组合拍摄区域相同。测绘组合拍摄区域之间的重叠区域可以根据实际需求设定,例如,重叠区域占测绘组合拍摄区域的30%或50%等,本公开实施例并不对测绘组合拍摄区域之间的重叠区域的数值进行限定。
在本公开实施例中,为了使测绘无人机所获取的照片能够拼接组成完整的测绘区域的图像,可选的,测绘无人机按照组合拍摄点集中的多个拍摄点所拍摄的多张照片之间应该具有重叠区域。相应的,按照组合拍摄点集中的多个拍摄点拍摄多张照片后,可以将多张照片组合和/或拼接,形成一个完整的组合区域。该组合区域可以完整覆盖测绘区域,也可以仅覆盖测绘区域的一部分,本实施例对此并不进行限制。应该说明的是,本公开实施例中的多张照片之间具有重叠区域并不是指每连续两张照片之间均具有重叠区域。同理,为了保证测绘无人机获取的每个照片能够按照重叠部分合成,形成一个完整的图像,在测绘区域内确定的多个测绘组合拍摄区域之间同样应该具有重叠区域。可选的,可以是每两个相邻的测绘组合拍摄区域之间具有重叠区域,以使每个测绘组合拍摄区域进行组合和/或拼接后能够形成测绘区域的测绘信息。
在本公开的一个可选实施例中,根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在测绘区域内确定一个或多个测绘组合拍摄区域,可以包括:在所述测绘区域内选择一个定位点;根据所述定位点以及所述组合拍摄区域,在所述测绘区域内确定一个测绘组合拍摄区域;如果所述测绘组合拍摄区域不能对所述测绘区域完整覆盖,则在所述测绘区域内选择新的定位点,并返回执行根据所述定位点以及所述组合拍摄区域,在所述测绘区域内确定一个测绘组合拍摄区域的操作,直至确定出能够完整覆盖所述测绘区域的全部测绘组合拍摄区域。
其中,定位点可以是测绘区域内的一个位置点,设置为:在测绘区域内对测绘组合拍摄区域进行定位。
在本公开实施例中,定位点可以是根据实际需求在测绘区域内选择的一个位置点,如测绘区域的角点或中心点等。可以通过一个定位点在测绘区域内首先确定一个测绘组合拍摄区域。例如,如果测绘区域为矩形,则可以选择测绘区域的左上角顶点作为定位点,将组合拍摄区域的左上角顶点与定位点重合,则组合拍摄区域在测绘区域内就形成一个对应的测绘组合拍摄区域。应该说明的是,利用定位点以及组合拍摄区域,在测绘区域内确定一个测绘组合拍摄区域时,要保障测绘组合拍摄区域能够最大程度地覆盖测绘区域。相应的,在利用定位点以及组合拍摄区域在测绘区域内确定一个测绘组合拍摄区域后,可以判断确定的测绘组合拍摄区域是否能够完整覆盖测绘区域。如果能够完全覆盖,则无需再确定其他的测绘组合拍摄区域。如果一个测绘组合拍摄区域不能完整覆盖测绘区域,则应该在测绘区域内选择新的定位点,并返回执行根据定位点以及组合拍摄区域,在测绘区域内确定一个测绘组合拍摄区域的操作,直至确定出能够完整覆盖测绘区域的全部测绘组合拍摄区域。应该说明的是,在重新选择新的定位点时,应该注意新的定位点所确定的测绘组合拍摄区域与相邻的测绘组合拍摄区域之间具有重叠区域。
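本段所述"选取定位点—确定测绘组合拍摄区域—判断覆盖—必要时另选新定位点"的循环过程,可以用如下示意性的Python片段说明(此处以矩形测绘区域、按固定重叠比例推进定位点为假设,并非本公开方案的唯一实现):

```python
# 示意性示例:反复选取新的定位点确定测绘组合拍摄区域,直至完整覆盖矩形测绘区域
# (定位点取各区域左上角;相邻区域之间按 overlap 比例保留重叠;数值为假设)
def plan_regions(area_w, area_h, region_w, region_h, overlap=0.3):
    """返回各测绘组合拍摄区域左上角定位点列表,保证完整覆盖 area_w*area_h 的测绘区域。"""
    step_x = region_w * (1 - overlap)
    step_y = region_h * (1 - overlap)
    points, y = [], 0.0
    while True:
        x = 0.0
        while True:
            # 靠近边界时将区域"收回"到测绘区域内部,避免越界
            points.append((min(x, max(area_w - region_w, 0.0)),
                           min(y, max(area_h - region_h, 0.0))))
            if x + region_w >= area_w:
                break
            x += step_x
        if y + region_h >= area_h:
            break
        y += step_y
    return points
```

例如对 100m*200m 的测绘区域与 100m*100m 的组合拍摄区域,按 30% 重叠规划,可得到一列纵向排布、彼此重叠的测绘组合拍摄区域。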
在本公开的一个可选实施例中,在根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在测绘区域内确定一个或多个测绘组合拍摄区域之前,还可以包括:检测用户在人机交互界面中的触摸操作,并获取与所述触摸操作匹配的屏幕选择区域;在所述人机交互界面中当前显示的地图数据中,获取与所述屏幕选择区域匹配的地理位置区域作为所述测绘区域信息。
其中,屏幕选择区域可以是用户在测绘无人机的控制终端的人机交互界面中的触摸操作所形成的区域,其可以是任意形状和大小(不超过屏幕的大小)的区域,本公开实施例并不对屏幕选择区域的形状和大小进行限定。
在本公开实施例中,测绘区域可以由操控测绘无人机的用户实时指定生成。例如,通过检测用户在人机交互界面中的触摸操作以获取与触摸操作匹配的屏幕选择区域,并根据人机交互界面中当前显示的地图数据为屏幕选择区域确定匹配的地理位置区域,以将确定的地理位置区域作为测绘区域信息。
在本公开的一个可选实施例中,检测用户在人机交互界面中的触摸操作,并获取与所述触摸操作匹配的屏幕选择区域,可以包括:
如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的至少三个触摸点的连线所围成的封闭区域确定为所述屏幕选择区域;和/或
如果检测到所述用户的触摸操作为画框触摸操作,则将所述用户触摸生成的框体作为所述屏幕选择区域。
可选的,可以将检测到的用户的单点触摸操作所形成的封闭区域作为与触摸操作匹配的屏幕选择区域。例如,将用户的至少三个触摸点的连线所围成的封闭区域确定为屏幕选择区域。还可以将检测到的用户的画框触摸操作所生成的框体作为屏幕选择区域。
步骤412、根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,在所述测绘组合拍摄区域中确定多个拍照位置点。
其中,拍照位置点可以是测绘区域中的一个位置点,具有匹配的地理位置坐标。
在本公开实施例中,拍照位置点可以根据组合拍摄点集中每个拍摄点之间预设的相对位置关系确定。
在本公开的一个可选实施例中,根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,在所述测绘组合拍摄区域中确定多个拍照位置点,可以包括:将所述组合拍摄点集内的中心拍摄点映射至所述测绘组合拍摄区域的区域中点,并将所述区域中点作为一个拍照位置点;根据所述组合拍摄点集内的每个周围拍摄点与所述中心拍摄点之间预设的相对位置关系,将每个所述周围拍摄点分别映射至所述测绘组合拍摄区域中,并将形成的多个映射点作为所述拍照位置点。
在本公开实施例中,由于一个测绘组合拍摄区域与一个组合拍摄区域是相对应的,因此在确定拍照位置点时,可以将组合拍摄区域对应的组合拍摄点集中的每个拍摄点映射到测绘组合拍摄区域中作为拍照位置点。可选的,在映射时,可以首先将组合拍摄点集中的中心拍摄点映射至测绘组合拍摄区域的区域中点,从而将测绘组合拍摄区域的区域中点作为一个拍照位置点。
进一步的,当确定将测绘组合拍摄区域的区域中点作为一个拍照位置点后,可以根据组合拍摄点集中的每个周围拍摄点与中心拍摄点之间的相对位置关系,将每个周围拍摄点分别映射至测绘组合拍摄区域中,并将形成的多个映射点作为拍照位置点。
图4b是本公开实施例四提供的一种每个拍照位置点的分布示意图。在一个例子中,如图4b所示,两个中心点40和50分别为测绘组合拍摄区域的区域中点,相应的,区域中点40与四个周围拍照位置点410构成一个测绘组合拍摄区域,区域中点50与四个周围拍照位置点510构成一个测绘组合拍摄区域。两个测绘组合拍摄区域中的区域中点与周围拍照位置点之间的相对位置关系,与组合拍摄点集中的每个周围拍摄点与中心拍摄点之间预设的相对位置关系是相同的。
步骤413、将所述多个拍照位置点作为测绘无人机在所述测绘区域中测绘的多个测绘采样点。
相应的,得到每个拍照位置点后,即可将拍照位置点作为测绘无人机在测绘区域中测绘的测绘采样点。测绘无人机可以根据每个测绘采样点进行航拍,并将航拍得到的照片发送至对应的控制终端或者地面终端,以使控制终端能够根据获取的照片进行合成得到最终的测绘图像。或者,由于本公开实施例的方案可以大大降低测绘照片的拍摄数量,测绘无人机可以在本机中实现对多照片的合成。
步骤420、将所述测绘参数发送至所述测绘无人机。
在本公开的一个可选实施例中,在将所述测绘参数发送至所述测绘无人机之前,还可以包括:获取所述测绘无人机所携带的拍照设备的拍摄参数,所述拍摄参数包括所述测绘无人机在设定飞行高度下的单照片拍摄区域,每个拍摄点都对应一个单照片拍摄区域;根据预设的照片重叠度指标以及所述单照片拍摄区域,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系;所述测绘参数还包括:所述飞行高度,所述飞行高度设置为:指示所述测绘无人机以所述飞行高度在所述测绘区域中进行飞行拍摄。在获取测绘无人机所携带的拍照设备的拍摄参数之前,还可以包括:根据所述拍照设备的像元宽度、所述拍照设备的镜头焦距以及地面像元分辨率,计算所述设定飞行高度。
其中,单照片拍摄区域即为单张照片能够捕捉到的实际的测绘区域。预设的照片重叠度指标可以是根据实际需求设定的重叠度指标,如50%、60%或70%等,虽然本公开实施例并不对预设的照片重叠度指标的数值进行限定,但应该说明的是,预设的照片重叠度指标要满足将每个照片按照重叠部分进行合成时,能够形成一个完整的矩形。
在本公开实施例中,由于必须对测绘无人机获取的照片进行合成以得到最终的测绘图像,因此必须确定测绘无人机在设定飞行高度下的单照片拍摄区域,以根据单照片拍摄区域的大小确定组合拍摄点集内每个拍摄点之间预设的相对位置关系。每个拍摄点都对应一个单照片拍摄区域,例如,拍摄点为单照片拍摄区域的中点或其中的一个顶点。可以根据预设的照片重叠度指标以及单照片拍摄区域,确定组合拍摄点集内每个拍摄点之间预设的相对位置关系。
本公开实施例中的测绘参数还可以包括飞行高度,设置为:指示测绘无人机以飞行高度在测绘区域中进行飞行拍摄。可以理解的是,当测绘无人机的拍照设备,如摄像头的拍摄参数固定时,测绘无人机的飞行高度直接影响了地面像元分辨率。而地面像元分辨率又直接决定了单张照片所能涵盖的测绘区域的面积。因此,在利用测绘无人机对测绘区域进行航拍之前,首先要确定测绘无人机的设定飞行高度。可以根据拍照设备的像元宽度、拍照设备的镜头焦距以及地面像元分辨率,计算测绘无人机的设定飞行高度。可选的,由地面像元分辨率=飞行高度*像元宽度/镜头焦距,可以得到飞行高度=地面像元分辨率*镜头焦距/像元宽度。其中,像元宽度=拍照设备传感器尺寸宽度/画幅宽度。
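本段的飞行高度计算公式(飞行高度=地面像元分辨率*镜头焦距/像元宽度)可以用如下示意性的Python片段验证(其中的地面像元分辨率、镜头焦距与像元宽度数值均为假设,仅用于说明量纲换算):

```python
# 示意性示例:按 飞行高度 = 地面像元分辨率 * 镜头焦距 / 像元宽度 计算设定飞行高度
def flight_height(gsd_m, focal_length_mm, pixel_width_um):
    """gsd_m: 地面像元分辨率(米/像元);focal_length_mm: 镜头焦距(毫米);
    pixel_width_um: 像元宽度(微米);返回飞行高度(米)。"""
    return gsd_m * (focal_length_mm * 1e-3) / (pixel_width_um * 1e-6)

# 假设地面像元分辨率 0.05m、焦距 20mm、像元宽度 2.4µm
height = flight_height(0.05, 20, 2.4)
```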
在本公开的一个可选实施例中,获取测绘无人机所携带的拍照设备的拍摄参数,可以包括:根据所述拍照设备的像元宽度、所述拍摄设备的画幅区域以及地面像元分辨率,计算所述测绘无人机在所述设定飞行高度下的单照片拍摄区域。
在本公开实施例中,进一步的,可以根据拍照设备的像元宽度、拍摄设备的画幅大小以及地面像元分辨率,计算测绘无人机在设定飞行高度下的单照片拍摄区域。可选的,单照片拍摄区域=地面像元分辨率*画幅大小,而地面像元分辨率=飞行高度*像元宽度/镜头焦距。
也即:单照片拍摄长度=地面像元分辨率*画幅长度;单照片拍摄宽度=地面像元分辨率*画幅宽度。例如,画幅大小为3456*4608,地面像元分辨率为0.05m,则单照片拍摄区域为172.8m*230.4m。
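沿用本段示例的数值(画幅3456*4608、地面像元分辨率0.05m),单照片拍摄区域的计算可以用如下示意性的Python片段说明:

```python
# 示意性示例:按 单照片拍摄区域 = 地面像元分辨率 * 画幅大小 计算单张照片覆盖的地面范围
def single_photo_area(gsd_m, frame_w_px, frame_h_px):
    """gsd_m: 地面像元分辨率(米/像元);frame_w_px/frame_h_px: 画幅像素宽/长;
    返回设定飞行高度下单张照片覆盖的地面范围(宽, 长),单位:米。"""
    return gsd_m * frame_w_px, gsd_m * frame_h_px

area = single_photo_area(0.05, 3456, 4608)  # 对应文中 172.8m*230.4m 的示例
```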
在本公开的一个可选实施例中,根据预设的照片重叠度指标以及所述单照片拍摄区域,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,可以包括:根据所述拍照设备的画幅大小以及所述拍照设备的像元宽度,确定单照片尺寸;构建二维坐标系,并在所述二维坐标系中选择目标点作为中心拍摄点;根据所述中心拍摄点以及所述单照片尺寸,在所述二维坐标系中生成中心照片;在所述中心照片的左上角、左下角、右上角以及右下角,分别生成与所述中心照片满足所述照片重叠度指标的四张周围照片;根据所述单照片尺寸与所述单照片拍摄区域之间的映射关系,确定与每个所述周围照片对应的周围拍摄点在所述二维坐标系中的坐标值;根据所述中心拍摄点以及每个所述周围拍摄点在所述二维坐标系中的坐标值,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系。
其中,目标点可以是二维坐标系中的任意一点,例如,目标点可以为二维坐标系的原点。
可选的,确定组合拍摄点集内每个拍摄点之间预设的相对位置关系时,可以首先根据拍照设备的画幅大小以及拍照设备的像元宽度,确定单照片尺寸。其中,单照片尺寸=画幅大小*像元宽度(也即:单照片长度=画幅长度*像元宽度;单照片宽度=画幅宽度*像元宽度)。然后在二维坐标系中选取一个目标点作为组合拍摄点集中的中心拍摄点。进一步的,根据中心拍摄点以及单照片尺寸在二维坐标系中生成中心照片。例如,将中心拍摄点作为中心照片的中点并根据单照片尺寸生成对应的中心照片。然后可以在中心照片的左上角、左下角、右上角以及右下角四个方位,根据单照片尺寸以及照片重叠度指标分别生成与中心照片匹配的四张周围照片。应该说明的是,中心照片和其匹配的四张周围照片均不是真正意义上拍摄获取的照片,而是与单张照片大小和形状相同的一个矩形区域。相应的,获取到中心照片以及匹配的四张周围照片后,可以根据单照片尺寸与单照片拍摄区域之间的映射关系,确定与每个周围照片对应的周围拍摄点在二维坐标系中的坐标值。例如,单照片尺寸为10cm*10cm,照片重叠度指标为50%,左上角、左下角、右上角以及右下角的周围照片分别对应左上角、左下角、右上角以及右下角的单照片拍摄区域,且单照片尺寸与单照片拍摄区域的映射关系为1∶200,则单照片拍摄区域相应为20m*20m。如果将周围照片的中点作为每个周围拍摄点,中心拍摄点采用坐标原点,则每个周围拍摄点的坐标值分别可以是(-10,10)、(-10,-10)、(10,10)以及(10,-10),单位为m。相应的,得到每个周围拍摄点对应的坐标值后,即可根据中心拍摄点以及每个周围拍摄点在二维坐标系中的坐标值,确定组合拍摄点集内每个拍摄点之间预设的相对位置关系。例如,上述例子中,组合拍摄点集中位于相邻顶点处的周围拍摄点之间的相对距离为20m,中心点处的中心拍摄点与周围拍摄点之间的相对距离为10√2m(约14.14m)。
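沿用本段示例中的数值(单照片尺寸10cm*10cm、照片重叠度指标50%、映射关系1∶200),各周围拍摄点相对中心拍摄点的坐标可以用如下示意性的Python片段推算(函数名与参数组织方式为便于说明而假设):

```python
# 示意性示例:由单照片尺寸、照片重叠度指标与比例尺推算四个周围拍摄点
# 相对中心拍摄点(坐标原点)的坐标
import math

def surrounding_offsets(photo_size_cm, overlap, scale):
    """photo_size_cm: 单照片边长(厘米,按正方形照片假设);overlap: 照片重叠度指标(0~1);
    scale: 单照片尺寸到单照片拍摄区域的放大倍数;返回四个周围拍摄点坐标(米)。"""
    shift_m = photo_size_cm * (1 - overlap) * scale / 100.0  # 相邻照片中心间距
    return [(-shift_m, shift_m), (shift_m, shift_m),
            (-shift_m, -shift_m), (shift_m, -shift_m)]

offsets = surrounding_offsets(10, 0.5, 200)          # 10cm 照片、50% 重叠、1:200
center_to_corner = math.hypot(*offsets[0])           # 中心拍摄点到周围拍摄点的距离
```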
采用上述技术方案,通过获取与组合拍摄点集对应的组合拍摄区域,以根据组合拍摄区域以及测绘区域信息,在测绘区域内确定一个或多个测绘组合拍摄区域,进而根据组合拍摄点集中每个拍摄点之间预设的相对位置关系,在测绘组合拍摄区域中确定多个拍照位置点,并将多个拍照位置点作为测绘无人机在测绘区域中测绘的测绘采样点,提出了一种新的测绘采样点的规划方法,使用基于组合拍摄点集的多测绘点整体规划方式代替现有的平行线移动规划方式,解决现有无人机航测方法中存在的成本高且测绘效率低的问题,实现了降低测绘成本,并提高测绘效率的技术效果。
实施例五
图5是本公开实施例五提供的一种测绘无人机侧测绘方法的流程图,本实施例可适用于获取与多个测绘采样点对应的测绘照片集合的情况,该方法可以由测绘无人机侧测绘装置来执行,该装置可以由软件和/或硬件的方式来实现,并一般可集成在无人机设备中,与负责控制无人机的控制终端配合使用。相应的,如图5所示,该方法包括如下操作:
步骤510、接收控制终端发送的测绘参数,其中,所述测绘参数为所述控制终端根据所述测绘区域确定的,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点。
在本公开实施例中,控制终端确定了测绘区域所匹配的测绘参数,即测绘无人机在测绘区域中测绘的多个测绘采样点后,可以将确定的多个测绘采样点发送至测绘无人机。
在本公开的一个可选实施例中,在接收控制终端发送的测绘参数之前,还可以包括:接收所述控制终端发送的至少一项飞行控制指令,并根据所述飞行控制指令在空中进行设定方向,和/或设定距离的移动;根据所述控制终端发送的悬停指令,在当前所在位置进行悬停;根据所述控制终端发送的位置查询信息,将当前所在位置的地理位置坐标反馈至所述控制终端,其中,所述地理位置坐标设置为:所述控制终端确定参考拍照位置点。
在本公开实施例中,如果控制终端通过用户指定的位置信息确定参考拍照位置点,则用户必须对控制终端输入针对测绘无人机的至少一项飞行控制指令。控制终端将用户输入的飞行控制指令发送至测绘无人机,测绘无人机根据接收到的飞行控制指令行驶,即在空中进行设定方向,和/或设定距离的移动。在测绘无人机行驶的过程中,如果用户向控制终端输入了位置确认响应,例如,用户输入了停止飞行指令作为位置确认响应,则控制终端可以向测绘无人机发送悬停指令,以控制测绘无人机在当前位置悬停。同时,控制终端向测绘无人机发送位置查询信息,测绘无人机可以将当前所在位置的地理位置坐标反馈至控制终端。控制终端可以将测绘无人机反馈的地理位置坐标作为参考拍照位置点。
步骤520、根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合。
在本公开实施例中,测绘无人机可以根据控制终端发送的包括多个测绘采样点的测绘参数在测绘区域中进行飞行拍摄,从而得到与多个测绘采样点对应的测绘照片集合。
在本公开的一个可选实施例中,所述测绘参数还可以包括:飞行高度;根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合,可以包括:根据所述测绘参数以所述飞行高度在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合。
本公开实施例中的测绘参数还可以包括飞行高度,设置为:指示测绘无人机以飞行高度在测绘区域中进行飞行拍摄,从而得到与多个测绘采样点对应的测绘照片集合。
在本公开的一个可选实施例中,根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合,可以包括:在根据每个所述测绘采样点的地理位置信息,确定飞行至每个所述测绘采样点时,拍摄得到与每个所述测绘采样点分别对应的测绘照片构成所述测绘照片集合。
相应的,测绘无人机获取到测绘采样点后,可以根据每个测绘采样点的地理位置信息,飞行至每个测绘采样点,每到达一个测绘采样点即可调用拍照设备进行拍照,从而得到与每个测绘采样点分别对应的测绘照片构成测绘照片集合。
步骤530、将所述测绘照片集合中的多张照片进行照片组合和/或拼接,得到与所述测绘区域对应的测绘地图。
相应的,测绘无人机获取到测绘照片集合后,即可对测绘照片集合中的多张照片进行照片组合和/或拼接,从而得到与测绘区域对应的完整的测绘地图。
在本公开的一个可选实施例中,将所述测绘照片集合中的多张照片进行照片组合和/或拼接,得到与所述测绘区域对应的测绘地图,可以包括:在所述测绘照片集合中获取在至少一个中心拍摄点拍摄的中心测绘照片,以及在每个中心拍摄点关联的多个周围拍摄点拍摄的周围测绘照片;根据每个周围测绘照片与对应中心测绘照片之间的照片重叠度,将每个中心测绘照片与对应的周围测绘照片拼接为组合拍摄照片;根据与每个中心拍摄点对应的组合拍摄照片,得到与所述测绘区域对应的测绘地图。
可选的,测绘无人机可以在测绘照片集合中获取在至少一个中心拍摄点拍摄的中心测绘照片,并获取在每个中心拍摄点关联的多个周围拍摄点拍摄的周围测绘照片,然后将每个中心测绘照片与对应的周围测绘照片按照每个周围测绘照片与对应中心测绘照片之间的照片重叠度进行拼接,形成组合拍摄照片。由此可见,本公开实施例在对测绘无人机获取的照片进行拼接时,并不是按照每连续两张照片之间的照片重叠度进行拼接,因此能够大幅降低图像数据的处理耗时,从而提高测绘效率。相应的,如果根据中心测绘照片及对应的周围测绘照片拼接形成了一个组合拍摄照片,则该组合拍摄照片即为测绘区域对应的测绘地图;如果根据中心测绘照片及对应的周围测绘照片拼接形成了多个组合拍摄照片,则对多个组合拍摄照片再按照一定的重叠度进行拼接后,得到的最终的组合拍摄照片即为测绘区域对应的测绘地图。
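本段所述"中心测绘照片+四张周围测绘照片"按重叠度拼接的位置关系,可以用如下示意性的Python片段说明(此处按各维度50%的重叠度计算各照片在合成画布上的粘贴位置,像素画幅数值为假设):

```python
# 示意性示例:按中心照片与四张周围照片之间的重叠度,
# 计算各照片左上角在合成画布上的粘贴位置(像素)
def paste_positions(w, h, overlap):
    """w, h: 单张照片像素宽/高;overlap: 周围照片与中心照片在每个维度上的重叠度(0~1)。
    返回 {照片名: (左上角x, 左上角y)},画布原点取左上角周围照片的左上角。"""
    dx = int(w * (1 - overlap))
    dy = int(h * (1 - overlap))
    return {
        "左上": (0, 0),
        "右上": (2 * dx, 0),
        "中心": (dx, dy),
        "左下": (0, 2 * dy),
        "右下": (2 * dx, 2 * dy),
    }

pos = paste_positions(4608, 3456, 0.5)  # 假设画幅 4608*3456、重叠度 50%
```

按上述位置将五张照片依次粘贴,即可得到本段所述的一张完整的组合拍摄照片。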
在本公开的一个可选实施例中,所述测绘区域的测绘地图可以包括下述至少一项:所述测绘区域的数字表面模型、所述测绘区域的三维地图以及所述测绘区域的平面地图。
在本公开实施例中,可选的,根据测绘照片集合中的多张照片进行照片组合和/或拼接得到的与测绘区域对应的测绘地图可以是测绘区域对应的数字表面模型、测绘区域的三维地图或测绘区域的平面地图。
在本公开的一个可选实施例中,在将所述测绘照片集合中的多张照片进行照片组合和/或拼接,得到与所述测绘区域对应的测绘地图之后,还可以包括:将所述测绘区域对应的测绘地图发送至所述控制终端,和/或地面终端。
在本公开实施例中,测绘无人机得到与测绘区域对应的测绘地图后,可以将测绘区域对应的测绘地图发送至控制终端,和/或地面终端。控制终端可以将测绘地图对应的测绘区域作为作业区域,并确定至少一个作业地块,以生成与作业地块对应的作业航线发送至作业无人机。地面终端可以根据实际需求对测绘地图进行其他方面的应用,例如,根据测绘地图的地理信息数据以及测绘区域对应的气候条件对测绘区域进行地域分析。
本公开实施例通过接收控制终端发送的在测绘区域中测绘的多个测绘采样点,以根据测绘采样点在测绘区域中进行飞行拍摄,得到与多个测绘采样点对应的测绘照片集合,将测绘照片集合中的多张照片进行照片组合和/或拼接,从而得到与测绘区域对应的测绘地图,提出了一种新的测绘地图获取方法,使用多测绘采样点整体规划方式代替现有的平行线移动规划方式,解决现有无人机航测方法中存在的成本高且测绘效率低的问题,实现了降低测绘成本,并提高测绘效率的技术效果。
实施例六
图6是本公开实施例六提供的一种控制终端侧测绘装置的示意图,如图6所示,所述装置包括:测绘参数确定模块610以及测绘参数发送模块620,其中:
测绘参数确定模块610,设置为:确定与测绘区域匹配的测绘参数,其中,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;
测绘参数发送模块620,设置为:将所述测绘参数发送至所述测绘无人机。
本实施例的技术方案,通过利用控制终端确定测绘无人机在测绘区域中测绘的多个测绘采样点,并将测绘参数发送至测绘无人机,提出了一种新的测绘采样点确定方法,使用多测绘采样点整体规划方式代替现有的平行线移动规划方式,解决现有无人机航测方法中存在的成本高且测绘效率低的问题,实现了降低测绘成本,并提高测绘效率的技术效果。
可选的,测绘参数确定模块610,包括:拍照位置点获取单元,设置为:获取与所述测绘区域对应的参考拍照位置点,并将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系;辅助拍照位置点确定单元,设置为:根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系以及所述映射关系,确定与所述参考拍照位置点对应的多个辅助拍照位置点;第一测绘采样点确定单元,设置为:将所述参考拍照位置点以及所述多个辅助拍照位置点作为所述测绘无人机在所述测绘区域中测绘的多个测绘采样点。
可选的,测绘参数确定模块610,包括:测绘组合拍摄区域确定单元,设置为:根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在所述测绘区域内确定一个或多个测绘组合拍摄区域;拍照位置点确定单元,设置为:根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,在所述测绘组合拍摄区域中确定多个拍照位置点;第二测绘采样点确定单元,设置为:将所述多个拍照位置点作为测绘无人机在所述测绘区域中测绘的多个测绘采样点。
可选的,按照所述组合拍摄点集内的多个拍摄点所拍摄的多张照片之间具有重叠区域,和/或
在所述测绘区域内确定的多个测绘组合拍摄区域之间具有重叠区域;其中,所述测绘组合拍摄区域为按照所述组合拍摄点集内的多个拍摄点拍摄多张照片后,将所述多张照片组合,和/或拼接形成的拍摄区域;将每个所述测绘组合拍摄区域进行组合,和/或拼接形成所述测绘区域的测绘地图。
可选的,所述组合拍摄点集内的拍摄点包括:中心拍摄点以及四个周围拍摄点,所述周围拍摄点为以所述中心拍摄点为中心的矩形的四个顶点;其中,根据所述组合拍摄点集内的每个拍摄点所拍摄得到的合成照片的形状为矩形。
可选的,拍照位置点获取单元,设置为:检测用户在人机交互界面中的触摸操作,并根据所述触摸操作确定一个屏幕位置点;在所述人机交互界面中当前显示的测绘区域的地图数据中获取与所述屏幕位置点匹配的一个地理位置坐标作为所述参考位置点。
可选的,拍照位置点获取单元,设置为:如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的触摸点确定为所述屏幕位置点;
如果检测到所述用户的触摸操作为划线触摸操作,则在所述用户触摸生成的线段上选取一点作为所述屏幕位置点;以及
如果检测到所述用户的触摸操作为画框触摸操作,则在所述用户触摸生成的框体内部选取一点作为所述屏幕位置点。
可选的,拍照位置点获取单元,设置为:获取所述测绘区域的中心点作为所述参考拍照位置点。
可选的,拍照位置点获取单元,设置为:向所述测绘无人机发送位置查询信息,将所述测绘无人机反馈的地理位置坐标作为所述参考拍照位置点;其中,所述测绘无人机预先设置于与所述测绘区域匹配的位置处。
可选的,所述装置还包括:飞行控制指令发送模块,设置为:接收用户输入的针对所述测绘无人机的至少一项飞行控制指令,并将所述飞行控制指令发送至所述测绘无人机;悬停指令发送模块,设置为:在确认接收到所述用户输入的位置确认响应时,向所述测绘无人机发送悬停指令,以控制所述测绘无人机在当前位置悬停;其中,所述飞行控制指令设置为:控制所述测绘无人机在空中进行设定方向,和/或设定距离的移动。
可选的,拍照位置点获取单元,设置为:获取用户输入的地理位置坐标作为所述参考拍照位置点。
可选的,拍照位置点获取单元,设置为:将用户在所述组合拍摄点集内选择的一个拍摄点,与所述参考拍照位置点建立映射关系。
可选的,拍照位置点获取单元,设置为:将所述组合拍摄点集内的所述中心拍摄点,与所述参考拍照位置点建立映射关系。
可选的,拍照位置点获取单元,设置为:计算所述参考拍照位置点与所述测绘区域每个定位关键点之间的距离,所述定位关键点包括:所述测绘区域的角点以及所述测绘区域的中心点;获取距离所述参考拍照位置点最近的一个定位关键点作为目标参考点;根据所述目标参考点在所述测绘区域内的位置信息,在所述组合拍摄点集内选择与所述位置信息匹配的一个拍摄点与所述参考拍照位置点建立映射关系。
可选的,测绘组合拍摄区域确定单元,设置为:在所述测绘区域内选择一个定位点;根据所述定位点以及所述组合拍摄区域,在所述测绘区域内确定一个测绘组合拍摄区域;如果所述测绘组合拍摄区域不能对所述测绘区域完整覆盖,则在所述测绘区域内选择新的定位点,并返回执行根据所述定位点以及所述组合拍摄区域,在所述测绘区域内确定一个测绘组合拍摄区域的操作,直至确定出能够完整覆盖所述测绘区域的全部测绘组合拍摄区域。
可选的,拍照位置点确定单元,设置为:将所述组合拍摄点集内的中心拍摄点映射至所述测绘组合拍摄区域的区域中点,并将所述区域中点作为一个拍照位置点;根据所述组合拍摄点集内的每个周围拍摄点与所述中心拍摄点之间预设的相对位置关系,将每个所述周围拍摄点分别映射至所述测绘组合拍摄区域中,并将形成的多个映射点作为所述拍照位置点。
可选的,所述装置还包括:屏幕选择区域获取模块,设置为:检测用户在人机交互界面中的触摸操作,并获取与所述触摸操作匹配的屏幕选择区域;测绘区域信息获取模块,设置为:在所述人机交互界面中当前显示的地图数据中,获取与所述屏幕选择区域匹配的地理位置区域作为所述测绘区域信息。
可选的,屏幕选择区域获取模块,设置为:如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的至少三个触摸点的连线所围成的封闭区域确定为所述屏幕选择区域;和/或
如果检测到所述用户的触摸操作为画框触摸操作,则将所述用户触摸生成的框体作为所述屏幕选择区域。
可选的,所述装置还包括:拍摄参数获取模块,设置为:获取所述测绘无人机所携带的拍照设备的拍摄参数,所述拍摄参数包括所述测绘无人机在设定飞行高度下的单照片拍摄区域,每个拍摄点都对应一个单照片拍摄区域;相对位置关系确定模块,设置为:根据预设的照片重叠度指标以及所述单照片拍摄区域,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系;所述测绘参数还包括:所述飞行高度,所述飞行高度设置为:指示所述测绘无人机以所述飞行高度在所述测绘区域中进行飞行拍摄。
可选的,相对位置关系确定模块,设置为:根据所述拍照设备的画幅大小以及所述拍照设备的像元宽度,确定单照片尺寸;构建二维坐标系,并在所述二维坐标系中选择目标点作为中心拍摄点;根据所述中心拍摄点以及所述单照片尺寸,在所述二维坐标系中生成中心照片;在所述中心照片的左上角、左下角、右上角以及右下角,分别生成与所述中心照片满足所述照片重叠度指标的四张周围照片;根据所述单照片尺寸与所述单照片拍摄区域之间的映射关系,确定与每个所述周围照片对应的周围拍摄点在所述二维坐标系中的坐标值;根据所述中心拍摄点以及每个所述周围拍摄点在所述二维坐标系中的坐标值,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系。
可选的,所述装置还包括:飞行高度计算模块,设置为:根据所述拍照设备的像元宽度、所述拍照设备的镜头焦距以及地面像元分辨率,计算所述设定飞行高度。
可选的,拍摄参数获取模块,设置为:根据所述拍照设备的像元宽度、所述拍摄设备的画幅区域以及地面像元分辨率,计算所述测绘无人机在所述设定飞行高度下的单照片拍摄区域。
上述控制终端侧测绘装置可执行本公开任意实施例所提供的控制终端侧测绘方法,具备执行方法相应的功能模块和有益效果。未在本实施例中详尽描述的技术细节,可参见本公开任意实施例提供的控制终端侧测绘方法。
实施例七
图7是本公开实施例七提供的一种测绘无人机侧测绘装置的示意图,如图7所示,所述装置包括:测绘参数接收模块710、测绘照片集合拍摄模块720以及测绘地图生成模块730,其中:
测绘参数接收模块710,设置为:接收控制终端发送的测绘参数,其中,所述测绘参数为所述控制终端根据所述测绘区域确定的,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;
测绘照片集合拍摄模块720,设置为:根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合;
测绘地图生成模块730,设置为:将所述测绘照片集合中的多张照片进行照片组合和/或拼接,得到与所述测绘区域对应的测绘地图。
本公开实施例通过接收控制终端发送的在测绘区域中测绘的多个测绘采样点,以根据测绘采样点在测绘区域中进行飞行拍摄,得到与多个测绘采样点对应的测绘照片集合,将测绘照片集合中的多张照片进行照片组合和/或拼接,从而得到与测绘区域对应的测绘地图,提出了一种新的测绘地图获取方法,使用多测绘采样点整体规划方式代替现有的平行线移动规划方式,解决现有无人机航测方法中存在的成本高且测绘效率低的问题,实现了降低测绘成本,并提高测绘效率的技术效果。
可选的,测绘照片集合拍摄模块720,设置为:在根据每个所述测绘采样点的地理位置信息,确定飞行至每个所述测绘采样点时,拍摄得到与每个所述测绘采样点分别对应的测绘照片构成所述测绘照片集合。
可选的,所述装置还包括:指令移动模块,设置为:接收所述控制终端发送的至少一项飞行控制指令,并根据所述飞行控制指令在空中进行设定方向,和/或设定距离的移动;指令悬停模块,设置为:根据所述控制终端发送的悬停指令,在当前所在位置进行悬停;地理位置坐标反馈模块,设置为:根据所述控制终端发送的位置查询信息,将当前所在位置的地理位置坐标反馈至所述控制终端,其中,所述地理位置坐标设置为:所述控制终端确定参考拍照位置点。
可选的,所述测绘参数还包括:飞行高度;测绘照片集合拍摄模块720,设置为:根据所述测绘参数以所述飞行高度在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合。
可选的,测绘地图生成模块730,设置为:在所述测绘照片集合中获取在至少一个中心拍摄点拍摄的中心测绘照片,以及在每个中心拍摄点关联的多个周围拍摄点拍摄的周围测绘照片;根据每个周围测绘照片与对应中心测绘照片之间的照片重叠度,将每个中心测绘照片与对应的周围测绘照片拼接为组合拍摄照片;根据与每个中心拍摄点对应的组合拍摄照片,得到与所述测绘区域对应的测绘地图。
可选的,所述测绘区域的测绘地图包括下述至少一项:所述测绘区域的数字表面模型、所述测绘区域的三维地图以及所述测绘区域的平面地图。
可选的,所述装置还包括:测绘地图发送模块,设置为:将所述测绘区域对应的测绘地图发送至所述控制终端,和/或地面终端。
上述测绘无人机侧测绘装置可执行本公开任意实施例所提供的测绘无人机侧测绘方法,具备执行方法相应的功能模块和有益效果。未在本实施例中详尽描述的技术细节,可参见本公开任意实施例提供的测绘无人机侧测绘方法。
实施例八
图8为本公开实施例八提供的一种控制终端的结构示意图。图8示出了适于用来实现本公开实施方式的控制终端612的框图。图8显示的控制终端612仅仅是一个示例,不应对本公开实施例的功能和使用范围带来任何限制。
如图8所示,控制终端612以通用计算设备的形式表现。控制终端612的组件可以包括但不限于:一个或者多个处理器616,存储装置628,连接不同系统组件(包括存储装置628和处理器616)的总线618。
总线618表示几类总线结构中的一种或多种,包括存储器总线或者存储器控制器,外围总线,图形加速端口,处理器或者使用多种总线结构中的任意总线结构的局域总线。举例来说,这些体系结构包括但不限于工业标准体系结构(Industry Standard Architecture,ISA)总线,微通道体系结构(Micro Channel Architecture,MCA)总线,增强型ISA总线、视频电子标准协会(Video Electronics Standards Association,VESA)局域总线以及外围组件互连(Peripheral Component Interconnect,PCI)总线。
控制终端612典型地包括多种计算机系统可读介质。这些介质可以是任何能够被控制终端612访问的可用介质,包括易失性和非易失性介质,可移动的和不可移动的介质。
存储装置628可以包括易失性存储器形式的计算机系统可读介质,例如随机存取存储器(Random Access Memory,RAM)630和/或高速缓存存储器632。控制终端612可以进一步包括其它可移动/不可移动的、易失性/非易失性计算机系统存储介质。仅作为举例,存储系统634可以设置为:读写不可移动的、非易失性磁介质(图8未显示,通常称为“硬盘驱动器”)。尽管图8中未示出,可以提供设置为:对可移动非易失性磁盘(例如“软盘”)读写的磁盘驱动器,以及对可移动非易失性光盘(例如只读光盘(Compact Disc-Read Only Memory,CD-ROM)、数字视盘(Digital Video Disc-Read Only Memory,DVD-ROM)或者其它光介质)读写的光盘驱动器。在这些情况下,每个驱动器可以通过一个或者多个数据介质接口与总线618相连。存储装置628可以包括至少一个程序产品,该程序产品具有一组(例如至少一个)程序模块,这些程序模块被配置以执行本公开每个实施例的功能。
具有一组(至少一个)程序模块626的程序636,可以存储在例如存储装置628中,这样的程序模块626包括但不限于操作系统、一个或者多个应用程序、其它程序模块以及程序数据,这些示例中的每一个或某种组合中可能包括网络环境的实现。程序模块626通常执行本公开所描述的实施例中的功能和/或方法。
控制终端612也可以与一个或多个外部设备614(例如键盘、指向设备、摄像头、显示器624等)通信,还可与一个或者多个使得用户能与该控制终端612交互的设备通信,和/或与使得该控制终端612能与一个或多个其它计算设备进行通信的任何设备(例如网卡,调制解调器等等)通信。这种通信可以通过输入/输出(Input/Output,I/O)接口622进行。并且,控制终端612还可以通过网络适配器620与一个或者多个网络(例如局域网(Local Area Network,LAN)、广域网(Wide Area Network,WAN)和/或公共网络(例如因特网))通信。如图所示,网络适配器620通过总线618与控制终端612的其它模块通信。应当明白,尽管图中未示出,可以结合控制终端612使用其它硬件和/或软件模块,包括但不限于:微代码、设备驱动器、冗余处理单元、外部磁盘驱动阵列、磁盘阵列(Redundant Arrays of Independent Disks,RAID)系统、磁带驱动器以及数据备份存储系统等。
处理器616通过运行存储在存储装置628中的程序,从而执行各种功能应用以及数据处理,例如实现本公开上述实施例所提供的控制终端侧测绘方法。
也即,所述处理单元执行所述程序时实现:确定与测绘区域匹配的测绘参数,其中,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;将所述测绘参数发送至所述测绘无人机。
实施例九
本公开实施例九提供一种用于执行本公开任一实施例所提供的测绘无人机侧测绘方法的测绘无人机,该测绘无人机包括:一个或多个处理器;存储装置,设置为:存储一个或多个程序;当所述一个或多个程序被所述一个或多个处理器执行,使得所述一个或多个处理器实现如本公开任一实施例所提供的测绘无人机侧测绘方法:接收控制终端发送的测绘参数,其中,所述测绘参数为所述控制终端根据所述测绘区域确定的,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合;将所述测绘照片集合中的多张照片进行照片组合和/或拼接,得到与所述测绘区域对应的测绘地图。其结构以及细节内容可参照图8和实施例八。
实施例十
本公开实施例十还提供一种存储计算机程序的计算机存储介质,所述计算机程序在由计算机处理器执行时用于执行本公开上述实施例任一所述的控制终端侧测绘方法:确定与测绘区域匹配的测绘参数,其中,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;将所述测绘参数发送至所述测绘无人机。或者所述计算机程序在由计算机处理器执行时用于执行本公开上述实施例任一所述的测绘无人机侧测绘方法:接收控制终端发送的测绘参数,其中,所述测绘参数为所述控制终端根据所述测绘区域确定的,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合;将所述测绘照片集合中的多张照片进行照片组合和/或拼接,得到与所述测绘区域对应的测绘地图。
本公开实施例的计算机存储介质,可以采用一个或多个计算机可读的介质的任意组合。计算机可读介质可以是计算机可读信号介质或者计算机可读存储介质。计算机可读存储介质例如可以是——但不限于——电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的例子(非穷举的列表)包括:具有一个或多个导线的电连接、便携式计算机磁盘、硬盘、随机存取存储器(RAM)、只读存储器(Read Only Memory,ROM)、可擦式可编程只读存储器(Erasable Programmable Read Only Memory,EPROM)或闪存、光纤、便携式紧凑磁盘只读存储器(CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。在本文件中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。
计算机可读的信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读的信号介质还可以是计算机可读存储介质以外的任何计算机可读介质,该计算机可读介质可以发送、传播或者传输用于由指令执行系统、装置或者器件使用或者与其结合使用的程序。
计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括——但不限于无线、电线、光缆、射频(Radio Frequency,RF)等等,或者上述的任意合适的组合。
可以以一种或多种程序设计语言或其组合来编写用于执行本公开操作的计算机程序代码,所述程序设计语言包括面向对象的程序设计语言,诸如Java、Smalltalk、C++,还包括常规的过程式程序设计语言,诸如"C"语言或类似的程序设计语言。程序代码可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络——包括局域网(LAN)或广域网(WAN)——连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。
工业实用性
本公开实施例提出了一种新的测绘系统和测绘方法,使用基于新的测绘系统的多测绘采样点整体规划方式代替现有的平行线移动规划方式,解决现有无人机航测方法中存在的成本高且测绘效率低的问题,实现了降低测绘成本,并提高测绘效率的技术效果。

Claims (37)

  1. 一种测绘系统,包括:控制终端以及测绘无人机,其中:
    所述控制终端,设置为:确定与测绘区域匹配的测绘参数,并将所述测绘参数发送至所述测绘无人机,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;
    所述测绘无人机,设置为:接收所述测绘参数,并根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合,将所述测绘照片集合中的多张照片进行下述至少一项:照片组合、照片拼接,得到与所述测绘区域对应的测绘地图。
  2. 根据权利要求1所述的系统,其中:
    所述测绘无人机还设置为:根据与所述测绘区域对应的测绘地图,生成与所述测绘区域对应的地图瓦片数据。
  3. 根据权利要求2所述的系统,还包括:作业无人机;
    所述控制终端还设置为:将所述测绘区域作为作业区域,从所述测绘无人机获取与所述作业区域对应的地图瓦片数据,并根据所述地图瓦片数据生成所述作业区域的区域地图进行显示,根据用户针对所述区域地图选择的至少一个区域定位点,在所述作业区域内确定至少一个作业地块,并生成与所述作业地块对应的作业航线发送至所述作业无人机;
    所述作业无人机,设置为:接收所述作业航线,并根据所述作业航线在所述至少一个作业地块中进行飞行作业。
  4. 一种控制终端侧测绘方法,应用于如权利要求1-3任一项所述的测绘系统中,其中,包括:
    确定与测绘区域匹配的测绘参数,其中,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;
    将所述测绘参数发送至所述测绘无人机。
  5. 根据权利要求4所述的方法,其中,确定与测绘区域匹配的测绘参数,包括:
    获取与所述测绘区域对应的参考拍照位置点,并将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系;
    根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系以及所述映射关系,确定与所述参考拍照位置点对应的多个辅助拍照位置点;
    将所述参考拍照位置点以及所述多个辅助拍照位置点作为所述测绘无人机在所述测绘区域中测绘的多个测绘采样点。
  6. 根据权利要求4所述的方法,其中,确定与测绘区域匹配的测绘参数,包括:
    根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在所述测绘区域内确定一个或多个测绘组合拍摄区域;
    根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,在所述测绘组合拍摄区域中确定多个拍照位置点;
    将所述多个拍照位置点作为测绘无人机在所述测绘区域中测绘的多个测绘采样点。
  7. 根据权利要求6所述的方法,还包括以下至少一项:按照所述组合拍摄点集内的多个拍摄点所拍摄的多张照片之间具有重叠区域;以及
    在所述测绘区域内确定的多个测绘组合拍摄区域之间具有重叠区域;
    其中,所述测绘组合拍摄区域为按照所述组合拍摄点集内的多个拍摄点拍摄多张照片后,将所述多张照片进行下述至少一项:组合、拼接,形成拍摄区域;将每个所述测绘组合拍摄区域进行下述至少一项:组合、拼接,形成所述测绘区域的测绘地图。
  8. 根据权利要求5或6所述的方法,其中,所述组合拍摄点集内的拍摄点包括:中心拍摄点以及四个周围拍摄点,所述周围拍摄点为以所述中心拍摄点为中心的矩形的四个顶点;
    其中,根据所述组合拍摄点集内的每个拍摄点所拍摄得到的合成照片的形状为矩形。
  9. 根据权利要求5所述的方法,其中,获取与测绘区域对应的参考拍照位置点,包括:
    检测用户在人机交互界面中的触摸操作,并根据所述触摸操作确定一个屏幕位置点;
    在所述人机交互界面中当前显示的测绘区域的地图数据中获取与所述屏幕位置点匹配的一个地理位置坐标作为所述参考位置点。
  10. 根据权利要求9所述的方法,其中,检测用户在人机交互界面中的触摸操作,并根据所述触摸操作确定一个屏幕位置点,包括下述至少一项:
    如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的触摸点确定为所述屏幕位置点;
    如果检测到所述用户的触摸操作为划线触摸操作,则在所述用户触摸生成的线段上选取一点作为所述屏幕位置点;以及
    如果检测到所述用户的触摸操作为画框触摸操作,则在所述用户触摸生成的框体内部选取一点作为所述屏幕位置点。
  11. 根据权利要求5所述的方法,其中,获取与测绘区域对应的参考拍照位置点,包括:
    获取所述测绘区域的中心点作为所述参考拍照位置点。
  12. 根据权利要求5所述的方法,其中,获取与测绘区域对应的参考拍照位置点,包括:
    向所述测绘无人机发送位置查询信息,将所述测绘无人机反馈的地理位置坐标作为所述参考拍照位置点;
    其中,所述测绘无人机预先设置于与所述测绘区域匹配的位置处。
  13. 根据权利要求12所述的方法,在向所述测绘无人机发送位置查询信息之前,还包括:
    接收用户输入的针对所述测绘无人机的至少一项飞行控制指令,并将所述飞行控制指令发送至所述测绘无人机;
    当确认接收到所述用户输入的位置确认响应时,向所述测绘无人机发送悬停指令,以控制所述测绘无人机在当前位置悬停;
    其中,所述飞行控制指令设置为:控制所述测绘无人机在空中进行下述至少一项:设定方向的移动、设定距离的移动。
  14. 根据权利要求5所述的方法,其中,获取与测绘区域对应的参考拍照位置点,包括:
    获取用户输入的地理位置坐标作为所述参考拍照位置点。
  15. 根据权利要求5所述的方法,其中,将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系,包括:
    将用户在所述组合拍摄点集内选择的一个拍摄点,与所述参考拍照位置点建立映射关系。
  16. 根据权利要求8所述的方法,其中,将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系,包括:
    将所述组合拍摄点集内的所述中心拍摄点,与所述参考拍照位置点建立映射关系。
  17. 根据权利要求8所述的方法,其中,将组合拍摄点集内的一个拍摄点与所述参考拍照位置点建立映射关系,包括:
    计算所述参考拍照位置点与所述测绘区域每个定位关键点之间的距离,所述定位关键点包括:所述测绘区域的角点以及所述测绘区域的中心点;
    获取距离所述参考拍照位置点最近的一个定位关键点作为目标参考点;
    根据所述目标参考点在所述测绘区域内的位置信息,在所述组合拍摄点集内选择与所述位置信息匹配的一个拍摄点与所述参考拍照位置点建立映射关系。
  18. 根据权利要求6所述的方法,其中,根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在测绘区域内确定一个或多个测绘组合拍摄区域,包括:
    在所述测绘区域内选择一个定位点;
    根据所述定位点以及所述组合拍摄区域,在所述测绘区域内确定一个测绘组合拍摄区域;
    如果所述测绘组合拍摄区域不能对所述测绘区域完整覆盖,则在所述测绘区域内选择新的定位点,并返回执行根据所述定位点以及所述组合拍摄区域,在所述测绘区域内确定一个测绘组合拍摄区域的操作,直至确定出能够完整覆盖所述测绘区域的全部测绘组合拍摄区域。
  19. 根据权利要求8所述的方法,其中,根据所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,在所述测绘组合拍摄区域中确定多个拍照位置点,包括:
    将所述组合拍摄点集内的中心拍摄点映射至所述测绘组合拍摄区域的区域中点,并将所述区域中点作为一个拍照位置点;
    根据所述组合拍摄点集内的每个周围拍摄点与所述中心拍摄点之间预设的相对位置关系,将每个所述周围拍摄点分别映射至所述测绘组合拍摄区域中,并将形成的多个映射点作为所述拍照位置点。
  20. 根据权利要求6所述的方法,在根据与组合拍摄点集对应的组合拍摄区域以及测绘区域信息,在测绘区域内确定一个或多个测绘组合拍摄区域之前,还包括:
    检测用户在人机交互界面中的触摸操作,并获取与所述触摸操作匹配的屏幕选择区域;
    在所述人机交互界面中当前显示的地图数据中,获取与所述屏幕选择区域匹配的地理位置区域作为所述测绘区域信息。
  21. 根据权利要求20所述的方法,其中,检测用户在人机交互界面中的触摸操作,并获取与所述触摸操作匹配的屏幕选择区域,包括下述至少一项:
    如果检测到所述用户的触摸操作为单点触摸操作,则将所述用户的至少三个触摸点的连线所围成的封闭区域确定为所述屏幕选择区域;以及
    如果检测到所述用户的触摸操作为画框触摸操作,则将所述用户触摸生成的框体作为所述屏幕选择区域。
  22. 根据权利要求8所述的方法,在将所述测绘参数发送至所述测绘无人机之前,还包括:
    获取所述测绘无人机所携带的拍照设备的拍摄参数,所述拍摄参数包括所述测绘无人机在设定飞行高度下的单照片拍摄区域,每个拍摄点都对应一个单照片拍摄区域;
    根据预设的照片重叠度指标以及所述单照片拍摄区域,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系;
    所述测绘参数还包括:所述飞行高度,所述飞行高度设置为:指示所述测绘无人机以所述飞行高度在所述测绘区域中进行飞行拍摄。
  23. 根据权利要求22所述的方法,其中,根据预设的照片重叠度指标以及所述单照片拍摄区域,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系,包括:
    根据所述拍照设备的画幅大小以及所述拍照设备的像元宽度,确定单照片尺寸;
    构建二维坐标系,并在所述二维坐标系中选择目标点作为中心拍摄点;
    根据所述中心拍摄点以及所述单照片尺寸,在所述二维坐标系中生成中心照片;
    在所述中心照片的左上角、左下角、右上角以及右下角,分别生成与所述中心照片满足所述照片重叠度指标的四张周围照片;
    根据所述单照片尺寸与所述单照片拍摄区域之间的映射关系,确定与每个所述周围照片对应的周围拍摄点在所述二维坐标系中的坐标值;
    根据所述中心拍摄点以及每个所述周围拍摄点在所述二维坐标系中的坐标值,确定所述组合拍摄点集内每个拍摄点之间预设的相对位置关系。
  24. 根据权利要求22所述的方法,在获取测绘无人机所携带的拍照设备的拍摄参数之前,还包括:
    根据所述拍照设备的像元宽度、所述拍照设备的镜头焦距以及地面像元分辨率,计算所述设定飞行高度。
  25. 根据权利要求22所述的方法,其中,获取测绘无人机所携带的拍照设备的拍摄参数,包括:
    根据所述拍照设备的像元宽度、所述拍摄设备的画幅区域以及地面像元分辨率,计算所述测绘无人机在所述设定飞行高度下的单照片拍摄区域。
  26. 一种测绘无人机侧测绘方法,应用于如权利要求1-3任一项所述的测绘系统中,包括:
    接收控制终端发送的测绘参数,其中,所述测绘参数为所述控制终端根据所述测绘区域确定的,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;
    根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合;
    将所述测绘照片集合中的多张照片进行下述至少一项:照片组合、照片拼接,得到与所述测绘区域对应的测绘地图。
  27. 根据权利要求26所述的方法,其中,根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合,包括:
    当根据每个所述测绘采样点的地理位置信息,确定飞行至每个所述测绘采样点时,拍摄得到与每个所述测绘采样点分别对应的测绘照片构成所述测绘照片集合。
  28. 根据权利要求26所述的方法,在接收控制终端发送的测绘参数之前,还包括:
    接收所述控制终端发送的至少一项飞行控制指令,并根据所述飞行控制指令在空中进行下述至少一项:设定方向的移动、设定距离的移动;
    根据所述控制终端发送的悬停指令,在当前所在位置进行悬停;
    根据所述控制终端发送的位置查询信息,将当前所在位置的地理位置坐标反馈至所述控制终端,其中,所述地理位置坐标设置为:所述控制终端确定参考拍照位置点。
  29. 根据权利要求26所述的方法,所述测绘参数还包括:飞行高度;
    根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合,包括:
    根据所述测绘参数以所述飞行高度在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合。
  30. 根据权利要求26-29任一项所述的方法,其中,将所述测绘照片集合中的多张照片进行下述至少一项:照片组合、照片拼接,得到与所述测绘区域对应的测绘地图,包括:
    在所述测绘照片集合中获取在至少一个中心拍摄点拍摄的中心测绘照片,以及在每个中心拍摄点关联的多个周围拍摄点拍摄的周围测绘照片;
    根据每个周围测绘照片与对应中心测绘照片之间的照片重叠度,将每个中心测绘照片与对应的周围测绘照片拼接为组合拍摄照片;
    根据与每个中心拍摄点对应的组合拍摄照片,得到与所述测绘区域对应的测绘地图。
  31. 根据权利要求26-29任一项所述的方法,其中,所述测绘区域的测绘地图包括下述至少一项:
    所述测绘区域的数字表面模型、所述测绘区域的三维地图以及所述测绘区域的平面地图。
  32. 根据权利要求26-29任一项所述的方法,在将所述测绘照片集合中的多张照片进行下述至少一项:照片组合、照片拼接,得到与所述测绘区域对应的测绘地图之后,还包括:
    将所述测绘区域对应的测绘地图发送至下述至少一项:所述控制终端、地面终端。
  33. 一种控制终端侧测绘装置,应用于如权利要求1-3任一项所述的测绘系统中,包括:
    测绘参数确定模块,设置为:确定与测绘区域匹配的测绘参数,其中,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;
    测绘参数发送模块,设置为:将所述测绘参数发送至所述测绘无人机。
  34. 一种测绘无人机侧测绘装置,应用于如权利要求1-3任一项所述的测绘系统中,包括:
    测绘参数接收模块,设置为:接收控制终端发送的测绘参数,其中,所述测绘参数为所述控制终端根据所述测绘区域确定的,所述测绘参数包括:所述测绘无人机在所述测绘区域中测绘的多个测绘采样点;
    测绘照片集合拍摄模块,设置为:根据所述测绘参数在所述测绘区域中进行飞行拍摄,得到与所述多个测绘采样点对应的测绘照片集合;
    测绘地图生成模块,设置为:将所述测绘照片集合中的多张照片进行下述至少一项:照片组合、照片拼接,得到与所述测绘区域对应的测绘地图。
  35. 一种控制终端,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时实现如权利要求4-25中任一所述的方法。
  36. 一种测绘无人机,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时实现如权利要求26-32中任一所述的方法。
  37. 一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现如权利要求4-25中任一所述的控制终端侧测绘方法,或者实现如权利要求26-32中任一所述的测绘无人机侧测绘方法。
PCT/CN2018/116660 2018-11-21 2018-11-21 一种测绘系统、测绘方法、装置、设备及介质 WO2020103023A1 (zh)

Priority Applications (7)

Application Number Priority Date Filing Date Title
EP18941071.5A EP3885702A4 (en) 2018-11-21 2018-11-21 SURVEYING AND MAPPING SYSTEM, METHOD, APPARATUS, DEVICE AND SUPPORT
CN201880091778.4A CN112469967B (zh) 2018-11-21 2018-11-21 测绘系统、测绘方法、装置、设备及存储介质
AU2018449839A AU2018449839B2 (en) 2018-11-21 2018-11-21 Surveying and mapping method and device
JP2021527154A JP7182710B2 (ja) 2018-11-21 2018-11-21 測量方法、装置及びデバイス
KR1020217016658A KR20210105345A (ko) 2018-11-21 2018-11-21 측량 및 매핑 방법, 장치 및 기기
CA3120727A CA3120727A1 (en) 2018-11-21 2018-11-21 Surveying and mapping system, surveying and mapping method and device, apparatus and medium
PCT/CN2018/116660 WO2020103023A1 (zh) 2018-11-21 2018-11-21 一种测绘系统、测绘方法、装置、设备及介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/116660 WO2020103023A1 (zh) 2018-11-21 2018-11-21 一种测绘系统、测绘方法、装置、设备及介质

Publications (1)

Publication Number Publication Date
WO2020103023A1 true WO2020103023A1 (zh) 2020-05-28

Family

ID=70773259

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/116660 WO2020103023A1 (zh) 2018-11-21 2018-11-21 一种测绘系统、测绘方法、装置、设备及介质

Country Status (7)

Country Link
EP (1) EP3885702A4 (zh)
JP (1) JP7182710B2 (zh)
KR (1) KR20210105345A (zh)
CN (1) CN112469967B (zh)
AU (1) AU2018449839B2 (zh)
CA (1) CA3120727A1 (zh)
WO (1) WO2020103023A1 (zh)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114877872B (zh) * 2022-07-01 2022-10-14 北京今日蓝天科技有限公司 无人机及其操作系统、生成测绘图的方法、介质、设备
CN116088584B (zh) * 2023-04-07 2023-07-18 山东省地质矿产勘查开发局第五地质大队(山东省第五地质矿产勘查院) 一种测绘协同作业方法、系统及一种电子设备

Citations (7)

Publication number Priority date Publication date Assignee Title
CN101082488A (zh) * 2007-06-30 2007-12-05 徐春云 异地遥测之图像拼接方法
KR101236992B1 (ko) * 2012-11-05 2013-02-26 명화지리정보(주) 기준점 데이터가 적용된 수치 영상이미지의 실시간 업데이팅 영상처리시스템
EP2639547A2 (en) * 2012-03-12 2013-09-18 Aisin Aw Co., Ltd. Picture data provision system, picture data provision method, and computer program product
CN104567815A (zh) * 2014-12-26 2015-04-29 北京航天控制仪器研究所 一种基于图像匹配的无人机载光电稳定平台自动侦察系统
CN105225241A (zh) * 2015-09-25 2016-01-06 广州极飞电子科技有限公司 无人机深度图像的获取方法及无人机
CN105352481A (zh) * 2015-10-23 2016-02-24 武汉苍穹电子仪器有限公司 高精度无人机影像无控制点测绘成图方法及系统
CN108474657A (zh) * 2017-03-31 2018-08-31 深圳市大疆创新科技有限公司 一种环境信息采集方法、地面站及飞行器

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
JP3364257B2 (ja) * 1993-01-27 2003-01-08 朝日航洋株式会社 Aerial photography method
JPH1054719A (ja) * 1996-08-09 1998-02-24 A Tec:Kk Method for taking aerial survey photographs and shooting point marks used therefor
JP4414524B2 (ja) * 1999-11-17 2010-02-10 アジア航測株式会社 Aerial photography planning simulation method
CN102495522A (zh) * 2011-12-01 2012-06-13 天津曙光敬业科技有限公司 Production method for a 360° aerial panoramic interactive roaming system based on unmanned-helicopter aerial photography
JP6055274B2 (ja) * 2012-10-31 2016-12-27 株式会社トプコン Aerial photograph measurement method and aerial photograph measurement system
CN103338333B (zh) * 2013-07-17 2016-04-13 中测新图(北京)遥感技术有限责任公司 Method for optimal configuration of the orientation elements of an aerial camera
WO2017073310A1 (ja) * 2015-10-27 2017-05-04 三菱電機株式会社 Image capturing system for measuring the shape of a structure, method for capturing an image of a structure used for shape measurement, on-board control device, remote control device, program, and recording medium
US20170221241A1 (en) * 2016-01-28 2017-08-03 8681384 Canada Inc. System, method and apparatus for generating building maps
JP2018077626A (ja) * 2016-11-08 2018-05-17 Necソリューションイノベータ株式会社 Flight control device, flight control method, and program
CN106444841B (zh) * 2016-11-15 2019-04-26 航天图景(北京)科技有限公司 Route planning method based on a multi-rotor UAV oblique photography system
CN108846004A (zh) * 2018-04-20 2018-11-20 曜宇航空科技(上海)有限公司 Method and system for updating selected targets in a map based on an unmanned aerial vehicle

Non-Patent Citations (1)

Title
See also references of EP3885702A4 *

Cited By (4)

Publication number Priority date Publication date Assignee Title
CN113865557A (zh) * 2021-09-08 2021-12-31 诚邦测绘信息科技(浙江)有限公司 Mountain environment detection method and system for surveying and mapping, storage medium, and intelligent terminal
CN113865557B (zh) * 2021-09-08 2024-01-16 诚邦测绘信息科技(浙江)有限公司 Mountain environment detection method and system for surveying and mapping, storage medium, and intelligent terminal
CN114838710A (zh) * 2022-03-29 2022-08-02 中国一冶集团有限公司 Rapid engineering surveying and mapping method and system based on UAV photography
CN114838710B (zh) * 2022-03-29 2023-08-29 中国一冶集团有限公司 Rapid engineering surveying and mapping method and system based on UAV photography

Also Published As

Publication number Publication date
CN112469967B (zh) 2023-12-26
KR20210105345A (ko) 2021-08-26
CN112469967A (zh) 2021-03-09
JP7182710B2 (ja) 2022-12-02
EP3885702A1 (en) 2021-09-29
CA3120727A1 (en) 2020-05-28
AU2018449839A1 (en) 2021-06-24
JP2022507715A (ja) 2022-01-18
EP3885702A4 (en) 2021-12-01
AU2018449839B2 (en) 2023-02-23

Similar Documents

Publication Publication Date Title
WO2020103022A1 (zh) Surveying and mapping system, surveying and mapping method, apparatus, device, and medium
US11346665B2 (en) Method and apparatus for planning sample points for surveying and mapping, control terminal, and storage medium
WO2020103023A1 (zh) Surveying and mapping system, surveying and mapping method, apparatus, device, and medium
WO2020103019A1 (zh) Method and apparatus for planning surveying and mapping sample points, control terminal, and storage medium
WO2020103021A1 (zh) Method and apparatus for planning surveying and mapping sample points, control terminal, and storage medium
WO2020103024A1 (zh) Operation control system, operation control method, apparatus, device, and medium

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 18941071; Country of ref document: EP; Kind code of ref document: A1

ENP Entry into the national phase
Ref document number: 2021527154; Country of ref document: JP; Kind code of ref document: A

ENP Entry into the national phase
Ref document number: 3120727; Country of ref document: CA

NENP Non-entry into the national phase
Ref country code: DE

ENP Entry into the national phase
Ref document number: 2018449839; Country of ref document: AU; Date of ref document: 20181121; Kind code of ref document: A

ENP Entry into the national phase
Ref document number: 2018941071; Country of ref document: EP; Effective date: 20210621