WO2018120350A1 - Method and device for positioning unmanned aerial vehicle - Google Patents

Method and device for positioning unmanned aerial vehicle

Info

Publication number
WO2018120350A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
image
current
reference image
feature point
Prior art date
Application number
PCT/CN2017/072477
Other languages
French (fr)
Chinese (zh)
Inventor
雷志辉
卞一杰
杨凯斌
贾宁
Original Assignee
深圳市道通智能航空技术有限公司
湖南省道通科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市道通智能航空技术有限公司, 湖南省道通科技有限公司
Publication of WO2018120350A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/102 Simultaneous control of position or course in three dimensions specially adapted for aircraft specially adapted for vertical take-off of aircraft
    • G05D1/12 Target-seeking control

Definitions

  • the invention relates to the field of UAV control, and in particular to a method and a device for positioning a UAV.
  • UAVs have broad applications in disaster prevention and rescue, scientific investigations, etc., and flight control systems are an important part of drones, playing an important role in the intelligent and practical use of drones.
  • usually, after completing its job at the destination under the control of the flight control system, the drone can automatically return along the original route.
  • in order to position the drone during automatic return, in the prior art, map data provided by a third party is usually stored in the flight system of the drone, and positioning and navigation of the drone are then realized through a positioning device such as the Global Positioning System (GPS).
  • however, the resolution of the map data provided by the third party is related to the height of the drone above the ground: generally, the higher the drone flies above the ground, the lower the resolution. Since the flying height of the drone changes frequently during operation, the resolution of the ground target can easily differ greatly from that of the map data, the matching accuracy is low, and the positioning accuracy during the return flight is therefore poor.
  • the technical problem to be solved by the present invention is how to improve the positioning accuracy.
  • an embodiment of the present invention discloses a method for positioning a drone, including:
  • during the flight of the drone, a reference image is generated; the current image collected at the current time is acquired; and the current position of the drone is determined according to the reference image and the current image.
  • generating a reference image includes: collecting a ground image during the flight of the drone; and splicing the ground image to obtain a reference image.
  • collecting the ground image during the flight of the drone includes: collecting the ground image while the drone flies from the starting position to the returning position.
  • the method for positioning the UAV disclosed in this embodiment further includes: determining the return flight.
  • the method for positioning the UAV disclosed in this embodiment further includes: receiving an instruction sent by the controller for indicating the return flight.
  • the method for positioning the UAV disclosed in this embodiment further includes: determining a reverse trajectory of the UAV flying from the starting position to the returning position.
  • the method for positioning the unmanned aerial vehicle disclosed in this embodiment further includes: flying from the returning position to the starting position according to the reverse trajectory.
  • determining a current location of the drone according to the reference image and the current image includes: matching the current image with the reference image to obtain a motion vector of the drone at the current time relative to the reference image; and determining, according to the motion vector, positioning information of the drone relative to the reference image at the current time; wherein the positioning information includes at least one of: the position of the drone, the height of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone and the heading of the drone.
  • matching the current image with the reference image to obtain the motion vector of the drone relative to the reference image at the current time includes: performing scene matching between the current image and the reference image to obtain the motion vector of the drone relative to the reference image at the current time.
  • performing scene matching between the current image and the reference image to obtain the motion vector of the drone relative to the reference image at the current time includes: selecting feature points of the reference image, wherein the selected feature points are used as reference feature points; determining feature points in the current image that match the reference feature points, wherein the matched feature points are used as current feature points; and matching the current feature points with the reference feature points to obtain the motion vector of the drone relative to the reference image at the current time.
  • an apparatus for positioning a drone including:
  • a reference module for generating a reference image during the flight of the drone; an acquisition module for acquiring the current image collected at the current time; and a positioning module for determining the current location of the drone according to the reference image generated by the reference module and the current image acquired by the acquisition module.
  • the reference module includes: a sampling unit, configured to collect a ground image during the flight of the drone; and a splicing unit, configured to splice the ground image collected by the sampling unit to obtain a reference image.
  • the sampling unit is specifically configured to collect a ground image during the flight of the drone from the starting position to the returning position.
  • the apparatus for positioning the UAV disclosed in this embodiment further includes: a determining module, used to determine the return flight.
  • the determining module is further configured to receive an instruction sent by the controller to indicate a return flight.
  • the apparatus for positioning the UAV further includes: a trajectory module, configured to determine, after the reference module generates the reference image, a reverse trajectory of the UAV flying from the starting position to the returning position.
  • the device for positioning the UAV disclosed in this embodiment further includes: a returning module, configured to fly from the return position to the starting position according to the reverse trajectory determined by the trajectory module.
  • the positioning module includes: a matching unit, configured to match the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time; and a positioning information unit, configured to determine, according to the motion vector, positioning information of the drone relative to the reference image at the current time; wherein the positioning information includes at least one of: the position of the drone, the height of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone and the heading of the drone.
  • the matching unit is specifically configured to perform scene matching on the current image and the reference image to obtain a motion vector of the drone relative to the reference image at the current time.
  • the matching unit includes: a selecting subunit for selecting feature points of the reference image, wherein the selected feature points are used as reference feature points; a feature point determining subunit for determining feature points in the current image that match the reference feature points, wherein the matched feature points are used as current feature points; and a vector subunit for matching the current feature points with the reference feature points to obtain the motion vector of the drone relative to the reference image at the current time.
  • the method and device for positioning a drone provided by the embodiments of the present invention generate a reference image during the flight of the drone, and the reference image can reflect the latest ground situation; the current image collected at the current time is then acquired. Since both the reference image and the current image are acquired during the flight of the drone, there is a certain correlation between them, and the current position of the drone can therefore be determined according to the reference image and the current image. In the solution of the embodiments of the present invention, the reference image is generated during the flight of the drone and the current image is also collected during the flight, so the generated reference image can dynamically compensate for the differences in resolution produced during the flight. Compared with the fixed resolution of the prior art, dynamic matching can be better achieved during the return of the drone, which reduces system errors and thereby improves the positioning accuracy of the return flight.
  • as an optional technical solution, after the reference image is generated, the reverse trajectory of the drone's flight from the starting position to the returning position is determined, so that when flying from the returning position to the starting position the drone can directly follow this reverse trajectory.
  • this reduces the amount of return-path planning and improves the efficiency of determining the flight path when returning.
  • in addition, when the drone encounters no signal or a communication failure, flying from the returning position to the starting position along the reverse trajectory enables the drone to smoothly return to the starting position.
  • in addition, when flying from the starting position to the returning position, the drone usually plans a relatively good outbound trajectory, for example one that bypasses obstacles; therefore, flying from the returning position to the starting position along the reverse of the outbound trajectory allows the drone to return along a relatively good trajectory.
  • FIG. 1 is a flow chart of a method for positioning a drone according to an embodiment of the present invention
  • FIG. 2 is a flowchart of a method for matching a scene to obtain a motion vector of a drone according to an embodiment of the present invention
  • FIG. 3 is a schematic structural diagram of an apparatus for positioning a drone according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of a system for positioning a drone according to an embodiment of the present invention.
  • in the description of the present invention, it should be noted that, unless otherwise explicitly specified and defined, the terms "installation", "connected" and "connection" are to be understood broadly; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection, an indirect connection through an intermediate medium, or internal communication between two components; and it may be a wireless connection or a wired connection.
  • the specific meaning of the above terms in the present invention can be understood in a specific case by those skilled in the art.
  • in order to improve the positioning accuracy of drone flight, this embodiment discloses a method for positioning an unmanned aerial vehicle. Please refer to FIG. 1, which is a flow chart of the method for positioning the unmanned aerial vehicle; the method includes:
  • step S101 a reference image is generated during the flight of the drone.
  • the ground image can be collected after the drone takes off from the starting position, and the ground image collected by the drone during the moving to the destination is spliced, and the stitched result is used as the reference image.
  • the so-called ground image refers to an image taken by the drone in a bird's eye view during a flight, and the angle between the overhead viewing direction and the vertical direction is less than 90 degrees.
  • preferably, the overhead viewing direction may be vertically downward, in which case the angle between the overhead viewing direction and the vertical direction is 0 degrees.
  • the drone can store the generated reference image for subsequent use of the reference image.
  • the reference image can also be sent to other drones, so that other drones can also use the reference image.
  • it should be noted that, for the same drone, since the hardware parameters of the drone do not change during the flight, the reference image generated by the drone itself can represent the outbound trajectory of that drone.
  • the so-called outbound trajectory refers to the flight trajectory of the drone from the starting position to the destination position.
  • Step S102 Acquire a current image collected at a current time.
  • the current image acquired at the current time can be acquired during the flight of the drone.
  • for example, during the return flight, in order to determine its position at the current time, the drone acquires the image collected at the current moment by the image acquisition device on the drone.
  • Step S103 determining the current position of the drone based on the reference image and the current image.
  • in this embodiment, after the reference image is obtained, the current image can be compared with the reference image to obtain the difference between them; the motion vector of the drone can be estimated from this difference, and the current position of the drone can thereby be determined.
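  • as one illustrative way (not prescribed by this embodiment) to compare the current image with the corresponding region of the reference image, the pixel offset between the two could be estimated with phase correlation; the sketch below uses OpenCV, and the surrounding logic is an assumption added for illustration:

```python
import cv2
import numpy as np

def estimate_offset(current_img, reference_patch):
    """Estimate the (dx, dy) pixel shift between the current image and a
    reference-image patch of the same size using phase correlation."""
    cur = np.float32(cv2.cvtColor(current_img, cv2.COLOR_BGR2GRAY))
    ref = np.float32(cv2.cvtColor(reference_patch, cv2.COLOR_BGR2GRAY))
    (dx, dy), response = cv2.phaseCorrelate(ref, cur)
    # The pixel shift still has to be converted to a ground displacement
    # using the camera parameters and the flight height; 'response' gives
    # a rough confidence for the match.
    return dx, dy, response
```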
  • the operation of generating the reference image may include: collecting a ground image during the flight of the drone; and splicing the ground image to obtain a reference image.
  • in a specific embodiment, the ground images may be collected at preset intervals; the preset interval may be a time interval preset in the time domain or a distance interval preset in position, determined according to prior knowledge.
  • the preset intervals may be equal intervals or non-equal intervals.
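  • a minimal sketch of such an interval-based capture trigger is given below; the time and distance thresholds are hypothetical values chosen for illustration, not values taken from the embodiment:

```python
def should_capture(now, last_capture_time, position, last_capture_position,
                   time_interval=2.0, distance_interval=10.0):
    """Trigger a new ground-image capture either when a preset time interval
    has elapsed or when a preset distance has been travelled (equal intervals
    are assumed here; unequal intervals would work the same way)."""
    dx = position[0] - last_capture_position[0]
    dy = position[1] - last_capture_position[1]
    travelled = (dx * dx + dy * dy) ** 0.5
    return (now - last_capture_time) >= time_interval or travelled >= distance_interval
```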
  • in the specific splicing process, the images may be spliced as a whole, or spliced in a segmented mode.
  • specifically, during splicing there is usually an overlapping area between adjacent frame images, and two adjacent frames with an overlapping portion can be merged into one large seamless image.
  • for two adjacent frames with an overlapping area, the overlapping area of one of the images may also be discarded directly, the remaining portion of that image stitched onto the other frame, and fusion performed in the seam region to obtain the mosaic image.
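  • a minimal sketch of this overlap-based splicing of two adjacent frames is given below, assuming OpenCV is available; the feature detector, matcher and the simplified gap-filling step are illustrative choices rather than the procedure prescribed by the embodiment:

```python
import cv2
import numpy as np

def splice_pair(mosaic, new_frame):
    """Register a new ground frame against the current mosaic via the
    overlapping area, then paste its non-overlapping part (simplified)."""
    orb = cv2.ORB_create(1000)
    kp_m, des_m = orb.detectAndCompute(mosaic, None)
    kp_n, des_n = orb.detectAndCompute(new_frame, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_n, des_m), key=lambda m: m.distance)[:100]
    src = np.float32([kp_n[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_m[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = mosaic.shape[:2]
    warped = cv2.warpPerspective(new_frame, H, (w, h))
    # Keep existing mosaic pixels and only fill empty areas; a full
    # implementation would also enlarge the canvas and blend the seam.
    empty = (mosaic == 0)
    mosaic[empty] = warped[empty]
    return mosaic
```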
  • the ground image is acquired during the flight of the drone from the starting position to the returning position.
  • the so-called starting position refers to the position where the drone starts to take off;
  • the so-called returning position refers to the position from which the drone starts to return to the starting position after taking off.
  • the returning position is the destination of the drone, but in the specific implementation process, the returning position may also be the location where the drone receives the returning instruction during the flight to the destination.
  • the return location can also be a location where the drone encounters a special situation in the process of flying to the destination to determine the need to return. For example, during the flight, there are sudden situations such as insufficient power, no GPS signal, and drone failure. At this time, the flight control system in the drone determines the return flight.
  • the method may further include:
  • Step S104: it is determined to return.
  • the drone may actively determine the return flight, for example, the drone encounters a special situation during the flight and needs to return. For example, when the drone is in a state of insufficient power, no GPS signal, or a drone failure during the flight, the flight control system in the drone determines the return flight. After flying to the destination to complete the task, the drone can also take the initiative to determine the need to return.
  • the controller may also control the drone to return. Specifically, the drone receives an instruction sent by the controller to indicate a return flight. After receiving the instruction, the drone determines to return.
  • the controller may be a remote control dedicated to the drone, or a terminal that remotely controls the drone, such as a mobile terminal, a computer, a notebook, or the like.
  • in order to provide a reference return trajectory for the drone when returning, optionally, after step S101 is performed, the method further includes:
  • Step S105 determining a reverse trajectory of the drone flying from the starting position to the returning position.
  • in this embodiment, for a given drone, since its physical parameters do not change during the flight, the images collected while flying from the starting position to the returning position can, after splicing, be used to determine the flight trajectory of the drone from the image attributes.
  • flying from the returning position back to the starting position along the outbound trajectory forms the reverse trajectory of the drone's flight from the starting position to the returning position.
  • when the drone returns, it can perform the returning operation based on this reverse trajectory.
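  • under the assumption that the drone records way-points (for example, geo-referenced image positions) on the outbound leg, the reverse trajectory can be illustrated with the following sketch; the Waypoint structure is hypothetical:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:
    x: float         # position in the reference-image / ground frame
    y: float
    altitude: float  # flight height at this point

def reverse_trajectory(outbound: List[Waypoint]) -> List[Waypoint]:
    """The return path revisits the outbound way-points in reverse order,
    from the returning position back to the starting position."""
    return list(reversed(outbound))
```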
  • optionally, step S103 may specifically include: matching the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time; and determining, according to the motion vector, the positioning information of the drone relative to the reference image at the current time.
  • the positioning information includes at least one of the following: the position of the drone, the height of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone, and the heading of the drone.
  • the azimuth of the UAV refers to the relative angle between the current image collected by the aircraft at the current time and the reference image.
  • the heading of the UAV refers to the actual flight direction of the UAV.
  • in a specific embodiment, when the current image is matched with the reference image to obtain the motion vector of the drone at the current time relative to the reference image, the current image and the reference image may be scene-matched to obtain that motion vector; specifically, please refer to FIG. 2. The method shown in FIG. 2 includes:
  • Step S201 selecting feature points of the reference image, and the selected feature points are used as reference feature points.
  • the reference feature points may be, for example, texture-rich object edge points.
  • the feature points can be described by mathematical methods such as gradient histograms and local random binary features.
  • Step S202 determining a feature point matching the reference feature point in the current image, and the feature point obtained by the matching is used as the current feature point.
  • the pixels in the current image can be described with the same kind of mathematical description, and the current feature points in the current image that match the reference feature points can then be determined by comparing these descriptions.
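  • a minimal sketch of this feature selection, description and matching step is given below, using ORB (a local binary descriptor) from OpenCV as one possible choice of descriptor; the parameter values are assumptions:

```python
import cv2
import numpy as np

def match_feature_points(reference_img, current_img, max_matches=200):
    """Select reference feature points, describe them with a local binary
    descriptor (ORB), and find the matching points in the current image."""
    orb = cv2.ORB_create(nfeatures=2000)
    ref_kp, ref_desc = orb.detectAndCompute(reference_img, None)
    cur_kp, cur_desc = orb.detectAndCompute(current_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(ref_desc, cur_desc),
                     key=lambda m: m.distance)[:max_matches]
    ref_pts = np.float32([ref_kp[m.queryIdx].pt for m in matches])   # (x, y)
    cur_pts = np.float32([cur_kp[m.trainIdx].pt for m in matches])   # (x', y')
    return ref_pts, cur_pts
```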
  • Step S203 matching the current feature point with the reference feature point to obtain a motion vector of the drone relative to the reference image at the current time.
  • the current feature points and the reference feature points can be matched by an affine transformation model or a projective transformation model.
  • a description of the affine transformation model or the projective transformation model is as follows.
  • the affine transformation model can be established by means of equations; specifically, the transformation model established by the equations is: x' = a0 + a1·x + a2·y, y' = b0 + b1·x + b2·y.
  • the affine transformation model can also be established in matrix form; written with a matrix, the same model is [x'; y'] = [a1 a2; b1 b2]·[x; y] + [a0; b0].
  • (x, y) is the coordinate of the reference feature point in the reference image
  • (x', y') is the coordinate of the feature point in the current image that matches the reference feature point, and a0, a1, a2, b0, b1 and b2 are the affine transformation parameters.
  • when three pairs of matched feature points are available, the complete set of affine transformation parameters can be solved; when more than three pairs of matched feature points are available, a least-squares solution can be computed to obtain more precise affine transformation parameters.
  • the affine transformation parameters calculated from the affine transformation model can be used to represent the motion vector of the drone.
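  • a minimal sketch of solving the affine parameters a0, a1, a2, b0, b1, b2 from the matched point pairs by least squares is given below, using NumPy; the helper name is hypothetical:

```python
import numpy as np

def solve_affine(ref_pts, cur_pts):
    """Solve x' = a0 + a1*x + a2*y and y' = b0 + b1*x + b2*y in the
    least-squares sense from matched (x, y) -> (x', y') point pairs."""
    x, y = ref_pts[:, 0], ref_pts[:, 1]
    A = np.column_stack([np.ones_like(x), x, y])   # design matrix [1, x, y]
    (a0, a1, a2), *_ = np.linalg.lstsq(A, cur_pts[:, 0], rcond=None)
    (b0, b1, b2), *_ = np.linalg.lstsq(A, cur_pts[:, 1], rcond=None)
    # a0, b0 carry the translation component of the motion; a1, a2, b1, b2
    # carry the rotation and scale of the current image relative to the
    # reference image.
    return a0, a1, a2, b0, b1, b2
```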
  • the projective transformation model can likewise be established by means of equations; in homogeneous coordinates the model can be written as s·[x'; y'; 1] = H·[x; y; 1] for a scale factor s, where H is a 3×3 projective transformation matrix.
  • the projective transformation matrix calculated from the projective transformation model can be used to represent the motion vector of the drone.
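  • similarly, the projective transformation matrix could be estimated robustly from the same matched point pairs; the sketch below uses OpenCV's RANSAC-based estimator, an illustrative choice rather than a solver prescribed by the embodiment:

```python
import cv2
import numpy as np

def solve_projective(ref_pts, cur_pts):
    """Estimate the 3x3 projective (homography) matrix mapping reference
    feature points to current feature points, rejecting outliers with RANSAC."""
    H, inlier_mask = cv2.findHomography(ref_pts.reshape(-1, 1, 2),
                                        cur_pts.reshape(-1, 1, 2),
                                        cv2.RANSAC, 3.0)
    return H, inlier_mask
```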
  • the UAV return positioning device includes: a reference module 301, an acquisition module 302, and a positioning module 303, where:
  • the reference module 301 is configured to generate a reference image during the flight of the drone; the acquisition module 302 is configured to acquire the current image collected at the current time; and the positioning module 303 is configured to determine the current location of the drone according to the reference image generated by the reference module 301 and the current image acquired by the acquisition module 302.
  • the reference module 301 includes: a sampling unit, configured to collect a ground image during the flight of the drone; and a splicing unit, configured to splice the ground image collected by the sampling unit to obtain a reference image.
  • the sampling unit is specifically configured to acquire a ground image during the flight of the drone from the starting position to the return position.
  • the sampling unit may be an imaging device, such as a camera, a digital camera, etc.; the splicing unit may be a processor or a chip or the like.
  • the acquisition module 302 can be an imaging device such as a camera, a digital camera, or the like.
  • the location module 303 can be a processor or a chip.
  • the sampling unit and the acquisition module 302 may be the same camera device.
  • optionally, the device further includes: a determining module, configured to determine the return flight.
  • the determining module is further configured to receive an instruction sent by the controller to indicate a return flight.
  • the determining module may be a wireless signal receiver, such as an antenna for receiving a WiFi signal, an antenna for receiving an LTE (Long Term Evolution) signal, or an antenna for receiving a Bluetooth signal.
  • the determining module can also include a processor on this basis.
  • optionally, the device further includes: a trajectory module, configured to determine, after the reference module 301 generates the reference image, a reverse trajectory of the drone flying from the starting position to the returning position.
  • optionally, the device further includes: a returning module for flying from the returning position to the starting position according to the reverse trajectory determined by the trajectory module.
  • the foregoing trajectory module and the returning module may be a processor or a computing chip, respectively.
  • the above trajectory module and the returning module may be the same processor or a computing chip.
  • the positioning module includes: a matching unit, configured to match the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time; and a positioning information unit, configured to determine, according to the motion vector, the positioning information of the drone relative to the reference image at the current time; wherein the positioning information includes at least one of the following: the position of the drone, the height of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone and the heading of the drone.
  • the matching unit is further configured to perform scene matching on the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time.
  • the matching unit includes: a selecting subunit for selecting feature points of the reference image, wherein the selected feature points are used as reference feature points; a feature point determining subunit, configured to determine feature points in the current image that match the reference feature points, wherein the matched feature points are used as current feature points; and a vector subunit for matching the current feature points with the reference feature points to obtain the motion vector of the drone relative to the reference image at the current time.
  • the drone includes a body 401, an image capture device 402, and a processor (not shown), wherein:
  • the body 401 is used to carry various components of the drone, such as a battery, an engine (motor), a camera, and the like;
  • the image capture device 402 is disposed on the body 401, and the image capture device 402 is configured to acquire images.
  • the image collection device 402 may be a camera.
  • image acquisition device 402 can be used for panoramic photography.
  • the image capture device 402 can include a multi-view camera, can also include a panoramic camera, and can include both a multi-view camera and a panoramic camera to capture images or video from multiple angles.
  • the processor is configured to perform the method disclosed in the embodiment shown in FIG. 1.
  • the method and device for positioning a drone provided by the embodiments of the present invention generate a reference image during the flight of the drone, and the reference image can reflect the latest ground situation; the current image collected at the current time is then acquired. Since both the reference image and the current image are acquired during the flight of the drone, there is a certain correlation between them, and the current position of the drone can therefore be determined according to the reference image and the current image. In the solution of the embodiments of the present invention, the reference image is generated during the flight of the drone and the current image is also collected during the flight, so the generated reference image can dynamically compensate for the differences in resolution produced during the flight; compared with the fixed resolution of the prior art, dynamic matching can be better achieved during the return of the drone, which reduces system errors and thereby improves the positioning accuracy of the return flight.
  • after the reference image is generated, the reverse trajectory of the drone's flight from the starting position to the returning position is determined, so that when flying from the returning position to the starting position the drone can directly follow this reverse trajectory, which reduces the amount of return-path planning and improves the efficiency of determining the flight path when returning; in addition, when the drone encounters no signal or a communication failure, flying from the returning position to the starting position along the reverse trajectory enables the drone to smoothly return to the starting position.
  • in addition, when flying from the starting position to the returning position, the drone usually plans a relatively good outbound trajectory, for example one that bypasses obstacles; therefore, flying from the returning position to the starting position along the reverse of the outbound trajectory allows the drone to return along a relatively good trajectory.
  • embodiments of the present invention can be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or a combination of software and hardware. Moreover, the invention can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) including computer usable program code.
  • the computer program instructions can also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture comprising the instruction device.
  • the instruction device implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device such that a series of operational steps are performed on a computer or other programmable device to produce computer-implemented processing for execution on a computer or other programmable device.
  • the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

A method and device for positioning an unmanned aerial vehicle. The method comprises: generating a reference image during flight of an unmanned aerial vehicle (S101); acquiring a current image acquired at a current time (S102); and determining, according to the reference image and the current image, a current position of the unmanned aerial vehicle (S103). The reference image and the current image have a certain degree of correlation, and therefore the current position of the unmanned aerial vehicle can be determined according to the reference image and the current image. Compared to a fixed resolution in the prior art, the present invention enables better dynamic matching during a return trip of an unmanned aerial vehicle, thus reducing systematic errors, and improving positioning precision of the return trip.

Description

Method and Device for Positioning an Unmanned Aerial Vehicle
This application claims priority to Chinese Patent Application No. 201611240082.2, filed on December 28, 2016 and entitled "Method and Device for Positioning an Unmanned Aerial Vehicle", the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of UAV control, and in particular to a method and a device for positioning a UAV.
Background
UAVs have broad applications in fields such as disaster prevention and rescue and scientific investigation, and the flight control system is an important part of a UAV, playing an important role in making UAVs intelligent and practical. Usually, after completing its job at the destination under the control of the flight control system, the UAV can automatically return along the original route.
In order to position the UAV during automatic return, in the prior art, map data provided by a third party is usually stored in the flight system of the UAV, and positioning and navigation of the UAV are then realized through a positioning device such as the Global Positioning System (GPS). However, the resolution of the map data provided by the third party is related to the height of the UAV above the ground: generally, the higher the UAV flies above the ground, the lower the resolution. Since the flying height of the UAV changes frequently during operation, the resolution of the ground target can easily differ greatly from that of the map data, the matching accuracy is low, and the positioning accuracy during the return flight is therefore poor.
For this reason, how to improve the positioning accuracy has become a technical problem to be solved urgently.
Summary of the Invention
The technical problem to be solved by the present invention is how to improve the positioning accuracy.
To this end, according to a first aspect, an embodiment of the present invention discloses a method for positioning a drone, including:
generating a reference image during the flight of the drone; acquiring a current image collected at the current time; and determining the current position of the drone according to the reference image and the current image.
Optionally, generating the reference image includes: collecting ground images during the flight of the drone; and splicing the ground images to obtain the reference image.
Optionally, collecting the ground images during the flight of the drone includes: collecting the ground images while the drone flies from the starting position to the returning position.
Optionally, before acquiring the current image collected at the current time, the method for positioning the drone disclosed in this embodiment further includes: determining to return.
Optionally, before determining to return, the method for positioning the drone disclosed in this embodiment further includes: receiving an instruction sent by a controller for indicating the return flight.
Optionally, after generating the reference image, the method for positioning the drone disclosed in this embodiment further includes: determining a reverse trajectory of the drone's flight from the starting position to the returning position.
Optionally, after determining the reverse trajectory, the method for positioning the drone disclosed in this embodiment further includes: flying from the returning position to the starting position according to the reverse trajectory.
Optionally, determining the current position of the drone according to the reference image and the current image includes: matching the current image with the reference image to obtain a motion vector of the drone at the current time relative to the reference image; and determining, according to the motion vector, positioning information of the drone relative to the reference image at the current time, wherein the positioning information includes at least one of: the position of the drone, the height of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone and the heading of the drone.
Optionally, matching the current image with the reference image to obtain the motion vector of the drone relative to the reference image at the current time includes: performing scene matching between the current image and the reference image to obtain the motion vector of the drone relative to the reference image at the current time.
Optionally, performing scene matching between the current image and the reference image to obtain the motion vector of the drone relative to the reference image at the current time includes: selecting feature points of the reference image, wherein the selected feature points are used as reference feature points; determining feature points in the current image that match the reference feature points, wherein the matched feature points are used as current feature points; and matching the current feature points with the reference feature points to obtain the motion vector of the drone relative to the reference image at the current time.
According to a second aspect, an embodiment of the present invention discloses a device for positioning a drone, including:
a reference module for generating a reference image during the flight of the drone; an acquisition module for acquiring a current image collected at the current time; and a positioning module for determining the current position of the drone according to the reference image generated by the reference module and the current image acquired by the acquisition module.
Optionally, the reference module includes: a sampling unit for collecting ground images during the flight of the drone; and a splicing unit for splicing the ground images collected by the sampling unit to obtain the reference image.
Optionally, the sampling unit is specifically configured to collect the ground images while the drone flies from the starting position to the returning position.
Optionally, the device for positioning the drone disclosed in this embodiment further includes: a determining module for determining to return.
Optionally, the determining module is further configured to receive an instruction sent by a controller for indicating the return flight.
Optionally, the device for positioning the drone disclosed in this embodiment further includes: a trajectory module for determining, after the reference module generates the reference image, a reverse trajectory of the drone's flight from the starting position to the returning position.
Optionally, the device for positioning the drone disclosed in this embodiment further includes: a returning module for flying from the returning position to the starting position according to the reverse trajectory determined by the trajectory module.
Optionally, the positioning module includes: a matching unit for matching the current image with the reference image to obtain a motion vector of the drone relative to the reference image at the current time; and a positioning information unit for determining, according to the motion vector, positioning information of the drone relative to the reference image at the current time, wherein the positioning information includes at least one of: the position of the drone, the height of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone and the heading of the drone.
Optionally, the matching unit is specifically configured to perform scene matching between the current image and the reference image to obtain the motion vector of the drone relative to the reference image at the current time.
Optionally, the matching unit includes: a selecting subunit for selecting feature points of the reference image, wherein the selected feature points are used as reference feature points; a feature point determining subunit for determining feature points in the current image that match the reference feature points, wherein the matched feature points are used as current feature points; and a vector subunit for matching the current feature points with the reference feature points to obtain the motion vector of the drone relative to the reference image at the current time.
The technical solution of the present invention has the following advantages:
The method and device for positioning a drone provided by the embodiments of the present invention generate a reference image during the flight of the drone, and the reference image can reflect the latest ground situation; the current image collected at the current time is then acquired. Since both the reference image and the current image are acquired during the flight of the drone, there is a certain correlation between them, and the current position of the drone can therefore be determined according to the reference image and the current image. In the solution of the embodiments of the present invention, the reference image is generated during the flight of the drone and the current image is also collected during the flight, so the generated reference image can dynamically compensate for and adapt to the differences in resolution produced during the flight. Compared with the fixed resolution of the prior art, dynamic matching can be better achieved during the return of the drone, which reduces system errors and thereby improves the positioning accuracy of the return flight.
As an optional technical solution, after the reference image is generated, the reverse trajectory of the drone's flight from the starting position to the returning position is determined, so that when flying from the returning position to the starting position the drone can directly follow this reverse trajectory, which reduces the amount of return-path planning and improves the efficiency of determining the flight path when returning. In addition, when the drone encounters no signal or a communication failure, flying from the returning position to the starting position along the reverse trajectory enables the drone to smoothly return to the starting position.
In addition, when flying from the starting position to the returning position, the drone usually plans a relatively good outbound trajectory, for example one that bypasses obstacles; therefore, flying from the returning position to the starting position along the reverse of the outbound trajectory allows the drone to return along a relatively good trajectory.
Brief Description of the Drawings
In order to illustrate the specific embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings to be used in the description of the specific embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present invention, and other drawings can be obtained from these drawings by those of ordinary skill in the art without creative effort.
FIG. 1 is a flow chart of a method for positioning a drone according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method of obtaining a motion vector of a drone by scene matching according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a device for positioning a drone according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a system for positioning a drone according to an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that orientation or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer" and the like are based on the orientation or positional relationships shown in the drawings, and are only intended to facilitate and simplify the description of the present invention, rather than indicating or implying that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they therefore cannot be understood as limiting the present invention. In addition, the terms "first", "second" and "third" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or a sequential order.
In the description of the present invention, it should also be noted that, unless otherwise explicitly specified and defined, the terms "installation", "connected" and "connection" are to be understood broadly; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection, an indirect connection through an intermediate medium, or internal communication between two components; and it may be a wireless connection or a wired connection. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
In addition, the technical features involved in the different embodiments of the present invention described below can be combined with each other as long as they do not conflict with each other.
In order to improve the positioning accuracy of drone flight, this embodiment discloses a method for positioning a drone. Please refer to FIG. 1, which is a flow chart of the method for positioning a drone; the method includes:
Step S101: generating a reference image during the flight of the drone.
In this embodiment, the collection of ground images can start after the drone takes off from the starting position; the ground images collected by the drone while moving towards the destination are spliced, and the spliced result is used as the reference image. The so-called ground image refers to an image collected by the drone from an overhead viewing angle during the flight, where the angle between the overhead viewing direction and the vertical direction is less than 90 degrees. Preferably, the overhead viewing direction may be vertically downward, in which case the angle between the overhead viewing direction and the vertical direction is 0 degrees.
The drone can store the generated reference image for subsequent use. As an alternative, after generating the reference image, the drone can also send the reference image to other drones so that they can use it as well. It should be noted that, for the same drone, since the hardware parameters of the drone do not change during the flight, the reference image generated by the drone itself can represent the outbound trajectory of that drone. The so-called outbound trajectory refers to the flight trajectory of the drone from the starting position to the destination position.
Step S102: acquiring a current image collected at the current time.
After the reference image is generated, the current image collected at the current time can be acquired during the flight of the drone.
For example, during the return flight, in order to determine its position at the current time, the drone acquires the image collected at the current moment by the image acquisition device on the drone.
Step S103: determining the current position of the drone according to the reference image and the current image.
In this embodiment, after the reference image is obtained, the current image can be compared with the reference image to obtain the difference between them; the motion vector of the drone can be estimated from this difference, and the current position of the drone can thereby be determined.
In an optional embodiment, in step S101, the operation of generating the reference image may include: collecting ground images during the flight of the drone; and splicing the ground images to obtain the reference image. In a specific embodiment, the ground images may be collected at preset intervals; the preset interval may be a time interval preset in the time domain or a distance interval preset in position, determined according to prior knowledge. The preset intervals may be equal or unequal.
In the specific splicing process, the images may be spliced as a whole, or spliced in a segmented mode. Specifically, during splicing there is usually an overlapping area between adjacent frame images, and two adjacent frames with an overlapping portion can be merged into one large seamless image. For two adjacent frames with an overlapping area, the overlapping area of one of the images may also be discarded directly, the remaining portion of that image stitched onto the other frame, and fusion performed in the seam region to obtain the mosaic image.
本发明实施例中,在无人机从起始位置飞向返航位置的过程中,采集地面图像。所称起始位置是指无人机在开始起飞的位置;所称返航位置是 指无人机起飞后开始向起始位置返回的位置。通常,返航位置为无人机前往的目的地,但是,在具体实施过程中,返航位置也可以是无人机在飞往目的地的过程中接收到返航指令时所在的位置。返航位置还可以是无人机在飞往目的地的过程中遇到特殊情况确定需要返航的位置。例如在飞行过程中出现电量不足、无GPS信号、无人机故障等突发情况,这时无人机中的飞控系统确定返航。In the embodiment of the present invention, the ground image is acquired during the flight of the drone from the starting position to the returning position. The so-called starting position refers to the position where the drone starts to take off; the so-called return position is Refers to the position where the drone starts to return to the starting position after taking off. Usually, the returning position is the destination of the drone, but in the specific implementation process, the returning position may also be the location where the drone receives the returning instruction during the flight to the destination. The return location can also be a location where the drone encounters a special situation in the process of flying to the destination to determine the need to return. For example, during the flight, there are sudden situations such as insufficient power, no GPS signal, and drone failure. At this time, the flight control system in the drone determines the return flight.
可选地,在执行步骤S102之前,还可以包括:Optionally, before performing step S102, the method may further include:
步骤S104,确定返航。In step S104, it is determined that the return flight.
在其中一种实施方式中,可以是无人机主动确定返航,例如无人机在飞行过程中遇到特殊情况需要返航。例如无人机在飞行过程中出现电量不足、无GPS信号、无人机故障等突发情况,这时无人机中的飞控系统确定返航。在飞到目的地完成作业任务后,无人机也可以主动确定需要返航。In one of the embodiments, the drone may actively determine the return flight, for example, the drone encounters a special situation during the flight and needs to return. For example, when the drone is in a state of insufficient power, no GPS signal, or a drone failure during the flight, the flight control system in the drone determines the return flight. After flying to the destination to complete the task, the drone can also take the initiative to determine the need to return.
In another implementation, a controller may command the drone to return. Specifically, the drone receives an instruction sent by the controller for indicating the return flight and, after receiving the instruction, determines to return. In this embodiment, the controller may be a dedicated remote control for the drone, or a terminal that remotely controls the drone, such as a mobile terminal, a computer, or a laptop.
In order to provide the drone with a return trajectory reference when returning, optionally, after performing step S101, the method further includes:
Step S105: determining the reverse trajectory of the flight of the drone from the starting position to the return position.
In this embodiment, for a given drone, since its physical parameters do not change during the flight, the flight trajectory of the drone can be determined from the image attributes after the images collected while flying from the starting position to the return position have been stitched. In this embodiment, flying along the outbound trajectory from the return position back to the starting position forms the reverse trajectory of the flight from the starting position to the return position. When the drone returns, it can perform the return operation according to this reverse trajectory.
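As a purely hypothetical sketch (the Waypoint record and its fields are illustrative assumptions, not defined in this disclosure), the reverse trajectory can be represented simply as the recorded outbound waypoints traversed in the opposite order, with the heading reversed:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:
    x: float            # position in the reference-image (mosaic) frame
    y: float
    altitude: float
    heading_deg: float

def reverse_trajectory(outbound: List[Waypoint]) -> List[Waypoint]:
    """Return the outbound waypoints in reverse order, with the heading
    flipped by 180 degrees so the drone flies back along the same path."""
    return [
        Waypoint(w.x, w.y, w.altitude, (w.heading_deg + 180.0) % 360.0)
        for w in reversed(outbound)
    ]
```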
Optionally, step S103 may specifically include: matching the current image with the reference image to obtain the motion vector of the drone at the current time relative to the reference image, and determining, according to the motion vector, the positioning information of the drone at the current time relative to the reference image.
In this embodiment, the positioning information includes at least one of the following: the position of the drone, the altitude of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone, and the heading of the drone. The azimuth of the drone refers to the relative angle between the current image collected at the current time and the reference image; the heading of the drone refers to the actual flight direction of the drone. When the current image is matched with the reference image, since the flight trajectory of the return flight is the reverse of the outbound trajectory, matching the current image against the reference image yields the motion vector of the drone at the current time relative to the reference image. From this motion vector, information such as the position, altitude, attitude, and azimuth of the drone in the reference image at the current time can be obtained, and the position of the drone at the current time can thus be determined.
In a specific embodiment, when matching the current image with the reference image to obtain the motion vector of the drone at the current time relative to the reference image, scene matching between the current image and the reference image may be used. Specifically, please refer to Fig. 2. The method shown in Fig. 2 includes:
Step S201: selecting feature points of the reference image, where the selected feature points are used as reference feature points.
Points or structures that are easy to identify, for example edge points of texture-rich objects or buildings, can be selected as reference feature points. After the reference feature points are selected, they can be described mathematically, for example by gradient histograms or local random binary features.
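For illustration only, the following sketch uses OpenCV's ORB detector as one possible realization of a binary local feature; the original text does not name a specific descriptor, so this choice and the point budget are assumptions:

```python
import cv2

def select_reference_features(reference_image, max_points=1000):
    """Detect corner-like points on the reference mosaic (BGR input) and
    compute binary descriptors for them."""
    gray = cv2.cvtColor(reference_image, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=max_points)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    return keypoints, descriptors
```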
Step S202: determining the feature points in the current image that match the reference feature points, where the matched feature points are used as current feature points.
In a specific embodiment, the pixels of the current image can be described with the same mathematical descriptor, and the current feature points that match the reference feature points can then be determined by standard mathematical tools.
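Continuing the illustrative sketch above (descriptors computed with the same assumed binary feature for both images), one common way to keep only unambiguous correspondences is a nearest-neighbor ratio test; the ratio value here is an assumption:

```python
import cv2

def match_features(ref_desc, cur_desc, ratio=0.75):
    """Match binary descriptors of the current image against the reference
    descriptors, keeping only unambiguous matches (ratio test)."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    candidates = matcher.knnMatch(cur_desc, ref_desc, k=2)
    good = []
    for pair in candidates:
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    return good
```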
Step S203: matching the current feature points with the reference feature points to obtain the motion vector of the drone at the current time relative to the reference image.
The current feature points and the reference feature points can be matched through an affine transformation model or a projective transformation model. These two models are described below.
(1) For the affine transformation model, the model can be established as a system of equations. Specifically, the transformation model established in equation form is:
x' = a·x + b·y + m
y' = c·x + d·y + n
where (x, y) are the coordinates of a reference feature point in the reference image, (x', y') are the coordinates of the feature point in the current image that matches it, and a, b, c, d, m and n are the affine transformation parameters. In this embodiment, when the matched feature points consist of three non-collinear point pairs, the complete set of affine transformation parameters can be solved; when more than three point pairs are matched, more accurate affine transformation parameters can be obtained by a least-squares solution.
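As an illustrative sketch of the least-squares solution mentioned above (a straightforward stacking of the two scalar equations per point pair; not a definitive implementation), the parameters can be estimated with a standard linear solver:

```python
import numpy as np

def solve_affine(ref_pts, cur_pts):
    """Solve x' = a*x + b*y + m, y' = c*x + d*y + n in the least-squares
    sense from N >= 3 matched point pairs.

    ref_pts, cur_pts: arrays of shape (N, 2) holding (x, y) and (x', y')."""
    ref_pts = np.asarray(ref_pts, dtype=float)
    cur_pts = np.asarray(cur_pts, dtype=float)
    n = len(ref_pts)

    # Stack both scalar equations of every point pair into A @ p = b,
    # with the parameter vector p = [a, b, c, d, m, n].
    A = np.zeros((2 * n, 6))
    A[0::2, 0] = ref_pts[:, 0]   # a * x
    A[0::2, 1] = ref_pts[:, 1]   # b * y
    A[0::2, 4] = 1.0             # m
    A[1::2, 2] = ref_pts[:, 0]   # c * x
    A[1::2, 3] = ref_pts[:, 1]   # d * y
    A[1::2, 5] = 1.0             # n
    b = cur_pts.reshape(-1)      # [x'_0, y'_0, x'_1, y'_1, ...]

    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params                # a, b, c, d, m, n
```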
In a specific embodiment, the affine transformation model can also be established in matrix form. Specifically, the transformation model established as a matrix is:
[ x' ]   [ a0  a1  a2 ]   [ x ]
[ y' ] = [ b0  b1  b2 ] · [ y ]
[ 1  ]   [ 0   0   1  ]   [ 1 ]
where (x, y) are the coordinates of a reference feature point in the reference image, (x', y') are the coordinates of the feature point in the current image that matches it, and a0, a1, a2, b0, b1 and b2 are the affine transformation parameters. In this embodiment, when the matched feature points consist of three non-collinear point pairs, the complete set of affine transformation parameters can be solved; when more than three point pairs are matched, more accurate affine transformation parameters can be obtained by a least-squares solution.
The affine transformation parameters calculated from the affine transformation model can be used to represent the motion vector of the drone.
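Purely as an illustrative sketch, and under the extra assumption (not made in the original text) that the estimated motion is approximately a similarity transform, the positioning information discussed above can be read off from these parameters roughly as follows:

```python
import numpy as np

def decompose_motion(a, b, c, d, m, n):
    """Interpret the affine parameters x' = a*x + b*y + m, y' = c*x + d*y + n
    as translation, rotation (azimuth offset) and scale, assuming the motion
    is close to a similarity transform."""
    translation = (m, n)                         # shift within the mosaic
    azimuth_deg = np.degrees(np.arctan2(c, a))   # rotation of the current view
    scale = np.sqrt(abs(a * d - b * c))          # >1 suggests the drone flew lower
    return translation, azimuth_deg, scale
```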
(2) For the projective transformation model, the model can likewise be established as a system of equations. Specifically, the transformation model is established by the following formula:
(w'·x', w'·y', w') = (w·x, w·y, w) · A

where (x, y) are the coordinates of a reference feature point in the reference image, (x', y') are the coordinates of the feature point in the current image that matches it, (w'·x', w'·y', w') and (w·x, w·y, w) are the homogeneous coordinates of (x', y') and (x, y) respectively, and

        [ a11  a12  a13 ]
    A = [ a21  a22  a23 ]
        [ a31  a32  a33 ]

is the projective transformation matrix. In a specific embodiment, the transformation matrix A can be split into four parts: the sub-matrix [a11 a12; a21 a22] represents a linear transformation, [a31 a32] is used for translation, [a13 a23]^T produces the projective effect, and a33 = 1.
The projective transformation matrix calculated from the projective transformation model can be used to represent the motion vector of the drone.
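As a minimal sketch only, assuming matched keypoints following the convention of the earlier examples (queryIdx indexes the current image, trainIdx the reference image), the projective matrix can be estimated robustly with RANSAC; note that OpenCV returns the column-vector form x' ~ H·x, i.e. the transpose of the row-vector convention used in the text:

```python
import cv2
import numpy as np

def estimate_projective_motion(ref_kp, cur_kp, matches):
    """Estimate the 3x3 projective transformation between the current image
    and the reference image from matched feature points, rejecting outlier
    matches with RANSAC."""
    ref_pts = np.float32([ref_kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    cur_pts = np.float32([cur_kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inlier_mask = cv2.findHomography(ref_pts, cur_pts, cv2.RANSAC, 3.0)
    return H, inlier_mask
```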
This embodiment also discloses a device for positioning a drone. Referring to Fig. 3, the drone return positioning device includes a reference module 301, an acquisition module 302 and a positioning module 303, where:
the reference module 301 is configured to generate a reference image during the flight of the drone; the acquisition module 302 is configured to obtain the current image collected at the current time; and the positioning module 303 is configured to determine the current position of the drone according to the reference image generated by the reference module 301 and the current image collected by the acquisition module 302.
In an optional embodiment, the reference module 301 includes: a sampling unit configured to collect ground images during the flight of the drone, and a stitching unit configured to stitch the ground images collected by the sampling unit to obtain the reference image.
In an optional embodiment, the sampling unit is specifically configured to collect the ground images while the drone flies from the starting position to the return position.
It should be noted that the sampling unit may be an imaging device, for example a camera or a digital camera, and the stitching unit may be a processor or a chip. The acquisition module 302 may be an imaging device, for example a camera or a digital camera. The positioning module 303 may be a processor or a chip. Optionally, the sampling unit and the acquisition module 302 may be the same imaging device.
In an optional embodiment, the device further includes a determining module configured to determine to return.
In an optional embodiment, the determining module is further configured to receive an instruction sent by the controller for indicating the return flight.
In one implementation, the determining module may be a wireless signal receiver, for example an antenna for receiving WiFi signals, an antenna for receiving LTE (Long Term Evolution) signals, or an antenna for receiving Bluetooth signals. In another implementation, the determining module may additionally include a processor.
In an optional embodiment, the device further includes a trajectory module configured to determine, after the reference module 301 generates the reference image, the reverse trajectory of the flight of the drone from the starting position to the return position.
In an optional embodiment, the device further includes a return module configured to fly from the return position to the starting position according to the reverse trajectory determined by the trajectory module.
In one implementation, the trajectory module and the return module may each be a processor or a computing chip. Optionally, the trajectory module and the return module may be the same processor or computing chip.
In an optional embodiment, the positioning module includes: a matching unit configured to match the current image with the reference image to obtain the motion vector of the drone at the current time relative to the reference image; and a positioning information unit configured to determine, according to the motion vector, the positioning information of the drone at the current time relative to the reference image, where the positioning information includes at least one of the following: the position of the drone, the altitude of the drone, the attitude of the drone, the azimuth of the drone, the speed of the drone, and the heading of the drone.
In an optional embodiment, the matching unit is further configured to perform scene matching between the current image and the reference image to obtain the motion vector of the drone at the current time relative to the reference image.
In an optional embodiment, the matching unit includes: a selecting subunit configured to select feature points of the reference image, where the selected feature points are used as reference feature points; a feature point determining subunit configured to determine the feature points in the current image that match the reference feature points, where the matched feature points are used as current feature points; and a vector subunit configured to match the current feature points with the reference feature points to obtain the motion vector of the drone at the current time relative to the reference image.
This embodiment also discloses a drone; please refer to Fig. 4. The drone includes a body 401, an image acquisition device 402 and a processor (not shown in the figure), where:
the body 401 is used to carry the various components of the drone, such as the battery, the motor and the camera;
the image acquisition device 402 is disposed on the body 401 and is configured to collect images.
It should be noted that, in this embodiment, the image acquisition device 402 may be a video camera. Optionally, the image acquisition device 402 can be used for panoramic photography. For example, the image acquisition device 402 may include a multi-lens camera, may include a panoramic camera, or may include both a multi-lens camera and a panoramic camera, so as to collect images or video from multiple angles.
The processor is configured to perform the method disclosed in the embodiment shown in Fig. 1.
With the method and device for positioning a drone provided by the embodiments of the present invention, a reference image is generated during the flight of the drone, and this reference image reflects the latest ground situation; the current image collected at the current time is then obtained. Since both the reference image and the current image are acquired during the flight of the drone, there is a definite correlation between them, and the current position of the drone can therefore be determined from the reference image and the current image. In the solution of the embodiments of the present invention, the reference image is generated during the flight of the drone and the current image is likewise collected during the flight, so the generated reference image can dynamically compensate for the resolution differences that arise during the flight. Compared with the fixed resolution of the prior art, dynamic matching can be achieved better during the return flight of the drone, the systematic error is reduced, and the positioning accuracy of the return flight is thus improved.
In an optional embodiment, after the reference image is generated, the reverse trajectory of the flight of the drone from the starting position to the return position is determined, so that when flying from the return position to the starting position, the drone can fly directly along this reverse trajectory. This reduces the amount of return-path planning and improves the efficiency of determining the flight trajectory when returning. In addition, when the drone encounters a loss of signal or a communication failure, flying from the return position to the starting position along the reverse trajectory allows it to return smoothly to the starting position.
Moreover, when flying from the starting position to the return position, the drone usually plans a relatively good outbound trajectory, for example one that bypasses obstacles; flying from the return position to the starting position along the reverse of this outbound trajectory therefore allows the drone to return along a relatively good trajectory.
Those skilled in the art will appreciate that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to operate in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
Obviously, the above embodiments are merely examples given for the sake of clear description and are not intended to limit the implementations. Those of ordinary skill in the art can make other changes or variations in different forms on the basis of the above description. It is neither necessary nor possible to exhaust all implementations here. Obvious changes or variations derived therefrom remain within the protection scope of the present invention.

Claims (20)

  1. A method for positioning a drone, comprising:
    generating a reference image during the flight of the drone;
    obtaining a current image collected at a current time; and
    determining a current position of the drone according to the reference image and the current image.
  2. The method according to claim 1, wherein the generating the reference image comprises:
    collecting ground images during the flight of the drone; and
    stitching the ground images to obtain the reference image.
  3. The method according to claim 2, wherein the collecting the ground images during the flight of the drone comprises:
    collecting the ground images while the drone flies from a starting position to a return position.
  4. The method according to any one of claims 1-3, wherein before obtaining the current image collected at the current time, the method further comprises:
    determining to return.
  5. The method according to claim 4, wherein before determining to return, the method further comprises:
    receiving an instruction sent by a controller for indicating the return flight.
  6. The method according to any one of claims 1-5, wherein after generating the reference image, the method further comprises:
    determining a reverse trajectory of the flight of the drone from a starting position to a return position.
  7. The method according to claim 6, wherein after determining the reverse trajectory, the method further comprises:
    flying from the return position to the starting position according to the reverse trajectory.
  8. The method according to any one of claims 1-7, wherein the determining the current position of the drone according to the reference image and the current image comprises:
    matching the current image with the reference image to obtain a motion vector of the drone at the current time relative to the reference image; and
    determining, according to the motion vector, positioning information of the drone at the current time relative to the reference image;
    wherein the positioning information comprises at least one of the following:
    a position of the drone, an altitude of the drone, an attitude of the drone, an azimuth of the drone, a speed of the drone, and a heading of the drone.
  9. The method according to claim 8, wherein the matching the current image with the reference image to obtain the motion vector of the drone at the current time relative to the reference image comprises:
    performing scene matching between the current image and the reference image to obtain the motion vector of the drone at the current time relative to the reference image.
  10. The method according to claim 9, wherein the performing scene matching between the current image and the reference image to obtain the motion vector of the drone at the current time relative to the reference image comprises:
    selecting feature points of the reference image, wherein the selected feature points are used as reference feature points;
    determining feature points in the current image that match the reference feature points, wherein the matched feature points are used as current feature points; and
    matching the current feature points with the reference feature points to obtain the motion vector of the drone at the current time relative to the reference image.
  11. A device for positioning a drone, comprising:
    a reference module, configured to generate a reference image during the flight of the drone;
    an acquisition module, configured to obtain a current image collected at a current time; and
    a positioning module, configured to determine a current position of the drone according to the reference image generated by the reference module and the current image collected by the acquisition module.
  12. The device according to claim 11, wherein the reference module comprises:
    a sampling unit, configured to collect ground images during the flight of the drone; and
    a stitching unit, configured to stitch the ground images collected by the sampling unit to obtain the reference image.
  13. The device according to claim 12, wherein the sampling unit is specifically configured to collect the ground images while the drone flies from a starting position to a return position.
  14. The device according to any one of claims 11-13, further comprising:
    a determining module, configured to determine to return.
  15. The device according to claim 14, wherein the determining module is further configured to receive an instruction sent by a controller for indicating the return flight.
  16. The device according to any one of claims 11-15, further comprising:
    a trajectory module, configured to determine, after the reference module generates the reference image, a reverse trajectory of the flight of the drone from a starting position to a return position.
  17. The device according to claim 16, further comprising:
    a return module, configured to fly from the return position to the starting position according to the reverse trajectory determined by the trajectory module.
  18. The device according to any one of claims 11-17, wherein the positioning module comprises:
    a matching unit, configured to match the current image with the reference image to obtain a motion vector of the drone at the current time relative to the reference image; and
    a positioning information unit, configured to determine, according to the motion vector, positioning information of the drone at the current time relative to the reference image;
    wherein the positioning information comprises at least one of the following:
    a position of the drone, an altitude of the drone, an attitude of the drone, an azimuth of the drone, a speed of the drone, and a heading of the drone.
  19. The device according to claim 18, wherein the matching unit is specifically configured to perform scene matching between the current image and the reference image to obtain the motion vector of the drone at the current time relative to the reference image.
  20. The device according to claim 19, wherein the matching unit comprises:
    a selecting subunit, configured to select feature points of the reference image, wherein the selected feature points are used as reference feature points;
    a feature point determining subunit, configured to determine feature points in the current image that match the reference feature points, wherein the matched feature points are used as current feature points; and
    a vector subunit, configured to match the current feature points with the reference feature points to obtain the motion vector of the drone at the current time relative to the reference image.
PCT/CN2017/072477 2016-12-28 2017-01-24 Method and device for positioning unmanned aerial vehicle WO2018120350A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611240082.2A CN106774402A (en) 2016-12-28 2016-12-28 The method and device positioned to unmanned plane
CN201611240082.2 2016-12-28

Publications (1)

Publication Number Publication Date
WO2018120350A1 (en) 2018-07-05

Family

ID=58923493

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/072477 WO2018120350A1 (en) 2016-12-28 2017-01-24 Method and device for positioning unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN106774402A (en)
WO (1) WO2018120350A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109214984B (en) * 2017-07-03 2023-03-14 臻迪科技股份有限公司 Image acquisition method and device, autonomous positioning navigation system and computing equipment
CN107291099A (en) * 2017-07-06 2017-10-24 杨顺伟 Unmanned plane makes a return voyage method and device
WO2019061111A1 (en) * 2017-09-27 2019-04-04 深圳市大疆创新科技有限公司 Path adjustment method and unmanned aerial vehicle
US10685229B2 (en) * 2017-12-21 2020-06-16 Wing Aviation Llc Image based localization for unmanned aerial vehicles, and associated systems and methods
CN110243357B (en) * 2018-03-07 2021-09-10 杭州海康机器人技术有限公司 Unmanned aerial vehicle positioning method and device, unmanned aerial vehicle and storage medium
CN108917768B (en) * 2018-07-04 2022-03-01 上海应用技术大学 Unmanned aerial vehicle positioning navigation method and system
WO2021056144A1 (en) * 2019-09-23 2021-04-01 深圳市大疆创新科技有限公司 Method and apparatus for controlling return of movable platform, and movable platform
CN111722179A (en) * 2020-06-29 2020-09-29 河南天安润信信息技术有限公司 Multipoint-distributed unmanned aerial vehicle signal direction finding method
TWI829005B (en) * 2021-08-12 2024-01-11 國立政治大學 High-altitude positioning center setting method and high-altitude positioning flight control method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105487555B (en) * 2016-01-14 2018-09-28 浙江华飞智能科技有限公司 A kind of station keeping method and device of unmanned plane

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101046387A (en) * 2006-08-07 2007-10-03 南京航空航天大学 Scene matching method for raising navigation precision and simulating combined navigation system
CN103411609A (en) * 2013-07-18 2013-11-27 北京航天自动控制研究所 Online composition based aircraft return route programming method
CN104932515A (en) * 2015-04-24 2015-09-23 深圳市大疆创新科技有限公司 Automatic cruising method and cruising device
CN104807456A (en) * 2015-04-29 2015-07-29 深圳市保千里电子有限公司 Method for automatic return flight without GPS (global positioning system) signal
CN106204443A (en) * 2016-07-01 2016-12-07 成都通甲优博科技有限责任公司 A kind of panorama UAS based on the multiplexing of many mesh

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12008910B2 (en) * 2017-08-04 2024-06-11 ideaForge Technology Pvt. Ltd UAV system emergency path planning on communication failure
CN113361552A (en) * 2020-03-05 2021-09-07 西安邮电大学 Positioning method and device
CN114348264A (en) * 2022-01-29 2022-04-15 国家海洋环境预报中心 Unmanned aerial vehicle search and rescue method and system based on marine environment
CN114348264B (en) * 2022-01-29 2022-08-02 国家海洋环境预报中心 Unmanned aerial vehicle search and rescue method and system based on marine environment

Also Published As

Publication number Publication date
CN106774402A (en) 2017-05-31

Similar Documents

Publication Publication Date Title
WO2018120350A1 (en) Method and device for positioning unmanned aerial vehicle
CN103118230B (en) A kind of panorama acquisition, device and system
US11073389B2 (en) Hover control
JP2020030204A (en) Distance measurement method, program, distance measurement system and movable object
WO2018120351A1 (en) Method and device for positioning unmanned aerial vehicle
WO2020014909A1 (en) Photographing method and device and unmanned aerial vehicle
US11906983B2 (en) System and method for tracking targets
WO2019113966A1 (en) Obstacle avoidance method and device, and unmanned aerial vehicle
CN110022444B (en) Panoramic photographing method for unmanned aerial vehicle and unmanned aerial vehicle using panoramic photographing method
US11057604B2 (en) Image processing method and device
CN113875222B (en) Shooting control method and device, unmanned aerial vehicle and computer readable storage medium
US20200064133A1 (en) Information processing device, aerial photography route generation method, aerial photography route generation system, program, and storage medium
WO2022077296A1 (en) Three-dimensional reconstruction method, gimbal load, removable platform and computer-readable storage medium
WO2020048365A1 (en) Flight control method and device for aircraft, and terminal device and flight control system
WO2019183789A1 (en) Method and apparatus for controlling unmanned aerial vehicle, and unmanned aerial vehicle
JP6265576B1 (en) Imaging control apparatus, shadow position specifying apparatus, imaging system, moving object, imaging control method, shadow position specifying method, and program
WO2020237422A1 (en) Aerial surveying method, aircraft and storage medium
WO2020198963A1 (en) Data processing method and apparatus related to photographing device, and image processing device
WO2020237478A1 (en) Flight planning method and related device
WO2020019175A1 (en) Image processing method and apparatus, and photographing device and unmanned aerial vehicle
JP2018201119A (en) Mobile platform, flying object, support apparatus, portable terminal, method for assisting in photography, program, and recording medium
US20230359204A1 (en) Flight control method, video editing method, device, uav and storage medium
TWI726536B (en) Image capturing method and image capturing apparatus
WO2020062255A1 (en) Photographing control method and unmanned aerial vehicle
JP2020036163A (en) Information processing apparatus, photographing control method, program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17886784

Country of ref document: EP

Kind code of ref document: A1

WA Withdrawal of international application
NENP Non-entry into the national phase

Ref country code: DE