WO2020211812A1 - Aircraft landing method and device (一种飞行器降落方法及装置) - Google Patents

Aircraft landing method and device (一种飞行器降落方法及装置)

Info

Publication number
WO2020211812A1
WO2020211812A1 (PCT/CN2020/085082)
Authority
WO
WIPO (PCT)
Prior art keywords
aircraft
matching
landing
point
image
Prior art date
Application number
PCT/CN2020/085082
Other languages
English (en)
French (fr)
Inventor
门泽华
Original Assignee
深圳市道通智能航空技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术有限公司
Publication of WO2020211812A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00 Aircraft indicators or protectors not otherwise provided for
    • B64D45/04 Landing aids; Safety measures to prevent collision with earth's surface
    • B64D45/08 Landing aids; Safety measures to prevent collision with earth's surface, optical
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images

Definitions

  • This application relates to the technical field of aircraft, and in particular to an aircraft landing method and device.
  • An aircraft can land according to landing instructions sent by a terminal; some smart aircraft also provide an automatic return-to-home or one-key landing function.
  • The commonly used landing method is GPS (Global Positioning System) assisted landing: the GPS coordinates of the target area are sent to the aircraft, and the aircraft's control system controls the aircraft to land in the target area, correcting its landing position according to those coordinates.
  • Because GPS coordinate accuracy is limited and the aircraft drifts, it is difficult for the aircraft to land accurately on the target in this way; the landing error can reach several meters, which is especially problematic when the target landing area is small.
  • The technical problem to be solved by the present invention is to overcome the shortcoming of the prior art, in which GPS positioning is used to adjust the aircraft's landing instructions: owing to insufficient GPS coordinate accuracy and aircraft drift, the landing error is large and it is difficult to land accurately in a small target area.
  • An embodiment of the present invention provides an aircraft landing method, which includes: acquiring a current image of a preset take-off and landing point collected by the aircraft during landing; acquiring a matching image of the preset take-off and landing point collected by the aircraft during the take-off phase, together with the pose information of the aircraft when the matching image was collected; performing feature matching on the current image and the matching image to obtain matching feature point pairs; calculating the feature transformation matrix between the current image and the matching image according to the matching feature point pairs; obtaining, according to the feature transformation matrix, the optical center projection point of the matching image's optical center point on the current image; and controlling the aircraft to land at the preset take-off and landing point according to the coordinates of the optical center projection point.
  • Optionally, acquiring the matching image of the preset take-off and landing point and the pose information of the aircraft includes: acquiring, during the take-off phase, the matching image of the preset take-off and landing point collected by the aircraft and the pose information of the aircraft.
  • Optionally, the aircraft landing method further includes: when the current flying height of the aircraft is greater than a preset height, stopping the collection of matching images and aircraft pose information.
  • Optionally, controlling the aircraft to land at the preset take-off and landing point according to the coordinates of the optical center projection point includes: obtaining the two-dimensional pixel coordinates of the optical center projection point of the matching image's optical center point on the current image; calculating, from the two-dimensional pixel coordinates, the three-dimensional coordinates of the optical center projection point in the world coordinate system; and controlling the aircraft to land at the preset take-off and landing point according to the three-dimensional coordinates.
  • Optionally, calculating the three-dimensional coordinates of the optical center projection point in the world coordinate system according to the two-dimensional pixel coordinates includes: acquiring the pose and ground height of the aircraft when the matching image was captured; calculating, according to the matching feature point pairs, the compensated yaw angle of the aircraft when the current image was captured; calculating the compensated ground height of the current image according to the feature transformation matrix, the pose and the ground height; and updating the three-dimensional coordinates of the optical center projection point according to the compensated yaw angle and the compensated ground height to obtain the three-dimensional coordinates.
  • Optionally, calculating the compensated yaw angle of the current image according to the matching feature point pairs includes: obtaining the first descriptors of the current image and the second descriptors of the matching image according to the matching feature point pairs; calculating the deviation angle between the main direction of each first descriptor and the main direction of the corresponding second descriptor; and calculating the mean of the deviation angles of all matching feature point pairs and determining that mean as the compensated yaw angle.
  • Optionally, calculating the compensated ground height of the current image according to the feature transformation matrix, the pose and the ground height includes: decomposing the feature transformation matrix to obtain the relative pose of the current image with respect to the matching image; and obtaining the compensated ground height of the current image according to the pose, the ground height and the relative pose.
  • The matching feature point pair includes a first matching feature point and a second matching feature point; the first matching feature point is located on the current image, and the second matching feature point is located on the matching image.
  • Optionally, the aircraft landing method further includes: obtaining, according to the feature transformation matrix, the projection matching feature points of the second matching feature points of the matching image on the current image; respectively calculating the distance error between each projection matching feature point and the corresponding first matching feature point, and determining whether the mean of the distance errors is less than a preset distance threshold; and when the mean of the distance errors is less than the preset distance threshold, performing the step of projecting the optical center point of the matching image to the optical center projection point of the current image according to the feature transformation matrix.
  • the feature transformation matrix is a homography matrix.
  • Optionally, the aircraft landing method further includes: when the mean of the distance errors is not less than the preset distance threshold, replacing the feature transformation matrix, changing it from the homography matrix to the basic matrix (i.e., the fundamental matrix), and performing again the step of obtaining the projection matching feature points of the second matching feature points of the matching image on the current image according to the feature transformation matrix.
  • Optionally, after the feature transformation matrix has been replaced from the homography matrix with the basic matrix, if it is determined that the mean of the distance errors is still not less than the preset distance threshold, the step of acquiring the current image of the preset take-off and landing point currently collected by the aircraft and the matching image of the preset take-off and landing point collected during the take-off phase is performed again.
  • An embodiment of the present invention also provides an aircraft landing device, including: a current image acquisition module for acquiring a current image of a preset take-off and landing point collected by the aircraft during landing; a matching image acquisition module for acquiring the matching image of the preset take-off and landing point collected by the aircraft during the take-off phase and the pose information of the aircraft when the matching image was collected; a matching feature point pair generation module for performing feature matching on the current image and the matching image to obtain matching feature point pairs; a feature transformation matrix calculation module for calculating the feature transformation matrix between the current image and the matching image according to the matching feature point pairs; an optical center projection point generation module for obtaining, according to the feature transformation matrix, the optical center projection point of the matching image's optical center point on the current image; and a landing instruction adjustment module for controlling the aircraft to land at the preset take-off and landing point according to the coordinates of the optical center projection point.
  • the matching image obtaining module is specifically configured to obtain the matching image of the preset take-off and landing point and the pose information of the aircraft collected by the aircraft at every preset flying height during the take-off phase.
  • Optionally, the matching image acquisition module is further configured to stop collecting matching images and aircraft pose information when the current flying height of the aircraft is greater than a preset height.
  • Optionally, the landing instruction adjustment module includes: a two-dimensional pixel coordinate acquisition sub-module for acquiring the two-dimensional pixel coordinates of the optical center projection point of the matching image's optical center point on the current image; a calculation sub-module for calculating the three-dimensional coordinates of the optical center projection point in the world coordinate system according to the two-dimensional pixel coordinates; and a control sub-module for controlling the aircraft to land at the preset take-off and landing point according to the three-dimensional coordinates.
  • Optionally, the calculation sub-module includes: an information acquisition unit for acquiring the pose and ground height of the aircraft when the matching image was captured; a compensated yaw angle calculation unit for calculating, according to the matching feature point pairs, the compensated yaw angle of the aircraft when the current image was captured; a compensated ground height calculation unit for calculating the compensated ground height of the current image according to the feature transformation matrix, the pose and the ground height; and a three-dimensional coordinate update unit configured to update the three-dimensional coordinates of the optical center projection point according to the compensated yaw angle and the compensated ground height to obtain the three-dimensional coordinates.
  • Optionally, the compensated yaw angle calculation unit includes: a descriptor generation subunit configured to obtain the first descriptors of the current image and the second descriptors of the matching image according to the matching feature point pairs; a deviation angle calculation subunit for calculating the deviation angle between the main direction of each first descriptor and the main direction of the corresponding second descriptor; and a compensated yaw angle calculation subunit for calculating the mean of the deviation angles of the matching feature point pairs and determining that mean as the compensated yaw angle.
  • Optionally, the compensated ground height calculation unit includes: a decomposition subunit for decomposing the feature transformation matrix to obtain the relative pose of the current image with respect to the matching image; and a compensated ground height generation subunit for obtaining the compensated ground height of the current image according to the pose, the ground height and the relative pose.
  • Optionally, the aircraft landing device further includes: a projection matching feature point generation module configured to obtain, according to the feature transformation matrix, the projection matching feature points of the second matching feature points of the matching image on the current image; and a judgment module configured to respectively calculate the distance error between each projection matching feature point and the corresponding first matching feature point and determine whether the mean of the distance errors is less than a preset distance threshold, returning to the optical center projection point generation module when the mean of the distance errors is less than the preset distance threshold.
  • the feature transformation matrix is a homography matrix.
  • Optionally, the aircraft landing device further includes a conversion matrix replacement module: when the mean of the distance errors is not less than the preset distance threshold, it replaces the feature transformation matrix, changing it from the homography matrix to the basic matrix, and returns to the projection matching feature point generation module.
  • Optionally, the aircraft landing device further includes a second judgment module for respectively calculating the distance error between each projection matching feature point and the corresponding first matching feature point and determining whether the mean of the distance errors is less than the preset distance threshold; when it determines that the mean is not less than the preset distance threshold, it returns to the current image acquisition module.
  • An embodiment of the present invention also provides a non-transitory computer-readable storage medium storing computer instructions, the computer instructions being used to cause a computer to execute the aforementioned aircraft landing method.
  • An embodiment of the present invention further provides a computer device, including: at least one processor; and a memory communicatively connected with the at least one processor, wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor so that the at least one processor performs the aforementioned aircraft landing method.
  • An embodiment of the present invention also provides an aircraft, including: an aircraft body, an image acquisition device and a flight controller, the image acquisition device and the flight controller being provided on the aircraft body. The image acquisition device is used to collect the current image of the preset take-off and landing point during landing, to collect the matching image of the preset take-off and landing point during the take-off phase together with the pose information of the aircraft when the matching image is collected, and to send the current image, the matching image and the pose information to the flight controller. The flight controller is used to receive the current image, the matching image and the pose information, and to control the aircraft to land at the preset take-off and landing point using the aforementioned aircraft landing method.
  • the embodiments of the present invention provide an aircraft landing method and device.
  • The method performs feature matching between the current image of the preset take-off and landing point, collected during the landing process, and the matching image collected during the take-off phase, to obtain matching feature point pairs; calculates the feature transformation matrix between the current image and the matching image; obtains, through the feature transformation matrix, the optical center projection point of the matching image's optical center point on the current image; and adjusts the aircraft's landing toward the preset take-off and landing point according to the coordinates of the optical center projection point. A method of adjusting the aircraft's landing instructions through feature matching is thereby realized, guiding the aircraft to land at the take-off and landing point and improving landing accuracy.
  • Moreover, the landing instruction can be adjusted whenever the collected images are successfully matched, which avoids real-time tracking of the landing point and reduces the difficulty of accurate landing.
  • An embodiment of the present invention also provides an aircraft. Accurate control of the landing is realized and the aircraft is guided to land at the take-off and landing point, improving landing accuracy; the landing instruction can be adjusted by successfully matching the collected images, which avoids real-time tracking of the landing point and reduces the difficulty of accurate landing.
  • Fig. 1 is a flowchart of an aircraft landing method in an embodiment of the present invention;
  • Fig. 2 is a flowchart of step S6 of the aircraft landing method shown in Fig. 1;
  • Fig. 3 is a specific flowchart of step S623 in an embodiment of the present invention;
  • Fig. 4 is a flowchart of another embodiment of the aircraft landing method;
  • Fig. 5 is a flowchart of still another embodiment of the aircraft landing method;
  • Fig. 6 is a schematic structural diagram of an aircraft landing device in an embodiment of the present invention;
  • Fig. 7 is a schematic structural diagram of a computer device in an embodiment of the present invention;
  • Fig. 8 is a schematic structural diagram of an aircraft in an embodiment of the present invention.
  • the embodiment of the present invention provides an aircraft landing method.
  • the aircraft landing method includes:
  • Step S1 Obtain the current image of the preset take-off and landing point collected during the landing process of the aircraft.
  • Step S2 Obtain the matching images of the preset take-off and landing points collected by the aircraft during the take-off phase and the pose information of the aircraft when the matching images are collected by the aircraft.
  • The aircraft is equipped with a camera that collects images of the preset take-off and landing point according to preset rules during take-off and landing. For example, a downward-looking camera can capture gray-scale images of the preset take-off and landing point at a fixed image acquisition interval; when an image is captured during the take-off phase, the pose information of the aircraft is recorded from the shooting parameters of the downward-looking camera.
  • Step S3 Perform feature matching on the current image and the matched image to obtain each matching feature point pair.
  • A feature matching algorithm, such as the scale-invariant feature transform (SIFT), can be used to match features between the two images.
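The matching step above can be sketched as a nearest-neighbour search over descriptor arrays with Lowe's ratio test (a minimal NumPy illustration; the descriptor extraction itself, e.g. via a SIFT implementation, is assumed to have run already, and the function name is illustrative):

```python
import numpy as np

def match_descriptors(desc_cur, desc_match, ratio=0.75):
    """Nearest-neighbour matching with Lowe's ratio test.

    desc_cur:   (N, D) descriptors from the current image.
    desc_match: (M, D) descriptors from the matching image, M >= 2.
    Returns (i, j) pairs: feature i in the current image matched
    to feature j in the matching image.
    """
    pairs = []
    for i, d in enumerate(desc_cur):
        dists = np.linalg.norm(desc_match - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # keep only unambiguous matches (best clearly beats runner-up)
        if dists[best] < ratio * dists[second]:
            pairs.append((i, int(best)))
    return pairs
```

The ratio test discards ambiguous correspondences, which matters here because a wrong pair would corrupt the feature transformation matrix estimated in the next step.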
  • Step S4 Calculate the feature transformation matrix between the current image and the matched image according to each matching feature point pair.
  • The feature transformation matrix may be a homography matrix or a basic matrix (fundamental matrix).
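For intuition, a homography can be estimated from four or more matching feature point pairs with the direct linear transform, sketched below (a bare-bones version without the normalization and RANSAC outlier rejection a production estimator would add):

```python
import numpy as np

def estimate_homography(pts_src, pts_dst):
    """DLT estimate of H such that pts_dst ~ H @ pts_src (homogeneous).

    pts_src, pts_dst: (N, 2) arrays of matched pixel coordinates, N >= 4,
    no three of which are collinear.
    """
    rows = []
    for (x, y), (u, v) in zip(pts_src, pts_dst):
        # each correspondence contributes two linear constraints on h
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # h is the right singular vector for the smallest singular value
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the projective scale
```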
  • Step S5 According to the feature transformation matrix, the optical center point of the matched image is projected to the optical center projection point of the current image.
  • The optical center point is the pixel on the image corresponding to the camera's optical center during image capture.
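Projecting the optical center point through the feature transformation matrix is a homogeneous multiply followed by dehomogenization (the function name is illustrative):

```python
import numpy as np

def project_through(H, pt):
    """Map pixel point pt = (u, v) from one image into the other via H."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]  # back to inhomogeneous pixel coordinates
```

With H mapping the matching image to the current image, `project_through(H, optical_center)` yields the optical center projection point whose coordinates drive step S6.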
  • Step S6 Control the aircraft to land to the preset take-off and landing point according to the coordinates of the optical center projection point.
  • the landing instruction of the aircraft is adjusted accordingly, so as to control the aircraft to accurately land to the preset take-off and landing point.
  • The aircraft landing method described above performs feature matching between the current image of the preset take-off and landing point, collected during landing, and the matching image collected during the take-off phase to obtain matching feature point pairs; calculates the feature transformation matrix between the current image and the matching image; obtains the optical center projection point of the matching image's optical center point on the current image; and adjusts the aircraft's landing toward the preset take-off and landing point according to the coordinates of the optical center projection point. Adjusting the landing instructions through feature matching guides the aircraft to the take-off and landing point and improves landing accuracy.
  • Moreover, the landing instruction can be adjusted whenever the collected images are successfully matched, which avoids real-time tracking of the landing point and reduces the difficulty of accurate landing.
  • step S1 the current image of the preset take-off and landing point collected during the landing of the aircraft is acquired.
  • The image acquisition equipment on the aircraft, such as a camera, collects current images of the preset take-off and landing point at a preset time interval, in preparation for judging the positional relationship between the aircraft and the preset take-off and landing point.
  • the matching image of the preset take-off and landing point collected by the aircraft during the take-off phase and the pose information of the aircraft when the matching image is collected by the aircraft are acquired.
  • The matching images of the preset take-off and landing point and the pose information of the aircraft are acquired at every preset flying height, and collection of the matching images and pose information stops when the current flying height of the aircraft exceeds the preset height.
  • The aircraft collects images of the preset take-off and landing point at different heights during the take-off phase. Any one of these images can be selected as the matching image; for example, an image with high definition and complete take-off and landing point information can be chosen from the captured images.
  • step S3 described above feature matching is performed on the current image and the matching image to obtain each matching feature point pair.
  • the SIFT algorithm is used to perform feature matching on the above two images to obtain multiple sets of matching feature points.
  • the SIFT algorithm can be implemented by using existing technology, and details are not described herein again.
  • the feature transformation matrix between the current image and the matching image is calculated according to each pair of matching feature points.
  • The feature transformation matrix may be either a homography matrix or a basic matrix (fundamental matrix).
  • The basic matrix may be used directly as the feature transformation matrix. Alternatively, the homography matrix is set as the feature transformation matrix first, and only when the homography matrix does not meet the preset condition is the feature transformation matrix updated to the basic matrix, ensuring the best landing accuracy.
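In the claims, the "preset condition" is the mean distance between the projected second matching feature points and their paired first matching feature points. A sketch of that acceptance check (the threshold value here is illustrative, not from the patent):

```python
import numpy as np

def mean_reprojection_error(H, pts_match, pts_cur):
    """Mean pixel distance between H-projected matching-image points and
    their paired current-image points."""
    pts_h = np.hstack([pts_match, np.ones((len(pts_match), 1))])
    proj = (H @ pts_h.T).T
    proj = proj[:, :2] / proj[:, 2:3]  # dehomogenize
    return float(np.mean(np.linalg.norm(proj - pts_cur, axis=1)))

def homography_acceptable(H, pts_match, pts_cur, threshold_px=3.0):
    """True if the homography explains the matches well enough; otherwise
    the method falls back to the basic (fundamental) matrix."""
    return mean_reprojection_error(H, pts_match, pts_cur) < threshold_px
```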
  • the optical center point of the matched image is projected to the optical center projection point of the current image according to the feature transformation matrix.
  • Using the feature transformation matrix, a point on either image can be projected onto the other image to obtain its projection point in that image.
  • step S6 controlling the aircraft to land to a preset take-off and landing point according to the coordinates of the optical center projection point, specifically includes:
  • Step S61 Obtain the two-dimensional pixel coordinates of the projection of the matched image's optical center point onto the current image.
  • The optical center point of the matched image is projected onto the current image; the two-dimensional pixel coordinates of this projection point in the current image reflect the deviation between the aircraft and the preset take-off and landing point.
  • Step S62 According to the two-dimensional pixel coordinates, calculate the three-dimensional coordinates of the optical center projection point in a horizontal world coordinate system with the aircraft as the origin.
  • The optical center projection point generally deviates from the optical center point of the current image; this deviation reflects the current flight state of the aircraft.
  • The three-dimensional world coordinates of the optical center projection point, with the aircraft as the origin, are obtained by calculation; these coordinates reflect the distance deviation between the aircraft and the preset take-off and landing point.
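Under a pinhole model with the camera looking straight down, the pixel deviation scales to a metric ground offset by ground height divided by focal length. A simplified sketch of this back-projection (the function and parameter names are illustrative; `focal_px` and the principal point come from camera calibration, and lens distortion is ignored):

```python
import numpy as np

def pixel_to_ground_offset(proj_pt, principal_pt, focal_px, ground_height_m):
    """Convert the pixel deviation of the optical-center projection point
    from the current image's principal point into a metric offset on the
    ground plane, in an aircraft-centred frame."""
    du = proj_pt[0] - principal_pt[0]
    dv = proj_pt[1] - principal_pt[1]
    scale = ground_height_m / focal_px  # metres per pixel at that height
    return np.array([du * scale, dv * scale, ground_height_m])
```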
  • the landing instructions are adjusted accordingly to ensure the accurate landing of the aircraft.
  • Step S63 Control the aircraft to land to the preset take-off and landing point according to the three-dimensional coordinates.
  • The horizontal relative displacement between the aircraft and the preset take-off and landing point is obtained from the three-dimensional coordinates; steps S1 to S6 are executed continuously, and the aircraft's landing is controlled using this horizontal relative displacement.
  • The horizontal relative displacement is used as a position closed loop to constrain the aircraft to land at the preset take-off and landing point.
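The position loop itself is not detailed in the text; a minimal proportional controller over the horizontal relative displacement might look like this (gain and speed limit are illustrative assumptions, not from the patent):

```python
import numpy as np

def horizontal_velocity_command(offset_xy_m, kp=0.5, v_max_mps=1.0):
    """Velocity command that drives the aircraft over the take-off point.

    offset_xy_m: horizontal displacement from aircraft to landing point (m).
    The command is proportional to the offset and clamped to v_max_mps.
    """
    v = kp * np.asarray(offset_xy_m, dtype=float)
    speed = float(np.linalg.norm(v))
    if speed > v_max_mps:
        v *= v_max_mps / speed  # clamp magnitude, keep direction
    return v
```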
  • the aforementioned step S62 specifically includes:
  • Step S621 Obtain the pose and ground height of the aircraft at the time the matching image was captured.
  • When the matching images of the preset take-off and landing point are collected during the take-off phase, the current flying height and the corresponding pose information of the aircraft are recorded.
  • Step S622 Calculate the compensated yaw angle of the aircraft when shooting the current image according to each matching feature point pair.
  • During the flight to the preset take-off and landing point, the aircraft accumulates a certain angular deviation owing to its flight attitude and other factors. This deviation can be compensated by calculating the compensated yaw angle of the current image, further improving landing accuracy.
  • step S622 specifically includes:
  • Step S6221 Obtain the first descriptor of the current image and the second descriptor of the matched image according to the pair of matching feature points.
  • Each matching feature point pair consists of a first matching feature point located in the current image and a second matching feature point located in the matching image.
  • Each matching feature point is assigned one or more main directions.
  • The feature vector along the main direction of each matching feature point is taken as its descriptor; the first descriptor and the second descriptor correspond to the first matching feature point and the second matching feature point respectively.
  • Step S6222 Calculate the deviation angle between the main direction of the first descriptor and the main direction of the second descriptor. In practical applications, each pair of matching feature points will produce a deviation angle.
  • Step S6223 Calculate the mean deviation angle of the deviation angle of each matching feature point pair, and determine the mean deviation angle as the compensation yaw angle.
  • The average of the deviation angles of the matching feature point pairs is used as the compensated yaw angle, compensating the angular deviation of the aircraft relative to the preset take-off and landing point during landing.
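Steps S6221 to S6223 can be sketched as below. A circular mean is used instead of a plain arithmetic mean (a small refinement over the plain average described above) so that orientations wrapping around 360°, e.g. 359° vs 1°, do not skew the result:

```python
import numpy as np

def compensated_yaw_deg(main_dirs_cur, main_dirs_match):
    """Circular mean of per-pair main-direction differences, in degrees.

    main_dirs_cur[i] and main_dirs_match[i] are the descriptor main
    directions of the i-th matching feature point pair.
    """
    diff = np.deg2rad(np.asarray(main_dirs_cur, dtype=float)
                      - np.asarray(main_dirs_match, dtype=float))
    # average on the unit circle to handle wrap-around correctly
    mean_angle = np.arctan2(np.mean(np.sin(diff)), np.mean(np.cos(diff)))
    return float(np.rad2deg(mean_angle))
```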
  • Step S623: Calculate the compensated ground height of the current image according to the feature transformation matrix, the pose, and the height above ground.
  • the aircraft's flight altitude drops sharply during landing, so there is a certain deviation in the positioning of the aircraft's height above ground (flight altitude), to which the aircraft's landing instructions must refer.
  • a deviation in the height above ground affects the accuracy of the landing; it is therefore necessary to further improve landing accuracy by compensating the height above ground.
  • step S623 specifically includes:
  • Step S6231: Decompose the feature transformation matrix to obtain the relative pose of the current image with respect to the matching image.
  • the pose of the current position relative to the position where the matching image was captured can be obtained by decomposing the homography matrix or the fundamental matrix; the decomposition can be realized with existing technology and is not repeated here.
  • Step S6232: Obtain the compensated ground height of the current image according to the pose, the height above ground, and the relative pose.
  • the relative height difference between the aircraft's current position and the position where the matching image was collected can be obtained from the above pose and relative pose; the sum of this height difference and the height above ground at the position where the matching image was collected is the compensated ground height at the aircraft's current position.
  • the difference between the compensated height and the aircraft's current altitude obtained by GPS positioning or other altitude-positioning methods can be used to adjust the aircraft's landing instructions and improve the accuracy of the landing.
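The arithmetic of step S6232 and the altitude correction described above can be sketched as follows. Function and parameter names are illustrative assumptions; only the sum and difference operations come from the description.

```python
def compensated_ground_height(match_ground_height, relative_height_diff):
    """Step S6232 sketch: the compensated ground height is the sum of
    the height above ground recorded when the matching image was captured
    and the relative height difference recovered by decomposing the
    feature transformation matrix (step S6231)."""
    return match_ground_height + relative_height_diff

def altitude_correction(compensated_height, gps_height):
    """Difference used to adjust the landing command: compensated height
    minus the altitude obtained by GPS or another altitude-positioning
    method, per the description above."""
    return compensated_height - gps_height
```

For example, if the matching image was taken at 10 m above ground and the decomposed relative pose indicates the aircraft is now 2.5 m higher, the compensated height is 12.5 m, and a GPS reading of 12.0 m would yield a +0.5 m correction.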
  • Step S624: Update the three-dimensional coordinates of the optical center projection point according to the compensated yaw angle and the compensated ground height to obtain the updated three-dimensional coordinates.
  • by calculating the compensated yaw angle and compensating the height above ground, the three-dimensional coordinates of the optical center projection point are adjusted accordingly, yielding a flight state that accurately reflects the aircraft's current landing process.
  • the landing instructions of the aircraft are then adjusted accordingly to improve the accuracy of the landing.
  • the aforementioned aircraft landing method further includes:
  • Step S7 Obtain the corresponding projection matching feature points of each second matching feature point of the matching image on the current image according to the feature transformation matrix.
  • the accuracy of the image matching is assessed by the distance error between each projected matching feature point on the current image and the corresponding matching feature point.
  • Step S8: Calculate the distance errors between the projection matching feature points and the corresponding first matching feature points, and determine whether the average of the distance errors is less than a preset distance threshold.
  • the distance error between the projected matching feature point and the matching feature point in each feature point pair is calculated, and whether the image matching meets the requirements is determined by judging whether the average of these distance errors is less than the preset distance threshold.
  • when the average of the distance errors is less than the preset distance threshold, perform the above step S5.
  • when the average of the distance errors is not less than the preset distance threshold, return to the above step S1: reacquire the aircraft's current image or, according to the actual situation, reselect another image of the preset take-off and landing point taken during the take-off phase as the matching image.
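Steps S7 and S8 amount to projecting each second matching feature point through the feature transformation matrix and thresholding the mean reprojection distance. A minimal sketch, assuming a 3x3 homography given as nested lists and Euclidean pixel distance (names are illustrative, not from the disclosure):

```python
import math

def project(H, pt):
    """Project a 2-D point with a 3x3 feature transformation matrix H,
    using homogeneous coordinates (step S7)."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def matching_acceptable(H, second_pts, first_pts, dist_threshold):
    """Step S8 sketch: project each second matching feature point of the
    matching image onto the current image, average the distances to the
    corresponding first matching feature points, and compare the mean
    against the preset distance threshold."""
    errors = [math.dist(project(H, q), p)
              for q, p in zip(second_pts, first_pts)]
    return sum(errors) / len(errors) < dist_threshold
```

When `matching_acceptable` returns `True` the method proceeds to step S5; otherwise it returns to step S1 (or falls back to the fundamental matrix, as described below for steps S9 and S10).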
  • the above steps S1 to S8 can be continuously executed during the entire landing process of the aircraft to continuously adjust the landing instructions of the aircraft until the aircraft lands to the preset take-off and landing point, for example:
  • the current image is collected at preset time intervals, and each time the aircraft acquires a current image, the above steps S1 to S8 are executed, until the aircraft lands.
  • the aforementioned aircraft landing method further includes:
  • Step S9: Replace the feature transformation matrix, i.e. the homography matrix, with the fundamental matrix, and repeat the above step S7.
  • Step S10: After the feature transformation matrix has been replaced by the fundamental matrix, perform the above step S5 when the average of the distance errors is judged to be less than the preset distance threshold; when the average of the distance errors is still not less than the preset distance threshold, return to the above step S1.
  • the homography matrix computes the feature transformation between the two images under a plane assumption; in practical applications, the ground plane is used as the hypothetical plane when calculating the homography matrix. If, with the homography matrix, the average of the above distance errors is not less than the preset distance threshold, the plane assumption is invalid and the feature transformation matrix must be updated to the fundamental matrix. Since the homography matrix is more accurate than the fundamental matrix, in the embodiment of the present invention the homography matrix is preferentially set as the feature transformation matrix; only when the homography matrix does not meet the preset condition is the feature transformation matrix updated to the fundamental matrix, ensuring optimal landing accuracy of the aircraft.
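The homography-first fallback logic of steps S8 to S10 can be summarized as a small decision function. This is a control-flow sketch only; the return labels and the idea of precomputing both mean errors are illustrative assumptions.

```python
def choose_transform(avg_error_homography, avg_error_fundamental,
                     dist_threshold):
    """Fallback sketch: prefer the homography matrix (plane assumption);
    if its mean reprojection error is not below the preset threshold,
    switch to the fundamental matrix; if that also fails, signal that a
    new current image (or another matching image) must be acquired,
    i.e. return to step S1."""
    if avg_error_homography < dist_threshold:
        return "homography"      # plane assumption holds, proceed to S5
    if avg_error_fundamental < dist_threshold:
        return "fundamental"     # plane assumption failed, S9 fallback
    return "reacquire"           # S10: both failed, return to S1
```

The ordering encodes the stated preference: the homography is tried first because it is more accurate when the ground-plane assumption holds.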
  • the aircraft landing method performs feature matching between the current image of the preset take-off and landing point collected during the landing process and the matching image collected during the take-off phase to obtain matching feature point pairs.
  • from the matching feature point pairs, the feature transformation matrix between the current image and the matching image is calculated; the optical center projection point of the matching image's optical center point on the current image is obtained through this matrix, and the aircraft is controlled to land at the preset take-off and landing point according to the coordinates of the optical center projection point.
  • the embodiment of the present invention provides an aircraft landing device.
  • the aircraft landing device includes:
  • the current image acquisition module 1 is used to acquire the current image of the preset take-off and landing point collected during the landing of the aircraft.
  • the detailed functions of the current image acquisition module 1 refer to the related description of step S1 in the foregoing embodiment.
  • the matching image acquisition module 2 is used to acquire the matching image of the preset take-off and landing point collected by the aircraft during the take-off phase and the pose information of the aircraft when the matching image was collected.
  • for detailed functions of the matching image acquisition module 2, refer to the related description of step S2 in the foregoing embodiment.
  • the matching feature point pair generation module 3 is used to perform feature matching on the current image and the matching image to obtain each matching feature point pair.
  • for detailed functions of the matching feature point pair generation module 3, refer to the related description of step S3 in the foregoing embodiment.
  • the feature transformation matrix calculation module 4 is used to calculate the feature transformation matrix between the current image and the matching image according to each matching feature point pair. For detailed functions of the feature transformation matrix calculation module 4, refer to the related description of step S4 in the foregoing embodiment.
  • the optical center projection point generating module 5 is used to obtain, according to the feature transformation matrix, the optical center projection point at which the optical center point of the matching image projects onto the current image.
  • for detailed functions of the optical center projection point generating module 5, refer to the related description of step S5 in the foregoing embodiment.
  • the landing instruction adjustment module 6 is used to control the aircraft to land to the preset take-off and landing point according to the coordinates of the optical center projection point. For detailed functions of the landing instruction adjustment module 6, refer to the related description of step S6 in the foregoing embodiment.
  • the aircraft landing device performs feature matching between the current image of the preset take-off and landing point collected during landing and the matching image collected during the take-off phase to obtain matching feature point pairs.
  • from the matching feature point pairs, the feature transformation matrix between the current image and the matching image is calculated; the optical center projection point of the matching image's optical center point on the current image is obtained through this matrix, and the aircraft is controlled to land at the preset take-off and landing point according to the coordinates of the optical center projection point.
  • a method for adjusting the landing instructions of the aircraft is thus realized, guiding the aircraft to land at the take-off and landing point and thereby improving landing accuracy. The landing instructions can be adjusted merely by successfully matching the collected images, which avoids real-time tracking of the landing point and reduces the difficulty of accurate landing of the aircraft.
  • the above-mentioned matching image acquisition module 2 is specifically configured to acquire, during the take-off phase and at every preset flight-height interval, the matching image of the preset take-off and landing point collected by the aircraft and the aircraft's pose information.
  • for detailed functions of the matching image acquisition module 2, refer to the related description of step S2 in the foregoing embodiment.
  • when the current flight altitude of the aircraft exceeds the preset altitude, the matching image acquisition module 2 is further configured to stop collecting matching images and pose information of the aircraft.
  • for detailed functions of the matching image acquisition module 2, refer to the related description of step S2 in the foregoing embodiment.
  • the landing instruction adjustment module 6 includes: a two-dimensional pixel coordinate acquisition sub-module, used to acquire the two-dimensional pixel coordinates of the optical center projection point at which the optical center point of the matching image projects onto the current image.
  • the calculation sub-module is configured to calculate the three-dimensional coordinates of the optical center projection point in the world coordinate system according to the two-dimensional pixel coordinates.
  • for detailed functions of the calculation sub-module, refer to the related description of step S62 in the foregoing embodiment.
  • the control sub-module is used to control the aircraft to land at the preset take-off and landing point according to the three-dimensional coordinates.
  • for detailed functions of the control sub-module, refer to the related description of step S63 in the foregoing embodiment.
  • the calculation sub-module includes: an information acquisition unit, configured to acquire the pose and height above ground of the aircraft when the aircraft captured the matching image.
  • the compensated yaw angle calculation unit is configured to calculate, according to the matching feature point pairs, the compensated yaw angle of the aircraft when the current image was taken.
  • for detailed functions of the compensated yaw angle calculation unit, refer to the related description of step S622 in the foregoing embodiment.
  • the compensation ground height calculation unit is configured to calculate the compensation ground height of the current image according to the feature transformation matrix, the pose and the ground height. For detailed functions of the compensation ground height calculation unit, refer to the related description of step S623 in the above embodiment.
  • the three-dimensional coordinate update unit is configured to update the three-dimensional coordinates of the optical center projection point according to the compensated yaw angle and the compensated ground height to obtain the updated three-dimensional coordinates.
  • for detailed functions of the three-dimensional coordinate update unit, refer to the related description of step S624 in the foregoing embodiment.
  • the compensated yaw angle calculation unit includes: a descriptor generating subunit, configured to obtain the first descriptor of the current image and the second descriptor of the matching image according to the matching feature point pairs.
  • the deviation angle calculation subunit is configured to calculate the deviation angle between the main direction of the first descriptor and the main direction of the second descriptor.
  • for detailed functions of the deviation angle calculation subunit, refer to the related description of step S6222 in the foregoing embodiment.
  • the compensated yaw angle calculation subunit is used to calculate the mean of the deviation angles of the matching feature point pairs, and determine this mean deviation angle as the compensated yaw angle.
  • for detailed functions of the compensated yaw angle calculation subunit, refer to the related description of step S6223 in the foregoing embodiment.
  • the compensated ground height calculation unit includes: a decomposition subunit, used to decompose the feature transformation matrix to obtain the relative pose of the current image relative to the matching image.
  • the compensated ground height generating subunit is used to obtain the compensated ground height of the current image according to the pose, the height above ground, and the relative pose.
  • for detailed functions of the compensated ground height generating subunit, refer to the related description of step S6232 in the foregoing embodiment.
  • the aircraft landing device further includes: a projection matching feature point generating module, configured to obtain, according to the feature transformation matrix, the projection matching feature points on the current image corresponding to the second matching feature points of the matching image.
  • the judgment module is configured to calculate the distance error between each projection matching feature point and the corresponding first matching feature point, and determine whether the average of the distance errors is less than a preset distance threshold; when the average of the distance errors is less than the preset distance threshold, return to the optical center projection point generating module.
  • the feature transformation matrix is a homography matrix.
  • the aircraft landing device further includes: a conversion matrix replacement module; when the average of the distance errors is not less than the preset distance threshold, the conversion matrix replacement module is used to replace the feature transformation matrix, i.e. the homography matrix, with the fundamental matrix, and return to the projection matching feature point generation module.
  • the aircraft landing device further includes: a second judgment module, used to calculate the distance errors between the projection matching feature points and the corresponding first matching feature points, determine whether the average of the distance errors is less than the preset distance threshold, and return to the current image acquisition module 1 when the average of the distance errors is determined to be not less than the preset distance threshold.
  • the aircraft landing device performs feature matching between the current image of the preset take-off and landing point collected during landing and the matching image collected during the take-off phase to obtain matching feature point pairs.
  • from the matching feature point pairs, the feature transformation matrix between the current image and the matching image is calculated; the optical center projection point of the matching image's optical center point on the current image is obtained through this matrix, and the aircraft is controlled to land at the preset take-off and landing point according to the coordinates of the optical center projection point.
  • a method for adjusting the landing instructions of the aircraft is thus realized, guiding the aircraft to land at the take-off and landing point and thereby improving landing accuracy. The landing instructions can be adjusted merely by successfully matching the collected images, which avoids real-time tracking of the landing point and reduces the difficulty of accurate landing of the aircraft.
  • the embodiment of the present invention provides a non-transitory computer storage medium storing computer-executable instructions that can execute the aircraft landing method in any of the foregoing method embodiments, where the storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), a solid-state drive (SSD), or the like; the storage medium may also include a combination of the foregoing types of memories.
  • the program can be stored in a computer-readable storage medium; when the program is executed, it may include the procedures of the above method embodiments.
  • the storage medium can be a magnetic disk, an optical disk, a read-only memory (ROM) or a random access memory (RAM), etc.
  • An embodiment of the present invention provides a computer device.
  • a schematic structural diagram of the computer device is shown in FIG. 7.
  • the computer device includes one or more processors 410 and a memory 420.
  • one processor 410 is taken as an example.
  • the foregoing computer equipment may further include: an input device 430 and an output device 440.
  • the processor 410, the memory 420, the input device 430, and the output device 440 may be connected by a bus or in other ways. In FIG. 7, the connection by a bus is taken as an example.
  • the processor 410 may be a central processing unit (Central Processing Unit, CPU).
  • the processor 410 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or a combination of the above types of chips.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • the memory 420 can be used to store non-transitory software programs and non-transitory computer-executable programs and modules, such as the program instructions/modules corresponding to the aircraft landing method in the embodiments of the present application.
  • the processor 410 executes various functional applications and data processing of the server by running non-transitory software programs, instructions, and modules stored in the memory 420, that is, realizing the aircraft landing method of the foregoing method embodiment.
  • the memory 420 may include a storage program area and a storage data area.
  • the storage program area may store an operating system and an application program required by at least one function; the storage data area may store data created according to the use of the processing device of the aircraft landing method, etc. .
  • the memory 420 may include a high-speed random access memory, and may also include a non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or other non-transitory solid-state storage devices.
  • the memory 420 may optionally include memories remotely provided with respect to the processor 410, and these remote memories may be connected to the aircraft landing device via a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • the input device 430 may receive inputted numeric or character information, and generate key signal inputs related to user settings and function control related to the processing device of the aircraft landing operation.
  • the output device 440 may include a display device such as a display screen.
  • One or more modules are stored in the memory 420, and when executed by one or more processors 410, the methods shown in FIGS. 1 to 5 are executed.
  • the embodiment of the present invention also provides an aircraft.
  • a schematic structural diagram of the aircraft is shown in FIG. 8.
  • the aircraft includes: an aircraft body 101, an image acquisition device 102, and a flight controller 103, where the image acquisition device 102 and the flight controller 103 are arranged on the aircraft body 101.
  • the image acquisition device 102 is used to acquire the current image of the preset take-off and landing point during the landing process of the aircraft, to acquire the matching image of the preset take-off and landing point and the pose information of the aircraft when the aircraft collects the matching image, and to send the current image, the matching image, and the pose information to the flight controller 103;
  • the flight controller 103 is configured to receive the current image, the matching image, and the pose information, and use the aircraft landing method as in the foregoing embodiment to control the aircraft to land to the preset take-off and landing point.
  • the flight controller 103 controlling the aircraft to land refer to the relevant description of the aircraft landing method in the foregoing embodiment, and details are not described herein again.
  • the aircraft provided by the embodiments of the present invention realizes precise control of the landing by providing the image acquisition device and the flight controller on the aircraft, guiding the aircraft to land at the take-off and landing point and thereby improving landing accuracy.
  • the landing instructions can be adjusted merely by successfully matching the collected images, avoiding real-time tracking of the landing point and reducing the difficulty of accurate landing of the aircraft.

Abstract

An aircraft landing method and apparatus, and a storage medium, computer device, and aircraft implementing the method. The method includes: acquiring a current image of a preset take-off and landing point collected by the aircraft during landing; acquiring a matching image of the preset take-off and landing point collected by the aircraft during the take-off phase and the pose information of the aircraft; performing feature matching on the current image and the matching image to obtain matching feature point pairs; calculating a feature transformation matrix between the current image and the matching image from the matching feature point pairs; obtaining, from the feature transformation matrix, the optical center projection point at which the optical center point of the matching image projects onto the current image; and controlling the aircraft to land at the preset take-off and landing point according to the coordinates of the optical center projection point. The method can adjust the aircraft's landing commands and guide the aircraft to the take-off and landing point, improving landing accuracy; the landing commands can be adjusted merely by successfully matching the collected images, avoiding real-time tracking of the landing point and reducing the difficulty of accurate landing.

Description

Aircraft landing method and apparatus
This application claims priority to Chinese patent application No. 201910318700.8, filed with the Chinese Patent Office on April 19, 2019 and entitled "Aircraft landing method and apparatus", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of aircraft, and in particular to an aircraft landing method and apparatus.
Background
With the continuous development of aircraft, they are being applied in an ever wider range of fields. An aircraft with flight capability can land according to landing commands sent by a terminal, and some intelligent aircraft also offer automatic return-to-home or one-key landing. At present, the common landing approach is GPS (Global Positioning System) assisted landing: the GPS coordinates of a target area are sent to the aircraft, whose control system directs it to land in the target area and corrects its landing position using those coordinates. However, because of the limited accuracy of GPS coordinates and problems such as aircraft drift, it is difficult for an aircraft landing in this way to land precisely on the target; landing with GPS coordinates often carries a large landing error, which can reach several meters, and the approach is particularly unsuitable when the landing target area is small.
Summary
The technical problem to be solved by the present invention is to overcome the problems in the prior art where landing commands adjusted by GPS positioning and the like are used to guide an aircraft down: insufficient GPS coordinate accuracy and aircraft drift cause a large landing error and make precise landing in a small target area difficult.
An embodiment of the present invention provides an aircraft landing method, including: acquiring a current image of a preset take-off and landing point collected by the aircraft during landing; acquiring a matching image of the preset take-off and landing point collected by the aircraft during the take-off phase and the pose information of the aircraft when the aircraft collected the matching image; performing feature matching on the current image and the matching image to obtain matching feature point pairs; calculating a feature transformation matrix between the current image and the matching image from the matching feature point pairs; obtaining, from the feature transformation matrix, the optical center projection point at which the optical center point of the matching image projects onto the current image; and controlling the aircraft to land at the preset take-off and landing point according to the coordinates of the optical center projection point.
Optionally, acquiring the matching image of the preset take-off and landing point collected by the aircraft during the take-off phase and the pose information of the aircraft includes: during the take-off phase, at every preset flight-height interval, acquiring a matching image of the preset take-off and landing point collected by the aircraft and the aircraft's pose information.
Optionally, the aircraft landing method further includes: stopping the collection of matching images and of the aircraft's pose information when the current flight altitude of the aircraft exceeds a preset altitude.
Optionally, controlling the aircraft to land at the preset take-off and landing point according to the coordinates of the optical center projection point includes: acquiring the two-dimensional pixel coordinates of the optical center projection point at which the optical center point of the matching image projects onto the current image; calculating, from the two-dimensional pixel coordinates, the three-dimensional coordinates of the optical center projection point in the world coordinate system; and controlling the aircraft to land at the preset take-off and landing point according to the three-dimensional coordinates.
Optionally, calculating the three-dimensional coordinates of the optical center projection point in the world coordinate system from the two-dimensional pixel coordinates includes: acquiring the pose and height above ground of the aircraft when it captured the matching image; calculating, from the matching feature point pairs, the compensated yaw angle of the aircraft when it captured the current image; calculating the compensated ground height of the current image from the feature transformation matrix, the pose, and the height above ground; and updating the three-dimensional coordinates of the optical center projection point according to the compensated yaw angle and the compensated ground height to obtain the three-dimensional coordinates.
Optionally, calculating the compensated yaw angle of the current image from the matching feature point pairs includes: obtaining, from the matching feature point pairs, a first descriptor of the current image and a second descriptor of the matching image; calculating the deviation angle between the main direction of the first descriptor and the main direction of the second descriptor; and calculating the mean of the deviation angles of the matching feature point pairs and determining this mean deviation angle as the compensated yaw angle.
Optionally, calculating the compensated ground height of the current image from the feature transformation matrix, the pose, and the height above ground includes: decomposing the feature transformation matrix to obtain the relative pose of the current image with respect to the matching image; and obtaining the compensated ground height of the current image from the pose, the height above ground, and the relative pose.
Optionally, each matching feature point pair includes a first matching feature point located on the current image and a second matching feature point located on the matching image. After the feature transformation matrix between the current image and the matching image is calculated from the matching feature point pairs, and before the optical center projection point at which the optical center point of the matching image projects onto the current image is obtained from the feature transformation matrix, the aircraft landing method further includes: obtaining, from the feature transformation matrix, the projection matching feature points on the current image corresponding to the second matching feature points of the matching image; calculating the distance error between each projection matching feature point and the corresponding first matching feature point, and determining whether the average of the distance errors is less than a preset distance threshold; and, when the average of the distance errors is less than the preset distance threshold, performing the step of obtaining, from the feature transformation matrix, the optical center projection point at which the optical center point of the matching image projects onto the current image.
Optionally, the feature transformation matrix is a homography matrix.
Optionally, the aircraft landing method further includes: when the average of the distance errors is not less than the preset distance threshold, replacing the feature transformation matrix, i.e. the homography matrix, with a fundamental matrix, and re-executing the step of obtaining, from the feature transformation matrix, the projection matching feature points on the current image corresponding to the second matching feature points of the matching image. Optionally, after the feature transformation matrix has been replaced by the fundamental matrix, when the average of the distance errors is determined to be still not less than the preset distance threshold, the step of acquiring the current image of the preset take-off and landing point currently collected by the aircraft and the matching image of the preset take-off and landing point collected during the take-off phase is performed again.
An embodiment of the present invention further provides an aircraft landing apparatus, including: a current image acquisition module, configured to acquire a current image of a preset take-off and landing point collected by the aircraft during landing; a matching image acquisition module, configured to acquire a matching image of the preset take-off and landing point collected by the aircraft during the take-off phase and the pose information of the aircraft when the aircraft collected the matching image; a matching feature point pair generation module, configured to perform feature matching on the current image and the matching image to obtain matching feature point pairs; a feature transformation matrix calculation module, configured to calculate a feature transformation matrix between the current image and the matching image from the matching feature point pairs; an optical center projection point generation module, configured to obtain, from the feature transformation matrix, the optical center projection point at which the optical center point of the matching image projects onto the current image; and a landing command adjustment module, configured to control the aircraft to land at the preset take-off and landing point according to the coordinates of the optical center projection point.
Optionally, the matching image acquisition module is specifically configured to acquire, during the take-off phase and at every preset flight-height interval, a matching image of the preset take-off and landing point collected by the aircraft and the aircraft's pose information.
Optionally, when the current flight altitude of the aircraft exceeds a preset altitude, the matching image acquisition module is further configured to stop collecting matching images and the aircraft's pose information.
Optionally, the landing command adjustment module includes: a two-dimensional pixel coordinate acquisition sub-module, configured to acquire the two-dimensional pixel coordinates of the optical center projection point at which the optical center point of the matching image projects onto the current image; a calculation sub-module, configured to calculate, from the two-dimensional pixel coordinates, the three-dimensional coordinates of the optical center projection point in the world coordinate system; and a control sub-module, configured to control the aircraft to land at the preset take-off and landing point according to the three-dimensional coordinates.
Optionally, the calculation sub-module includes: an information acquisition unit, configured to acquire the pose and height above ground of the aircraft when it captured the matching image; a compensated yaw angle calculation unit, configured to calculate, from the matching feature point pairs, the compensated yaw angle of the aircraft when it captured the current image; a compensated ground height calculation unit, configured to calculate the compensated ground height of the current image from the feature transformation matrix, the pose, and the height above ground; and a three-dimensional coordinate update unit, configured to update the three-dimensional coordinates of the optical center projection point according to the compensated yaw angle and the compensated ground height to obtain the three-dimensional coordinates.
Optionally, the compensated yaw angle calculation unit includes: a descriptor generation subunit, configured to obtain, from the matching feature point pairs, a first descriptor of the current image and a second descriptor of the matching image; a deviation angle calculation subunit, configured to calculate the deviation angle between the main direction of the first descriptor and the main direction of the second descriptor; and a compensated yaw angle calculation subunit, configured to calculate the mean of the deviation angles of the matching feature point pairs and determine this mean deviation angle as the compensated yaw angle.
Optionally, the compensated ground height calculation unit includes: a decomposition subunit, configured to decompose the feature transformation matrix to obtain the relative pose of the current image with respect to the matching image; and a compensated ground height generation subunit, configured to obtain the compensated ground height of the current image from the pose, the height above ground, and the relative pose.
Optionally, the aircraft landing apparatus further includes: a projection matching feature point generation module, configured to obtain, from the feature transformation matrix, the projection matching feature points on the current image corresponding to the second matching feature points of the matching image; and a judgment module, configured to calculate the distance error between each projection matching feature point and the corresponding first matching feature point and determine whether the average of the distance errors is less than a preset distance threshold, returning to the optical center projection point generation module when the average of the distance errors is less than the preset distance threshold.
Optionally, the feature transformation matrix is a homography matrix.
Optionally, the aircraft landing apparatus further includes: a transformation matrix replacement module, configured to replace the feature transformation matrix, i.e. the homography matrix, with a fundamental matrix when the average of the distance errors is not less than the preset distance threshold, and return to the projection matching feature point generation module.
Optionally, after the feature transformation matrix has been replaced by the fundamental matrix, the aircraft landing apparatus further includes: a second judgment module, configured to calculate the distance errors between the projection matching feature points and the corresponding first matching feature points, determine whether the average of the distance errors is less than the preset distance threshold, and return to the current image acquisition module when the average of the distance errors is determined to be not less than the preset distance threshold.
An embodiment of the present invention further provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the above aircraft landing method.
An embodiment of the present invention further provides a computer device, including: at least one processor; and a memory communicatively connected to the at least one processor, the memory storing instructions executable by the at least one processor, the instructions being executed by the at least one processor so that the at least one processor performs the above aircraft landing method.
An embodiment of the present invention further provides an aircraft, including: an aircraft body, an image acquisition device, and a flight controller, the image acquisition device and the flight controller being arranged on the aircraft body. The image acquisition device is configured to collect a current image of a preset take-off and landing point during landing of the aircraft, to collect a matching image of the preset take-off and landing point during the take-off phase together with the pose information of the aircraft when the matching image was collected, and to send the current image, the matching image, and the pose information to the flight controller. The flight controller is configured to receive the current image, the matching image, and the pose information, and to use the above aircraft landing method to control the aircraft to land at the preset take-off and landing point.
The technical solution of the present invention has the following advantages:
An embodiment of the present invention provides an aircraft landing method and apparatus. The method performs feature matching between the current image of the preset take-off and landing point collected during landing and the matching image collected during the take-off phase, computes the feature transformation matrix between the current image and the matching image from the resulting matching feature point pairs, obtains through this matrix the optical center projection point of the matching image's optical center point on the current image, and adjusts the aircraft's descent to the preset take-off and landing point according to the coordinates of that projection point. Feature matching thus yields a way of adjusting the aircraft's landing commands that guides the aircraft to the take-off and landing point and improves landing accuracy: the landing commands can be adjusted merely by successfully matching the collected images, avoiding real-time tracking of the landing point and reducing the difficulty of precise landing.
An embodiment of the present invention provides an aircraft that achieves precise control of landing by providing an image acquisition device and a flight controller on the aircraft, guiding the aircraft to the take-off and landing point and thereby improving landing accuracy; the landing commands can be adjusted merely by successfully matching the collected images, avoiding real-time tracking of the landing point and reducing the difficulty of precise landing.
Brief Description of the Drawings
To explain the technical solutions of the specific embodiments of the present invention or of the prior art more clearly, the drawings needed for describing them are briefly introduced below. Obviously, the drawings below show some embodiments of the present invention; a person of ordinary skill in the art could derive other drawings from them without creative effort.
FIG. 1 is a flowchart of an aircraft landing method in an embodiment of the present invention;
FIG. 2 is a flowchart of step S6 of the aircraft landing method shown in FIG. 1;
FIG. 3 is a detailed flowchart of step S623 in an embodiment of the present invention;
FIG. 4 is a flowchart of another embodiment of an aircraft landing method in an embodiment of the present invention;
FIG. 5 is a flowchart of a further embodiment of an aircraft landing method in an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an aircraft landing apparatus in an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a computer device in an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of an aircraft in an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art without creative effort based on the embodiments of the present invention fall within the scope of protection of the present invention.
In the description of the present invention, it should be noted that the terms "first", "second", and "third" are used for descriptive purposes only and should not be understood as indicating or implying relative importance.
In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict.
An embodiment of the present invention provides an aircraft landing method. As shown in FIG. 1, the aircraft landing method includes:
Step S1: acquiring a current image of a preset take-off and landing point collected by the aircraft during landing.
Step S2: acquiring a matching image of the preset take-off and landing point collected by the aircraft during the take-off phase and the pose information of the aircraft when it collected the matching image. In practical applications, the aircraft carries a camera and can collect images of the preset take-off and landing point during take-off and landing according to preset rules; for example, grayscale images of the preset take-off and landing point can be captured with a downward-looking camera at fixed image-collection times, and when images of the point are captured during take-off, the aircraft's pose information is obtained from the shooting parameters of the downward-looking camera and the like.
Step S3: performing feature matching on the current image and the matching image to obtain matching feature point pairs. Specifically, in the embodiment of the present invention a feature matching algorithm may be used to match the two images; it should be noted that the Scale-Invariant Feature Transform (SIFT) algorithm is used here to match the two images, but other feature matching algorithms may also be used in practical applications, and the present invention is not limited thereto.
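The pairing part of step S3 can be sketched independently of the detector. The patent names SIFT but does not fix the matching strategy, so the nearest-neighbour rule with Lowe's ratio test below is an illustrative assumption; descriptors are given simply as numeric vectors.

```python
import math

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Sketch of forming matching feature point pairs (step S3):
    for each descriptor in the current image (desc_a), find its nearest
    and second-nearest descriptors in the matching image (desc_b) by
    Euclidean distance, and accept the pair only if the nearest distance
    is clearly smaller than the second (ratio test).
    Returns a list of index pairs (i, j)."""
    pairs = []
    for i, d in enumerate(desc_a):
        dists = sorted((math.dist(d, e), j) for j, e in enumerate(desc_b))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            pairs.append((i, dists[0][1]))
    return pairs
```

A real SIFT pipeline would use 128-dimensional descriptors and an approximate nearest-neighbour index; the brute-force loop here only illustrates the pairing criterion.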
步骤S4:根据各匹配特征点对,计算当前图像与匹配图像之间的特征变换矩阵。具体地,该特征变换矩阵可以是单应矩阵或基础矩阵。
步骤S5:根据特征变换矩阵得到匹配图像的光心点投影至当前图像的光心投影点。具体地,该光心点为上述的照相机或摄像机在图像拍摄过程中摄像机光心所对应的图像上的像素点。
步骤S6:根据光心投影点的坐标控制飞行器降落至预设起降点。在实际应用中,当飞行器在上述预设起降点附近准备降落时,如果偏离起降点则上述的光心投影点会与上述当前图像的实际光心点在位置上产生偏差,因此,可以根据该光心投影点的坐标信息对该飞行器的降落指令进行相应的调整,进而控制飞行器可以精准降落至预设起降点。
通过上述步骤S1至步骤S6，上述的飞行器降落方法，通过将飞行器在降落过程中采集的预设起降点的当前图像与飞行器起飞阶段采集的匹配图像进行特征匹配，得到各匹配特征点对，并据此计算当前图像与匹配图像的特征变换矩阵，通过该特征变换矩阵得到匹配图像的光心点在当前图像上的光心投影点，并根据该光心投影点的坐标调整飞行器降落至预设起降点。从而利用特征匹配实现了一种飞行器降落指令的调整方法，指引飞行器降落至起降点，从而提高了飞行器的降落精确度，仅通过对采集图像的成功匹配即可对降落指令做出调整，避免了对降落点的实时跟踪，降低了飞行器精准降落的难度。
具体地,在一实施例中,上述的步骤S1,获取飞行器在降落过程中采集的预设起降点的当前图像。在实际应用中,当飞行器在降落过程中,飞行器上所搭载的图像采集设备例如摄像机,会按照预设的时间间隔采集预设起降点的当前图像,为判断飞行器与预设起降点的位置关系做准备。
具体地,在一实施例中,上述的步骤S2,获取飞行器在起飞阶段采集的预设起降点的匹配图像和飞行器采集匹配图像时飞行器的位姿信息。具体地,在起飞阶段,每隔预设飞行高度,获取飞行器采集的预设起降点的匹配图像和飞行器的位姿信息,当飞行器的当前飞行高度大于预设高度时,停止采集上述匹配图像和飞行器的位姿信息。在实际应用中,在飞行器在起飞阶段会在不同的起飞高度采集预设起降点的图像,可以选取这些图像中的任意一张图像作为上述的匹配图像,例如:可以从拍摄的图像中选择一个清晰度高、预设起降点信息保留完整的图像作为该匹配图像。
具体地，在一实施例中，上述的步骤S3，对当前图像和匹配图像进行特征匹配，得到各匹配特征点对。在实际应用中，采用SIFT算法对上述的两个图像进行特征匹配，得到多组匹配特征点，具体地，该SIFT算法可以采用现有技术实现，在此不再进行赘述。
具体地，在一实施例中，上述的步骤S4，根据各匹配特征点对计算当前图像与匹配图像之间的特征变换矩阵。具体地，该特征变换矩阵可以为单应矩阵或基础矩阵中的任意一种，在实际应用中，可以直接使用基础矩阵作为该特征变换矩阵。较佳地，由于单应矩阵比基础矩阵的准确性更好，因而在本发明实施例中优先将单应矩阵设置为特征变换矩阵，只有在单应矩阵不满足预设条件的情况下，再将特征变换矩阵更新为基础矩阵，以保证飞行器降落精度的最优。
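作为对单应矩阵计算的示意，可以用直接线性变换（DLT）从至少4组匹配特征点对求解单应矩阵（实际工程中通常还会配合RANSAC剔除误匹配）。以下仅为在点对无误匹配假设下的简化草图：

```python
import numpy as np

def estimate_homography(src, dst):
    """DLT: 由匹配点对求满足 dst ~ H @ src (齐次意义下) 的单应矩阵 H。"""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # 最小奇异值对应的右奇异向量即该齐次方程组的最小二乘解
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # 归一化, 使 H[2,2] = 1
```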
具体地，在一实施例中，上述的步骤S5，根据特征变换矩阵得到匹配图像的光心点投影至当前图像的光心投影点。具体地，在得到上述当前图像与匹配图像之间的特征变换矩阵后，可以将其中任意一个图像上的点通过该特征变换矩阵投影至另一个图像上，得到该点在另一图像上的投影点。
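该投影运算本身即为对齐次坐标左乘变换矩阵再去齐次化，可示意如下（以单应矩阵为例的示意性函数，并非本发明限定的实现）：

```python
import numpy as np

def project_point(H, pt):
    """将像素点 pt=(u, v) 经单应矩阵 H 投影到另一幅图像, 返回去齐次化坐标。"""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return (p[0] / p[2], p[1] / p[2])
```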
在一较佳实施例中,如图2所示,上述的步骤S6,根据光心投影点的坐标控制飞行器降落至预设起降点,具体包括:
步骤S61：获取匹配图像的光心点投影至当前图像的光心投影点的二维像素坐标。在实际应用中，由于飞行器的飞行位置不断变化，所获取的当前图像与匹配图像之间存在差异，将匹配图像的光心点投影至当前图像上，并以当前图像的光心点为原点，即可得到该光心投影点在当前图像中的二维像素坐标，该二维像素坐标反映出飞行器与预设起降点之间的偏差。
步骤S62：根据二维像素坐标，计算光心投影点从二维像素坐标到以飞行器为原点的水平坐标系的三维坐标。在实际应用中，如果上述的飞行器在降落过程中偏离上述预设起降点，则该光心投影点会偏离上述当前图像的光心点，即该光心投影点与当前图像光心点的偏差反映了飞行器的当前飞行状态。通过计算得到该光心投影点在以飞行器为原点的坐标系下的三维坐标，根据该坐标值即可反映出当前飞行器与预设起降点的距离偏差，进而对降落指令进行对应的调整，从而保证飞行器的精准降落。
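在针孔相机假设下（相机竖直朝下，fx、fy为焦距，cx、cy为主点，h为对地高度，均为示意性参数），上述像素偏差可近似换算为以飞行器为原点的水平位移，示意如下：

```python
def pixel_to_horizontal_offset(u, v, fx, fy, cx, cy, h):
    """针孔模型近似: 将像素坐标 (u, v) 换算为水平面内位移 (X, Y), 单位与 h 一致。"""
    X = (u - cx) / fx * h  # 相对飞行器的横向位移
    Y = (v - cy) / fy * h  # 相对飞行器的纵向位移
    return X, Y
```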
步骤S63：根据三维坐标控制飞行器降落至预设起降点。在实际应用中，根据该三维坐标得到飞行器与预设起降点的水平相对位移，通过持续执行上述的步骤S1至步骤S6，并根据上述水平相对位移控制上述飞行器进行降落，即将该水平相对位移作为位置闭环，约束飞行器降落至预设起降点。
具体地,在一实施例中,如图3所示,上述的步骤S62具体包括:
步骤S621:获取飞行器在拍摄匹配图像时,飞行器的位姿及对地高度。在实际应用中,在飞行器起飞阶段采集预设起降点的匹配图像的同时,记录当前飞行器的飞行高度及对应的位姿信息。
步骤S622：根据各匹配特征点对计算飞行器在拍摄当前图像时的补偿偏航角。在实际应用中，飞行器在向预设起降点飞行过程中，由于飞机的飞行姿态等原因会产生一定的角度偏差，可以通过计算上述当前图像的补偿偏航角对该角度偏差进行弥补，以进一步提高飞行器的降落精确度。
具体地,在一实施例中,如图3所示,上述的步骤S622具体包括:
步骤S6221:根据匹配特征点对,得到当前图像的第一描述子及匹配图像的第二描述子。在实际应用中,每一组匹配特征点对由位于当前图像的第一匹配特征点和位于匹配图像的第二匹配特征点组成,基于图像梯度,对每个匹配特征点赋予一个或多个主方向,得到每个匹配特征点主方向的特征向量作为该匹配特征点的描述子,上述的第一描述子与第二描述子分别对应上述的第一匹配特征点和第二匹配特征点。
步骤S6222:计算第一描述子的主方向与第二描述子主方向的偏差角。在实际应用中,每一组匹配特征点对都会产生一个偏差角。
步骤S6223:计算各匹配特征点对的偏差角的偏差角均值,并将偏差角均值确定为补偿偏航角。在实际应用中,将上述匹配特征点对的偏差角的平均值作为补偿偏航角,以补偿飞行器在降落过程与预设起降点的角度偏差。
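对各偏差角取均值时，直接算术平均在偏差角接近±π处会出现折返误差，通常改用圆均值（按单位向量求和后取方向角）。以下为一个示意性实现，并非本发明限定的计算方式：

```python
import math

def compensation_yaw(deviation_angles):
    """取各匹配特征点对偏差角的圆均值, 作为补偿偏航角 (弧度)。"""
    s = sum(math.sin(a) for a in deviation_angles)  # 单位向量纵向分量之和
    c = sum(math.cos(a) for a in deviation_angles)  # 单位向量横向分量之和
    return math.atan2(s, c)
```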
步骤S623：根据特征变换矩阵、位姿及对地高度计算当前图像的补偿对地高度。在实际应用中，由于传统GPS等定位方式的精度局限，且飞行器在降落过程中飞行高度急剧下降，对飞行器对地高度（飞行高度）的定位存在一定偏差；飞行器的降落指令需要参考飞行器当前的对地高度，当对地高度有偏差时，会影响飞行器降落的准确性，因而需要通过补偿对地高度的方式进一步提高飞行器降落的精准度。
具体地,在一实施例中,如图3所示,上述的步骤S623具体包括:
步骤S6231:分解特征变换矩阵得到当前图像相对于匹配图像的相对位姿。具体地,在实际应用中,通过分解单应矩阵或基础矩阵即可得到当前位置相对于匹配图像位置的位姿,该分解过程采用现有技术即可实现,在此不再进行赘述。
步骤S6232:根据位姿、对地高度及相对位姿,得到当前图像的补偿对地高度。在实际应用中,通过上述匹配图像的位姿、相对位姿即可得到飞行器当前位置与采集匹配图像的位置的相对高度差值,该高度差值与采集匹配图像位置的对地高度之和即为当前飞行器的补偿对地高度,具体地,该补偿对地高度与GPS定位等高度定位方法得到的当前飞行器的对地高度的差异可以用来调整飞行器的降落指令,以提高飞行器降落的精准度。
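该高度补偿本身只是一次叠加与比较，可示意如下（h_match为采集匹配图像时的对地高度，dz为分解相对位姿得到的高度差，h_gps为GPS等方式得到的当前对地高度，参数名均为示意）：

```python
def altitude_correction(h_match, dz, h_gps):
    """返回补偿对地高度及其与GPS高度的差值, 后者可用于调整降落指令。"""
    h_comp = h_match + dz      # 当前位置的补偿对地高度
    return h_comp, h_comp - h_gps
```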
步骤S624：根据补偿偏航角、补偿对地高度对光心投影点的三维坐标进行更新，得到更新后的三维坐标。在实际应用中，通过上述补偿偏航角及补偿对地高度的计算，对上述光心投影点的三维坐标进行相应的调整，进而得到能够准确反映飞行器当前降落飞行状况的三维坐标，并据此对飞行器的降落指令进行相应的调整，以提高飞行器降落的精准度。
在一较佳实施例中,如图4所示,上述的飞行器降落方法还包括:
步骤S7:根据特征变换矩阵得到匹配图像的各第二匹配特征点在当前图像上的对应的各投影匹配特征点。在实际应用中,由于受到飞行器飞行位姿及照相机或摄像机拍摄环境等因素的影响,上述的当前图像及匹配图像可能存在较大的匹配误差,从而影响后续对降落指令调整的精确性,因此在本发明实施例中,通过上述当前图像上的投影匹配特征点与对应的匹配特征点距离误差来判断图像匹配的精确性。
步骤S8:分别计算投影匹配特征点与对应的第一匹配特征点之间的距离误差,并判断距离误差的平均值是否小于预设距离阈值。在实际应用中,通过计算得到每一组特征点对中投影匹配特征点与匹配特征点的距离误差,并通过判断上述这些距离误差的平均值是否小于预设距离阈值来判断图像匹配是否满足要求,当距离误差的平均值小于预设距离阈值时,执行上述的步骤S5,当距离误差的平均值不小于预设距离阈值时,返回上述的步骤S1,重新获取飞行器当前的当前图像或者根据实际情况重新选择在起飞阶段拍摄的其他的预设起降点的图像作为匹配图像。
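上述距离误差校验可以示意性地写成：将匹配图像的各第二匹配特征点经特征变换矩阵投影到当前图像，再与对应第一匹配特征点求平均欧氏距离，并与预设阈值比较。以下为以单应矩阵为例的简化草图：

```python
import numpy as np

def mean_reprojection_error(H, pts_match, pts_current):
    """匹配图像特征点经 H 投影到当前图像后, 与对应特征点的平均距离误差。"""
    pts = np.hstack([np.asarray(pts_match, float), np.ones((len(pts_match), 1))])
    proj = pts @ H.T
    proj = proj[:, :2] / proj[:, 2:3]  # 去齐次化
    return float(np.mean(np.linalg.norm(proj - np.asarray(pts_current, float), axis=1)))
```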
在实际应用中,在飞行器的降落全过程中可以通过持续的执行上述的步骤S1至步骤S8,从而不断地调整飞行器的降落指令,直至飞行器降落至预设起降点,例如:飞行器按照预设时间间隔采集当前图像,在飞行器每次采集到当前图像后,执行上述的步骤S1至步骤S8,直至飞行器降落。
在另一可替换的实施例中,如图5所示,当上述的特征变换矩阵为单应矩阵,并且距离误差的平均值不小于预设距离阈值时,上述的飞行器降落方法还包括:
步骤S9：将特征变换矩阵由单应矩阵替换为基础矩阵，并重新执行上述的步骤S7。
步骤S10：当特征变换矩阵由单应矩阵替换为基础矩阵后，判断距离误差的平均值是否小于预设距离阈值：当距离误差的平均值小于预设距离阈值时，执行上述的步骤S5；当距离误差的平均值不小于预设距离阈值时，返回上述的步骤S1。
具体地，由于单应矩阵是基于平面假设来计算两幅图像的特征变换矩阵，在实际应用中，是以地平面作为假设平面来计算单应矩阵，如果采用单应矩阵后计算出的上述距离误差的平均值不小于预设距离阈值，则说明平面假设失效，需要将上述的特征变换矩阵更新为基础矩阵。由于单应矩阵比基础矩阵的准确性更好，因而在本发明实施例中优先将单应矩阵设置为特征变换矩阵，只有在单应矩阵不满足预设条件的情况下，再将特征变换矩阵更新为基础矩阵，以保证飞行器降落精度的最优。
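单应矩阵与基础矩阵之间的回退逻辑可以概括为如下示意（返回值与阈值处理仅为草图，非本发明限定的流程）：

```python
def choose_transform(err_homography, err_fundamental, threshold):
    """平面假设成立时优先用单应矩阵; 失效则退化为基础矩阵; 仍不满足则重新采集图像。"""
    if err_homography < threshold:
        return "homography"
    if err_fundamental < threshold:
        return "fundamental"
    return "reacquire"
```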
通过上述步骤S1至步骤S10，本发明实施例提供的飞行器降落方法，通过将飞行器在降落过程中采集的预设起降点的当前图像与飞行器起飞阶段采集的匹配图像进行特征匹配，得到各匹配特征点对，并据此计算当前图像与匹配图像的特征变换矩阵，通过该特征变换矩阵得到匹配图像的光心点在当前图像上的光心投影点，并根据该光心投影点的坐标控制飞行器降落至预设起降点。从而实现了一种飞行器降落指令的调整方法，指引飞行器降落至起降点，从而提高了飞行器的降落精确度，仅通过对采集图像的成功匹配即可对降落指令做出调整，避免了对降落点的实时跟踪，降低了飞行器精准降落的难度。
本发明实施例提供了一种飞行器降落装置,如图6所示,该飞行器降落装置包括:
当前图像获取模块1,用于获取飞行器在降落过程中采集的预设起降点的当前图像。当前图像获取模块1的详细功能参见上述实施例中步骤S1的相关描述。
匹配图像获取模块2，用于获取飞行器在起飞阶段采集的预设起降点的匹配图像和飞行器采集匹配图像时飞行器的位姿信息。匹配图像获取模块2的详细功能参见上述实施例中步骤S2的相关描述。
匹配特征点对生成模块3,用于对当前图像和匹配图像进行特征匹配,得到各匹配特征点对。匹配特征点对生成模块3的详细功能参见上述实施例中步骤S3的相关描述。
特征变换矩阵计算模块4,用于根据各匹配特征点对计算当前图像与匹配图像之间的特征变换矩阵。特征变换矩阵计算模块4的详细功能参见上述实施例中步骤S4的相关描述。
光心投影点生成模块5,用于根据特征变换矩阵得到匹配图像的光心点投影至当前图像的光心投影点。光心投影点生成模块5的详细功能参见上述实施例中步骤S5的相关描述。
降落指令调整模块6,用于根据光心投影点的坐标控制飞行器降落至预设起降点。降落指令调整模块6的详细功能参见上述实施例中步骤S6的相关描述。
通过上述各个组成部分的协同合作，本发明实施例提供的飞行器降落装置，通过将飞行器在降落过程中采集的预设起降点的当前图像与飞行器起飞阶段采集的匹配图像进行特征匹配，得到各匹配特征点对，并据此计算当前图像与匹配图像的特征变换矩阵，通过该特征变换矩阵得到匹配图像的光心点在当前图像上的光心投影点，并根据该光心投影点的坐标控制飞行器降落至预设起降点。从而实现了一种飞行器降落指令的调整方法，指引飞行器降落至起降点，从而提高了飞行器的降落精确度，仅通过对采集图像的成功匹配即可对降落指令做出调整，避免了对降落点的实时跟踪，降低了飞行器精准降落的难度。
具体地，在一实施例中，上述的匹配图像获取模块2具体用于在起飞阶段，每隔预设飞行高度，获取所述飞行器采集的所述预设起降点的匹配图像和所述飞行器的位姿信息。匹配图像获取模块2的详细功能参见上述实施例中步骤S2的相关描述。
具体地，在一实施例中，当所述飞行器的当前飞行高度大于预设高度时，所述匹配图像获取模块2还用于停止采集匹配图像和所述飞行器的位姿信息。匹配图像获取模块2的详细功能参见上述实施例中步骤S2的相关描述。
具体地,在一实施例中,所述降落指令调整模块6包括:二维像素坐标获取子模块,用于获取所述匹配图像的光心点投影至所述当前图像的光心投影点的二维像素坐标。二维像素坐标获取子模块的详细功能参见上述实施例中步骤S61的相关描述。
计算子模块,用于根据所述二维像素坐标,计算所述光心投影点在世界坐标系下的三维坐标。计算子模块的详细功能参见上述实施例中步骤S62的相关描述。
控制子模块,用于根据所述三维坐标,控制所述飞行器降落至所述预设起降点。控制子模块的详细功能参见上述实施例中步骤S63的相关描述。
具体地,在一实施例中,所述计算子模块包括:信息获取单元,用于获取所述飞行器在拍摄所述匹配图像时,所述飞行器的位姿及对地高度。信息获取单元的详细功能参见上述实施例中步骤S621的相关描述。
补偿偏航角计算单元,用于根据所述各匹配特征点对,计算所述飞行器在拍摄所述当前图像时的补偿偏航角。补偿偏航角计算单元的详细功能参见上述实施例中步骤S622的相关描述。
补偿对地高度计算单元,用于根据所述特征变换矩阵、所述位姿及所述对地高度,计算所述当前图像的补偿对地高度。补偿对地高度计算单元的详细功能参见上述实施例中步骤S623的相关描述。
三维坐标更新单元,用于根据所述补偿偏航角、所述补偿对地高度,对所述光心投影点的所述三维坐标进行更新,得到所述三维坐标。三维坐标更新单元的详细功能参见上述实施例中步骤S624的相关描述。
具体地,在一实施例中,所述补偿偏航角计算单元包括:描述子生成子单元,用于根据所述匹配特征点对,得到所述当前图像的第一描述子及所述匹配图像的第二描述子。描述子生成子单元的详细功能参见上述实施例中步骤S6221的相关描述。
偏差角计算子单元,用于计算所述第一描述子的主方向与所述第二描述子主方向的偏差角。偏差角计算子单元的详细功能参见上述实施例中步骤S6222的相关描述。
补偿偏航角计算子单元,用于计算各所述匹配特征点对的偏差角的偏差角均值,并将所述偏差角均值确定为所述补偿偏航角。补偿偏航角计算子单元的详细功能参见上述实施例中步骤S6223的相关描述。
具体地,在一实施例中,所述补偿对地高度计算单元包括:分解子单元,用于分解所述特征变换矩阵得到所述当前图像相对于所述匹配图像的相对位姿。分解子单元的详细功能参见上述实施例中步骤S6231的相关描述。
补偿对地高度生成子单元,用于根据所述位姿、所述对地高度及所述相对位姿,得到所述当前图像的补偿对地高度。补偿对地高度生成子单元的详细功能参见上述实施例中步骤S6232的相关描述。
具体地,在一实施例中,所述飞行器降落装置还包括:投影匹配特征点生成模块,用于根据所述特征变换矩阵得到所述匹配图像的各第二匹配特征点在所述当前图像上的对应的各投影匹配特征点。投影匹配特征点生成模块的详细功能参见上述实施例中步骤S7的相关描述。
判断模块,用于分别计算所述各投影匹配特征点与对应的所述第一匹配特征点之间的距离误差,并判断所述距离误差的平均值是否小于预设距离阈值;当所述距离误差的平均值小于所述预设距离阈值时,返回所述光心投影点生成模块。判断模块的详细功能参见上述实施例中步骤S8的相关描述。
具体地,在一实施例中,所述特征变换矩阵为单应矩阵。
具体地,在一实施例中,所述飞行器降落装置还包括:转换矩阵替换模块,当所述距离误差的平均值不小于所述预设距离阈值时,所述转换矩阵替换模块用于将所述特征变换矩阵由所述单应矩阵替换为基础矩阵,返回所述投影匹配特征点生成模块。转换矩阵替换模块的详细功能参见上述实施例中步骤S9的相关描述。
具体地,在一实施例中,当所述特征变换矩阵由所述单应矩阵替换为基础矩阵后,所述飞行器降落装置还包括:第二判断模块,用于分别计算投影匹配特征点与对应的第一匹配特征点之间的距离误差,并判断距离误差的平均值是否小于预设距离阈值,当判断所述距离误差的平均值不小于预设距离阈值时,返回所述当前图像获取模块1。第二判断模块的详细功能参见上述实施例中步骤S10的相关描述。
通过上述各个组成部分的协同合作，本发明实施例提供的飞行器降落装置，通过将飞行器在降落过程中采集的预设起降点的当前图像与飞行器起飞阶段采集的匹配图像进行特征匹配，得到各匹配特征点对，并据此计算当前图像与匹配图像的特征变换矩阵，通过该特征变换矩阵得到匹配图像的光心点在当前图像上的光心投影点，并根据该光心投影点的坐标控制飞行器降落至预设起降点。从而实现了一种飞行器降落指令的调整方法，指引飞行器降落至起降点，从而提高了飞行器的降落精确度，仅通过对采集图像的成功匹配即可对降落指令做出调整，避免了对降落点的实时跟踪，降低了飞行器精准降落的难度。
本发明实施例提供一种非暂态计算机存储介质,该计算机存储介质存储有计算机可执行指令,该计算机可执行指令可执行上述任意方法实施例中的飞行器降落方法,其中,上述存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)、随机存储记忆体(Random Access Memory,RAM)、快闪存储器(Flash Memory)、硬盘(Hard Disk Drive,缩写:HDD)或固态硬盘(Solid-State Drive,SSD)等;该存储介质还可以包括上述种类的存储器的组合。
本领域技术人员可以理解，实现上述实施例方法中的全部或部分流程，是可以通过计算机程序来指令相关的硬件来完成的，该程序可存储于一计算机可读取存储介质中，该程序在执行时，可包括如上述各方法的实施例的流程。其中，该存储介质可为磁碟、光盘、只读存储记忆体（ROM）或随机存储记忆体（RAM）等。
本发明实施例提供一种计算机设备,其结构示意图如图7所示,该计算机设备包括:一个或多个处理器410以及存储器420,图7中以一个处理器410为例。
上述的计算机设备还可以包括:输入装置430和输出装置440。
处理器410、存储器420、输入装置430和输出装置440可以通过总线或者其他方式连接,图7中以通过总线连接为例。
处理器410可以为中央处理器(Central Processing Unit,CPU)。处理器410还可以为其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现场可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等芯片,或者上述各类芯片的组合。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
存储器420作为一种非暂态计算机可读存储介质,可用于存储非暂态软件程序、非暂态计算机可执行程序以及模块,如本申请实施例中的飞行器降落方法对应的程序指令/模块,处理器410通过运行存储在存储器420中的非暂态软件程序、指令以及模块,从而执行服务器的各种功能应用以及数据处理,即实现上述方法实施例的飞行器降落方法。
存储器420可以包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需要的应用程序;存储数据区可存储根据飞行器降落方法的处理装置的使用所创建的数据等。此外,存储器420可以包括高速随机存取存储器,还可以包括非暂态存储器,例如至少一个磁盘存储器件、闪存器件、或其他非暂态固态存储器件。在一些实施例中,存储器420可选包括相对于处理器410远程设置的存储器,这些远程存储器可以通过网络连接至飞行器降落装置。上述网络的实例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。
输入装置430可接收输入的数字或字符信息，以及产生与飞行器降落装置的用户设置和功能控制有关的键信号输入。输出装置440可包括显示屏等显示设备。
一个或者多个模块存储在存储器420中,当被一个或者多个处理器410执行时,执行如图1-图5所示的方法。
上述产品可执行本发明实施例所提供的方法,具备执行方法相应的功能模块和有益效果。未在本发明实施例中详尽描述的技术细节,具体可参见如图1-图5所示的实施例中的相关描述。
本发明实施例还提供了一种飞行器，其结构示意图如图8所示，该飞行器包括：飞行器本体101、图像采集设备102及飞行控制器103，其中，所述图像采集设备102与所述飞行控制器103设置于所述飞行器本体101上；所述图像采集设备102用于采集所述飞行器在降落过程中的预设起降点的当前图像，并在所述飞行器起飞阶段采集所述预设起降点的匹配图像及采集所述匹配图像时所述飞行器的位姿信息，并将所述当前图像、匹配图像及位姿信息发送至所述飞行控制器103；所述飞行控制器103用于接收所述当前图像、匹配图像及位姿信息，并采用如上述实施例中的飞行器降落方法控制所述飞行器降落至所述预设起降点。飞行控制器103控制飞行器降落的详细内容参见上述实施例中关于飞行器降落方法的相关描述，在此不再进行赘述。
通过上述各个组成部分的协同合作,本发明实施例提供的飞行器,通过在飞行器上设置图像采集设备及飞行控制器,实现了对飞行器降落的精准控制,指引飞行器降落至起降点,从而提高了飞行器的降落精确度,仅通过对采集图像的成功匹配即可对降落指令做出调整,避免了对降落点的实时跟踪,降低了飞行器精准降落的难度。
显然,上述实施例仅仅是为清楚地说明所作的举例,而并非对实施方式的限定。对于所属领域的普通技术人员来说,在上述说明的基础上还可以做出其它不同形式的变化或变动。这里无需也无法对所有的实施方式予以穷举。而由此所引伸出的显而易见的变化或变动仍处于本发明创造的保护范围之中。

Claims (25)

  1. 一种飞行器降落方法,其特征在于,包括:
    获取所述飞行器在降落过程中采集的预设起降点的当前图像;
    获取所述飞行器在起飞阶段采集的所述预设起降点的匹配图像和所述飞行器采集所述匹配图像时所述飞行器的位姿信息;
    对所述当前图像和所述匹配图像进行特征匹配,得到各匹配特征点对;
    根据所述各匹配特征点对,计算所述当前图像与所述匹配图像之间的特征变换矩阵;
    根据所述特征变换矩阵,得到所述匹配图像的光心点投影至所述当前图像的光心投影点;
    根据所述光心投影点的坐标,控制所述飞行器降落至所述预设起降点。
  2. 根据权利要求1所述的飞行器降落方法,其特征在于,所述获取所述飞行器在起飞阶段采集的所述预设起降点的匹配图像和所述飞行器的位姿信息,包括:
    在起飞阶段,每隔预设飞行高度,获取所述飞行器采集的所述预设起降点的匹配图像和所述飞行器的位姿信息。
  3. 根据权利要求1或2所述的飞行器降落方法,其特征在于,该方法还包括:
    当所述飞行器的当前飞行高度大于预设高度时,停止采集匹配图像和所述飞行器的位姿信息。
  4. 根据权利要求1-3中任一项所述的飞行器降落方法,其特征在于,所述根据所述光心投影点的坐标,控制所述飞行器降落至所述预设起降点,包括:
    获取所述匹配图像的光心点投影至所述当前图像的光心投影点的二维像素坐标;
    根据所述二维像素坐标,计算所述光心投影点在世界坐标系下的三维坐标;
    根据所述三维坐标,控制所述飞行器降落至所述预设起降点。
  5. 根据权利要求4所述的飞行器降落方法,其特征在于,所述根据所述二维像素坐标,计算所述光心投影点在世界坐标系下的三维坐标,包括:
    获取所述飞行器在拍摄所述匹配图像时,所述飞行器的位姿及对地高度;
    根据所述各匹配特征点对,计算所述飞行器在拍摄所述当前图像时的补偿偏航角;
    根据所述特征变换矩阵、所述位姿及所述对地高度，计算所述当前图像的补偿对地高度；
    根据所述补偿偏航角、所述补偿对地高度,对所述光心投影点的所述三维坐标进行更新,得到所述三维坐标。
  6. 根据权利要求5所述的飞行器降落方法,其特征在于,所述根据各所述匹配特征点对,计算所述当前图像的补偿偏航角,包括:
    根据所述匹配特征点对,得到所述当前图像的第一描述子及所述匹配图像的第二描述子;
    计算所述第一描述子的主方向与所述第二描述子主方向的偏差角;
    计算各所述匹配特征点对的偏差角的偏差角均值,并将所述偏差角均值确定为所述补偿偏航角。
  7. 根据权利要求5所述的飞行器降落方法,其特征在于,所述根据所述特征变换矩阵、所述位姿及所述对地高度计算所述当前图像的补偿对地高度,包括:
    分解所述特征变换矩阵得到所述当前图像相对于所述匹配图像的相对位姿;
    根据所述位姿、所述对地高度及所述相对位姿,得到所述当前图像的补偿对地高度。
  8. 根据权利要求1-7任一项所述的飞行器降落方法,其特征在于,所述匹配特征点对包括:第一匹配特征点和第二匹配特征点,所述第一匹配特征点位于所述当前图像上,所述第二匹配特征点位于所述匹配图像上,在所述根据所述各匹配特征点对,计算所述当前图像与所述匹配图像之间的特征变换矩阵之后,在所述根据所述特征变换矩阵得到所述匹配图像的光心点投影至所述当前图像的光心投影点之前,所述飞行器降落方法还包括:
    根据所述特征变换矩阵得到所述匹配图像的各第二匹配特征点在所述当前图像上的对应的各投影匹配特征点;
    分别计算所述各投影匹配特征点与对应的所述第一匹配特征点之间的距离误差,并判断所述距离误差的平均值是否小于预设距离阈值;
    当所述距离误差的平均值小于所述预设距离阈值时,执行所述根据所述特征变换矩阵得到所述匹配图像的光心点投影至所述当前图像的光心投影点的步骤。
  9. 根据权利要求8所述的飞行器降落方法,其特征在于,所述特征变换矩阵为单应矩阵。
  10. 根据权利要求9所述的飞行器降落方法，其特征在于，还包括：当所述距离误差的平均值不小于所述预设距离阈值时，将所述特征变换矩阵由所述单应矩阵替换为基础矩阵，并重新执行所述根据所述特征变换矩阵得到所述匹配图像的各第二匹配特征点在所述当前图像上的对应的各投影匹配特征点的步骤。
  11. 根据权利要求10所述的飞行器降落方法,其特征在于,当所述特征变换矩阵由所述单应矩阵替换为基础矩阵后,并且判断所述距离误差的平均值不小于预设距离阈值时,执行所述获取飞行器当前采集的预设起降点的当前图像及所述飞行器在起飞阶段采集的所述预设起降点的匹配图像的步骤。
  12. 一种飞行器降落装置,其特征在于,包括:
    当前图像获取模块,用于获取所述飞行器在降落过程中采集的预设起降点的当前图像;
    匹配图像获取模块,用于获取所述飞行器在起飞阶段采集的所述预设起降点的匹配图像和所述飞行器采集所述匹配图像时所述飞行器的位姿信息;
    匹配特征点对生成模块,用于对所述当前图像和所述匹配图像进行特征匹配,得到各匹配特征点对;
    特征变换矩阵计算模块,用于根据所述各匹配特征点对计算所述当前图像与所述匹配图像之间的特征变换矩阵;
    光心投影点生成模块,用于根据所述特征变换矩阵得到所述匹配图像的光心点投影至所述当前图像的光心投影点;
    降落指令调整模块,用于根据所述光心投影点的坐标控制所述飞行器降落至所述预设起降点。
  13. 根据权利要求12所述的飞行器降落装置,其特征在于,所述匹配图像获取模块具体用于在起飞阶段,每隔预设飞行高度,获取所述飞行器采集的所述预设起降点的匹配图像和所述飞行器的位姿信息。
  14. 根据权利要求12或13所述的飞行器降落装置,其特征在于,当所述飞行器的当前飞行高度大于预设高度时,所述匹配图像获取模块还用于停止采集匹配图像和所述飞行器的位姿信息。
  15. 根据权利要求12-14中任一项所述的飞行器降落装置,其特征在于,所述降落指令调整模块包括:
    二维像素坐标获取子模块,用于获取所述匹配图像的光心点投影至所述当前图像的光心投影点的二维像素坐标;
    计算子模块,用于根据所述二维像素坐标,计算所述光心投影点在世界坐标系下的三维坐标;
    控制子模块,用于根据所述三维坐标,控制所述飞行器降落至所述预设起降点。
  16. 根据权利要求15所述的飞行器降落装置,其特征在于,所述计算子模块包括:
    信息获取单元,用于获取所述飞行器在拍摄所述匹配图像时,所述飞行器的位姿及对地高度;
    补偿偏航角计算单元,用于根据所述各匹配特征点对,计算所述飞行器在拍摄所述当前图像时的补偿偏航角;
    补偿对地高度计算单元,用于根据所述特征变换矩阵、所述位姿及所述对地高度,计算所述当前图像的补偿对地高度;
    三维坐标更新单元,用于根据所述补偿偏航角、所述补偿对地高度,对所述光心投影点的所述三维坐标进行更新,得到所述三维坐标。
  17. 根据权利要求16所述的飞行器降落装置,其特征在于,所述补偿偏航角计算单元包括:
    描述子生成子单元,用于根据所述匹配特征点对,得到所述当前图像的第一描述子及所述匹配图像的第二描述子;
    偏差角计算子单元,用于计算所述第一描述子的主方向与所述第二描述子主方向的偏差角;
    补偿偏航角计算子单元,用于计算各所述匹配特征点对的偏差角的偏差角均值,并将所述偏差角均值确定为所述补偿偏航角。
  18. 根据权利要求16所述的飞行器降落装置,其特征在于,所述补偿对地高度计算单元包括:
    分解子单元,用于分解所述特征变换矩阵得到所述当前图像相对于所述匹配图像的相对位姿;
    补偿对地高度生成子单元,用于根据所述位姿、所述对地高度及所述相对位姿,得到所述当前图像的补偿对地高度。
  19. 根据权利要求12-18任一项所述的飞行器降落装置,其特征在于,所述飞行器降落装置还包括:
    投影匹配特征点生成模块,用于根据所述特征变换矩阵得到所述匹配图像的各第二匹配特征点在所述当前图像上的对应的各投影匹配特征点;
    判断模块,用于分别计算所述各投影匹配特征点与对应的所述第一匹配特征点之间的距离误差,并判断所述距离误差的平均值是否小于预设距离阈值,当所述距离误差的平均值小于所述预设距离阈值时,返回所述光心投影点生成模块。
  20. 根据权利要求19所述的飞行器降落装置,其特征在于,所述特征变换矩阵为单应矩阵。
  21. 根据权利要求20所述的飞行器降落装置,其特征在于,所述飞行器降落装置还包括:转换矩阵替换模块,当所述距离误差的平均值不小于所述预设距离阈值时,所述转换矩阵替换模块用于将所述特征变换矩阵由所述单应矩阵替换为基础矩阵,返回所述投影匹配特征点生成模块。
  22. 根据权利要求21所述的飞行器降落装置,其特征在于,当所述特征变换矩阵由所述单应矩阵替换为基础矩阵后,所述飞行器降落装置还包括:
    第二判断模块,用于分别计算投影匹配特征点与对应的第一匹配特征点之间的距离误差,并判断距离误差的平均值是否小于预设距离阈值,当判断所述距离误差的平均值不小于预设距离阈值时,返回所述当前图像获取模块。
  23. 一种非暂态计算机可读存储介质,其特征在于,所述非暂态计算机可读存储介质存储计算机指令,所述计算机指令被处理器执行时实现如权利要求1-11任一项所述的飞行器降落方法。
  24. 一种计算机设备,其特征在于,包括:至少一个处理器;以及与所述至少一个处理器通信连接的存储器其中,
    所述存储器存储有可被所述至少一个处理器执行的指令,所述指令被所述至少一个处理器执行,以使所述至少一个处理器执行如权利要求1-11任一项所述的飞行器降落方法。
  25. 一种飞行器,其特征在于,包括:飞行器本体、图像采集设备及飞行控制器,其中,
    所述图像采集设备与所述飞行控制器设置于所述飞行器本体上;
    所述图像采集设备用于采集所述飞行器在降落过程中的预设起降点的当前图像,并采集所述飞行器在起飞阶段采集的所述预设起降点的匹配图像和所述飞行器采集所述匹配图像时所述飞行器的位姿信息,并将所述当前图像、匹配图像及位姿信息发送至所述飞行控制器;
    所述飞行控制器用于接收所述当前图像、匹配图像及位姿信息,并采用如权利要求1-11任一项所述的飞行器降落方法控制所述飞行器降落至所述预设起降点。
PCT/CN2020/085082 2019-04-19 2020-04-16 一种飞行器降落方法及装置 WO2020211812A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910318700.8 2019-04-19
CN201910318700.8A CN110001980B (zh) 2019-04-19 2019-04-19 一种飞行器降落方法及装置

Publications (1)

Publication Number Publication Date
WO2020211812A1 true WO2020211812A1 (zh) 2020-10-22

Family

ID=67173144

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/085082 WO2020211812A1 (zh) 2019-04-19 2020-04-16 一种飞行器降落方法及装置

Country Status (2)

Country Link
CN (1) CN110001980B (zh)
WO (1) WO2020211812A1 (zh)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113093772A (zh) * 2021-04-13 2021-07-09 中国计量大学 一种无人机机库精确降落方法
CN113608542A (zh) * 2021-08-12 2021-11-05 山东信通电子股份有限公司 一种无人机自动降落的控制方法以及设备
CN113821047A (zh) * 2021-08-18 2021-12-21 杭州电子科技大学 一种基于单目视觉的无人机自主降落方法
CN114200954A (zh) * 2021-10-28 2022-03-18 佛山中科云图智能科技有限公司 基于Apriltag的无人机降落方法、装置、介质和电子设备
CN114296477A (zh) * 2021-12-17 2022-04-08 南京航空航天大学 一种面向空地协同作战的无人机动平台自主降落方法
CN114355984A (zh) * 2022-03-18 2022-04-15 北京卓翼智能科技有限公司 系留无人机的控制方法、控制装置、控制器及存储介质

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110001980B (zh) * 2019-04-19 2021-11-26 深圳市道通智能航空技术股份有限公司 一种飞行器降落方法及装置
CN110968107A (zh) * 2019-10-25 2020-04-07 深圳市道通智能航空技术有限公司 一种降落控制方法、飞行器及存储介质
CN112070814B (zh) * 2020-08-31 2024-04-02 杭州迅蚁网络科技有限公司 一种靶标角度识别方法、装置
CN112859888B (zh) * 2021-01-18 2023-09-12 中国商用飞机有限责任公司北京民用飞机技术研究中心 辅助垂直起降机着陆方法、装置、计算机设备及存储介质
CN113377118A (zh) * 2021-07-14 2021-09-10 中国计量大学 一种基于视觉的无人机机库多阶段精确降落方法
CN114489140A (zh) * 2022-02-16 2022-05-13 中国电子科技集团公司第五十四研究所 一种无标识环境下的无人机精准自主起降方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4868567A (en) * 1986-09-03 1989-09-19 Precitronic Gesellschaft Fur Feinmechanik Und Electronic Gmbh Landing approach aid for aircraft
CN106127201A (zh) * 2016-06-21 2016-11-16 西安因诺航空科技有限公司 一种基于视觉定位降落末端的无人机降落方法
CN106542105A (zh) * 2016-09-05 2017-03-29 珠海市磐石电子科技有限公司 飞行器移动降落方法和系统
JP2018021491A (ja) * 2016-08-02 2018-02-08 株式会社日立製作所 システム及び飛行ルート生成方法
CN108983807A (zh) * 2017-06-05 2018-12-11 北京臻迪科技股份有限公司 一种无人机定点降落方法及系统
CN109085851A (zh) * 2018-09-12 2018-12-25 哈尔滨工业大学(威海) 无人机定点降落方法
CN110001980A (zh) * 2019-04-19 2019-07-12 深圳市道通智能航空技术有限公司 一种飞行器降落方法及装置

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102435188B (zh) * 2011-09-15 2013-10-02 南京航空航天大学 一种用于室内环境的单目视觉/惯性全自主导航方法
CN102722697B (zh) * 2012-05-16 2015-06-03 北京理工大学 一种无人飞行器视觉自主导引着陆的目标跟踪方法
CN104076817A (zh) * 2014-06-18 2014-10-01 北京计算机技术及应用研究所 一种高清视频航拍多模传感器自外感知智能导航系统及其方法
CN104932522B (zh) * 2015-05-27 2018-04-17 深圳市大疆创新科技有限公司 一种飞行器的自主降落方法和系统
CN107014380B (zh) * 2017-05-26 2020-01-07 西安科技大学 基于飞行器的视觉导航与惯性导航的组合导航方法
CN109307510A (zh) * 2017-07-28 2019-02-05 广州极飞科技有限公司 飞行导航方法、装置和无人飞行器
CN109292099B (zh) * 2018-08-10 2020-09-25 顺丰科技有限公司 一种无人机着陆判断方法、装置、设备及存储介质


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113093772A (zh) * 2021-04-13 2021-07-09 中国计量大学 一种无人机机库精确降落方法
CN113608542A (zh) * 2021-08-12 2021-11-05 山东信通电子股份有限公司 一种无人机自动降落的控制方法以及设备
CN113608542B (zh) * 2021-08-12 2024-04-12 山东信通电子股份有限公司 一种无人机自动降落的控制方法以及设备
CN113821047A (zh) * 2021-08-18 2021-12-21 杭州电子科技大学 一种基于单目视觉的无人机自主降落方法
CN114200954A (zh) * 2021-10-28 2022-03-18 佛山中科云图智能科技有限公司 基于Apriltag的无人机降落方法、装置、介质和电子设备
CN114296477A (zh) * 2021-12-17 2022-04-08 南京航空航天大学 一种面向空地协同作战的无人机动平台自主降落方法
CN114355984A (zh) * 2022-03-18 2022-04-15 北京卓翼智能科技有限公司 系留无人机的控制方法、控制装置、控制器及存储介质

Also Published As

Publication number Publication date
CN110001980B (zh) 2021-11-26
CN110001980A (zh) 2019-07-12

Similar Documents

Publication Publication Date Title
WO2020211812A1 (zh) 一种飞行器降落方法及装置
US10754354B2 (en) Hover control
JP6882505B2 (ja) 無人機の飛行を制御するための方法及び装置
US20210141378A1 (en) Imaging method and device, and unmanned aerial vehicle
US11656356B2 (en) Ranging method based on laser radar system, device and readable storage medium
EP3315414B1 (en) Geo-location or navigation camera, and aircraft and navigation method therefor
WO2019113966A1 (zh) 一种避障方法、装置和无人机
JP2021534481A (ja) 障害物又は地面の認識及び飛行制御方法、装置、機器及び記憶媒体
WO2018099198A1 (zh) 无人机姿态控制方法、装置及无人机
US20170305546A1 (en) Autonomous navigation method and system, and map modeling method and system
CN106054924B (zh) 一种无人机伴飞方法、伴飞装置和伴飞系统
WO2019037088A1 (zh) 一种曝光的控制方法、装置以及无人机
WO2021078264A1 (zh) 一种降落控制方法、飞行器及存储介质
WO2018120351A1 (zh) 一种对无人机进行定位的方法及装置
WO2018120350A1 (zh) 对无人机进行定位的方法及装置
WO2021035731A1 (zh) 无人飞行器的控制方法、装置及计算机可读存储介质
US11221635B2 (en) Aerial vehicle heading control method and apparatus and electronic device
CN112799095A (zh) 静态地图生成方法、装置、计算机设备及存储介质
CN110622091A (zh) 云台的控制方法、装置、系统、计算机存储介质及无人机
CN107783555B (zh) 一种基于无人机的目标定位方法、装置及系统
CN111766900A (zh) 无人机高精度自主降落的系统、方法及存储介质
WO2019080113A1 (zh) 无人机的巡检规划方法、控制终端、无人机及无人机系统
CN112947526B (zh) 一种无人机自主降落方法和系统
US20220084415A1 (en) Flight planning method and related apparatus
CN113409459A (zh) 高精地图的生产方法、装置、设备和计算机存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20791999

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20791999

Country of ref document: EP

Kind code of ref document: A1