CN116136695A - Unmanned aerial vehicle landing method and device - Google Patents

Unmanned aerial vehicle landing method and device

Info

Publication number
CN116136695A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
landing
accuracy
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111370004.5A
Other languages
Chinese (zh)
Inventor
马代亮
陈刚
李颖杰
刘宝旭
毛一年
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sankuai Online Technology Co Ltd
Original Assignee
Beijing Sankuai Online Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sankuai Online Technology Co Ltd filed Critical Beijing Sankuai Online Technology Co Ltd
Priority to CN202111370004.5A priority Critical patent/CN116136695A/en
Publication of CN116136695A publication Critical patent/CN116136695A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The specification discloses an unmanned aerial vehicle landing method and device. A first position of the unmanned aerial vehicle is determined from satellite positioning data. When the unmanned aerial vehicle satisfies a landing condition determined from state information, a second position of the unmanned aerial vehicle is determined from its visual positioning data, and the accuracy of the first position is determined from the relationship between the first position and the second position. When the accuracy satisfies a preset condition, the unmanned aerial vehicle is guided to land according to the satellite positioning data; when it does not, the unmanned aerial vehicle is guided to land according to the visual positioning data. Because the accuracy of the first position is determined from the state information of the satellite positioning data and from the visual positioning data, a suitable landing mode can be selected according to whether the accuracy satisfies the preset condition, ensuring the landing safety of the unmanned aerial vehicle.

Description

Unmanned aerial vehicle landing method and device
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a landing method and apparatus for an unmanned aerial vehicle.
Background
At present, with the progress of technology and the maturing of unmanned aerial vehicle technology, unmanned aerial vehicles have been successfully applied in the delivery field, in scenarios such as take-out and express delivery. In the unmanned aerial vehicle delivery process, how to control the unmanned aerial vehicle to land accurately is one of the problems a service provider must solve.
One common unmanned aerial vehicle landing method is based on real-time kinematic (RTK) differential positioning. Specifically, the service provider may set up a base station with known coordinates, acquire the base station's satellite positioning signal, determine the satellite positioning coordinates corresponding to the base station, and thereby determine the positioning error. In the landing stage, the unmanned aerial vehicle can then acquire its own satellite positioning coordinates and obtain the current positioning error from the base station, so that its accurate coordinates can be determined from its satellite positioning coordinates and the positioning error. The unmanned aerial vehicle is then controlled to land accurately at the target landing point according to the known coordinates of the target landing point and the accurate coordinates of the unmanned aerial vehicle.
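The correction described above can be sketched as follows. This is a simplified illustration, not the patent's implementation: `rtk_correct`, its arguments, and the assumption that the base station's observed error transfers unchanged to the nearby vehicle are all illustrative.

```python
def rtk_correct(drone_measured, base_measured, base_known):
    """Return the drone's corrected (x, y, z) coordinate.

    drone_measured: the drone's satellite-measured coordinate.
    base_measured:  the base station's satellite-measured coordinate.
    base_known:     the base station's surveyed (true) coordinate.
    """
    # Positioning error observed at the base station.
    error = tuple(m - k for m, k in zip(base_measured, base_known))
    # Within a short baseline, assume the same error affects the drone.
    return tuple(d - e for d, e in zip(drone_measured, error))
```

Because the atmospheric component of the error is approximately constant over a short baseline, subtracting the base station's observed error from the drone's measurement cancels most of it.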
However, in the prior art, when the satellite positioning coordinates of the unmanned aerial vehicle are determined, there may be too many obstacles, such as buildings, around the unmanned aerial vehicle, so that the determined satellite positioning coordinates are still inaccurate under the influence of multipath effects, and there are safety hazards when landing is performed based on the determined coordinates of the unmanned aerial vehicle and the coordinates of the target landing point.
Disclosure of Invention
The present disclosure provides a method and apparatus for landing an unmanned aerial vehicle, so as to partially solve the above-mentioned problems in the prior art.
The technical scheme adopted in the specification is as follows:
the specification provides an unmanned aerial vehicle landing method, comprising:
determining a first position of the unmanned aerial vehicle according to satellite positioning data of the unmanned aerial vehicle;
when the unmanned aerial vehicle meets a landing condition, determining a second position of the unmanned aerial vehicle according to the visual positioning data of the unmanned aerial vehicle, wherein the landing condition is determined by the state information of the satellite positioning data;
determining the accuracy of the first position according to the relation between the first position and the second position;
when the accuracy meets a preset condition, guiding the unmanned aerial vehicle, through its satellite positioning data, to land at the target landing point according to the first position and the position of the target landing point;
and when the accuracy does not meet the preset condition, guiding the unmanned aerial vehicle, through its visual positioning data, to land at the target landing point according to the second position and the position of the target landing point.
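The steps above amount to selecting between two guidance sources. A minimal sketch of that decision logic follows, with illustrative names (`choose_guidance`, a Euclidean distance test standing in for whatever relation between the two positions the implementation actually uses):

```python
def choose_guidance(meets_landing_condition, first_pos, second_pos, threshold):
    """Return which positioning source should guide the landing."""
    if not meets_landing_condition:
        # Satellite data judged untrustworthy outright; fall back to
        # vision/inertial guidance without comparing positions.
        return "visual"
    dx = first_pos[0] - second_pos[0]
    dy = first_pos[1] - second_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    # Close agreement between the satellite fix and the visual fix
    # means the first position is accurate enough for satellite guidance.
    return "satellite" if distance <= threshold else "visual"
```

A 2D horizontal distance is used here purely for illustration; the patent leaves the exact relation between the first and second positions open.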
Optionally, the determining the first position of the unmanned aerial vehicle according to satellite positioning data of the unmanned aerial vehicle specifically includes:
determining the position to be corrected of the unmanned aerial vehicle according to satellite positioning data of the unmanned aerial vehicle;
determining a positioning error according to satellite positioning data of a base station communicating with the unmanned aerial vehicle and the position of the base station;
and calibrating the position to be corrected according to the positioning error to obtain the first position.
Optionally, the determining the accuracy of the first position according to the relation between the first position and the second position specifically includes:
determining a distance between the first location and the second location;
when the distance is not greater than a preset distance threshold, determining the accuracy as a first accuracy;
determining the accuracy as a second accuracy when the distance is greater than the distance threshold, the second accuracy not being equal to the first accuracy;
correspondingly, the accuracy meets a preset condition, including:
the accuracy is the first accuracy.
Optionally, before the determining the second position of the unmanned aerial vehicle according to the visual positioning data of the unmanned aerial vehicle, the method further comprises:
acquiring a ground image acquired by the unmanned aerial vehicle, and performing target object recognition on the ground image;
and determining that the target landing point exists according to the target object recognition result.
Optionally, the method further comprises:
when the unmanned aerial vehicle does not meet the landing condition, acquiring a ground image acquired by the unmanned aerial vehicle;
determining the position of the target landing point through the ground image;
and according to the position of the target landing point, guiding the unmanned aerial vehicle to land at the target landing point through inertial navigation data.
Optionally, the acquiring the ground image acquired by the unmanned aerial vehicle specifically includes:
in the descending process of the unmanned aerial vehicle, acquiring multi-frame ground images at preset time intervals;
correspondingly, determining the position of the target landing point through the ground image specifically comprises:
and respectively carrying out object recognition on two adjacent frames of images in the multi-frame ground image, determining the historical position of the object landing point in the two adjacent frames of images, and carrying out object tracking on the position of the object landing point based on the historical position.
Optionally, during the descent of the unmanned aerial vehicle, the method further comprises:
when the historical position cannot be determined by target object recognition, controlling the unmanned aerial vehicle to stop descending, and re-determining the target landing point.
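The multi-frame tracking and abort behavior described in these optional steps can be sketched as follows; `detect_target` stands in for an unspecified object-recognition model, and all names are illustrative:

```python
def track_descent(frames, detect_target):
    """Track the target landing point across frames captured during descent.

    frames:        ground images sampled at preset time intervals.
    detect_target: callable returning the target's position in a frame,
                   or None when recognition fails.
    Returns (historical_positions, action).
    """
    history = []
    for frame in frames:
        pos = detect_target(frame)  # e.g. pixel (u, v) of the landing pad
        if pos is None:
            # Recognition lost: stop descending and re-determine the point.
            return history, "stop_and_reselect"
        history.append(pos)
    return history, "continue_descent"
```

In a real system the per-frame detections would feed a tracker (the "target tracking based on the historical position" of the claim); here the history list is simply accumulated to keep the sketch short.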
The present specification provides an unmanned aerial vehicle landing device, comprising:
the first determining module is used for determining a first position of the unmanned aerial vehicle according to satellite positioning data of the unmanned aerial vehicle;
the second determining module is used for determining a second position of the unmanned aerial vehicle according to the visual positioning data of the unmanned aerial vehicle when the unmanned aerial vehicle meets the landing condition, wherein the landing condition is determined by the state information of the satellite positioning data;
an accuracy determining module, configured to determine an accuracy of the first position according to a relationship between the first position and the second position;
the first landing module is used for guiding the unmanned aerial vehicle, through its satellite positioning data, to land at the target landing point according to the first position and the position of the target landing point when the accuracy meets the preset condition;
and the second landing module is used for guiding the unmanned aerial vehicle, through its visual positioning data, to land at the target landing point according to the second position and the position of the target landing point when the accuracy does not meet the preset condition.
The present description provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above-described unmanned aerial vehicle landing method.
The present specification provides a drone comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the drone landing method described above when executing the program.
At least one of the technical solutions adopted in the present specification can achieve the following beneficial effects:
In the unmanned aerial vehicle landing method provided in this specification, the first position of the unmanned aerial vehicle is determined from satellite positioning data. When the unmanned aerial vehicle meets the landing condition determined based on the state information, the second position is determined according to the visual positioning data of the unmanned aerial vehicle, and the accuracy of the first position is determined according to the relationship between the first position and the second position. When the accuracy meets the preset condition, the unmanned aerial vehicle is guided to land according to the satellite positioning data; when it does not, the unmanned aerial vehicle is guided to land according to the visual positioning data.
In this method, the accuracy of the first position is determined from the state information of the satellite positioning data and from the visual positioning data, so that a suitable landing mode can be selected according to whether the accuracy meets the preset condition, ensuring the landing safety of the unmanned aerial vehicle.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification, illustrate the exemplary embodiments of the present specification and, together with their description, serve to explain the specification without unduly limiting it. In the drawings:
fig. 1 is a schematic flow chart of a landing method of an unmanned aerial vehicle provided in the present specification;
fig. 2 is a schematic view of a scenario when the unmanned aerial vehicle provided in the present specification is landing;
fig. 3 is a schematic flow chart of the unmanned aerial vehicle landing method provided in the present specification;
fig. 4 is a schematic view of the unmanned aerial vehicle landing device provided in the present specification;
fig. 5 is a schematic view of the unmanned aerial vehicle corresponding to fig. 1 provided in the present specification.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present specification more apparent, the technical solutions of the present specification will be clearly and completely described below with reference to specific embodiments of the present specification and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present specification. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are intended to be within the scope of the present disclosure.
The following describes in detail the technical solutions provided by the embodiments of the present specification with reference to the accompanying drawings.
In general, there is a positioning error between the position determined by the unmanned aerial vehicle from satellite positioning data and its actual position. This error is caused by ionospheric delay and similar factors and is approximately constant within a certain distance range. Therefore, the actual coordinates of the unmanned aerial vehicle can be determined from its satellite positioning data and the positioning error at its location, and the unmanned aerial vehicle can be controlled to land accurately at the target landing point based on those actual coordinates and the actual coordinates of the target landing point.
In practice, however, besides the positioning error, the position determined from satellite positioning data may also be affected by multipath errors caused by too many obstacles, such as buildings, near the unmanned aerial vehicle. The actual position then cannot be determined from the satellite positioning data and the positioning error alone, and the unmanned aerial vehicle cannot be controlled to land accurately at the target landing point. When the unmanned aerial vehicle needs to land in scenarios such as on top of a hangar, its inability to land accurately means it cannot return to the hangar, which creates safety hazards and also reduces delivery efficiency.
Fig. 1 is a schematic flow chart of a landing method of an unmanned aerial vehicle according to an embodiment of the present disclosure.
S100: and determining a first position of the unmanned aerial vehicle according to satellite positioning data of the unmanned aerial vehicle.
Unlike the prior art, which directly determines the position of the unmanned aerial vehicle from its satellite positioning data and the positioning error without considering the influence of the surrounding environment on the satellite positioning data, the present method first has the unmanned aerial vehicle determine the first position from its own satellite positioning data.
In one or more embodiments provided herein, the unmanned aerial vehicle landing method may be performed by the unmanned aerial vehicle itself. Of course, the service provider's server may also determine the actual coordinates of the unmanned aerial vehicle from its satellite positioning data and the like, determine a movement strategy based on those coordinates and the actual coordinates of the target landing point, and thereby control the landing. For convenience of description, the following takes the unmanned aerial vehicle executing the method as an example.
Specifically, the unmanned aerial vehicle may first acquire its own satellite positioning data and take the position corresponding to that data as the position to be corrected.
Secondly, the unmanned aerial vehicle may send an acquisition request to the base station according to the timestamp of its satellite positioning data. According to the request, the base station may send the unmanned aerial vehicle its own satellite positioning data for the same moment as the unmanned aerial vehicle's, together with its actual coordinates.
Then, the unmanned aerial vehicle may determine the positioning error according to the received actual coordinates and satellite positioning data of the base station.
Finally, the unmanned aerial vehicle may calibrate the position to be corrected according to the positioning error; that is, a final calibrated position is determined as the first position of the unmanned aerial vehicle based on the positioning error and the position to be corrected.
Of course, the first position may also be determined by the unmanned aerial vehicle at each moment during flight, and when it needs to land it retrieves the first position for the current moment from the prestored first positions. Similarly, the positioning error at each moment may be determined by the base station from its actual coordinates and the satellite positioning data it determines at that moment, so that the unmanned aerial vehicle can obtain the positioning error for each moment directly from the base station.
Further, when the first position of the unmanned aerial vehicle is determined according to the satellite positioning data of the unmanned aerial vehicle, if the satellite positioning data is a global positioning system (Global Positioning System, GPS) signal, a global navigation satellite system (Global Navigation Satellite System, GNSS) signal, or the like, the unmanned aerial vehicle can determine the first position of the unmanned aerial vehicle according to the received GPS signal and/or GNSS signal, or the like.
S102: and when the unmanned aerial vehicle meets a landing condition, determining a second position of the unmanned aerial vehicle according to the visual positioning data of the unmanned aerial vehicle, wherein the landing condition is determined by the state information of the satellite positioning data.
In one or more embodiments provided herein, only when the satellite positioning data is accurate can the unmanned aerial vehicle perform a precise landing according to it without safety hazards. Since the satellite positioning data has a corresponding accuracy, before the accuracy of the first position is judged against the visual positioning data, whether the unmanned aerial vehicle meets the landing condition can first be judged according to the state information of the satellite positioning data.
Specifically, in the scenario of RTK-guided landing, the satellite positioning data generally has states such as most accurate, generally accurate, and inaccurate, for example a fixed solution state and a floating solution state. The unmanned aerial vehicle may thus judge, according to the state information of the satellite positioning data, whether it meets the landing condition.
If so, the satellite positioning data of the unmanned aerial vehicle may be considered accurate, and the accuracy of the first position can then be further judged using the visual positioning data.
If not, the satellite positioning data may be considered inaccurate, and the unmanned aerial vehicle cannot be guided to land according to it.
The fixed solution state indicates that the satellite positioning data of the unmanned aerial vehicle is in its most accurate state, with centimeter-level error; the floating solution state indicates a generally accurate state, with decimeter-to-meter-level error.
Therefore, when the satellite positioning data is in the most accurate state, it can be determined that the unmanned aerial vehicle satisfies the landing condition; when it is in any other state, it can be determined that the unmanned aerial vehicle does not satisfy the landing condition.
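As a sketch, the landing condition based on RTK solution states might reduce to a check like the following (the status strings are assumptions for illustration, not values from the patent):

```python
def meets_landing_condition(rtk_status):
    """Landing condition derived from the RTK solution state.

    Only a fixed solution (centimeter-level error, the most accurate
    state) passes; a float solution (decimeter-to-meter error) and any
    other state fail the condition.
    """
    return rtk_status == "fixed"
```

In a real receiver the solution state would come from the positioning module's status output rather than a plain string.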
Of course, the state information of the satellite positioning data may also be each precision factor used when the unmanned aerial vehicle performs positioning by GPS signals, for example a clock-error precision factor, a horizontal-component precision factor, a vertical-component precision factor, and the like. It may also be the result of judging, against pre-marked areas where multipath effects may exist, whether the unmanned aerial vehicle is located in such an area during descent. The specific content and form of the state information may be set as required, and this specification does not limit them.
In addition, the landing condition may also be that the satellite positioning data is determined to be accurate according to the current precision factors of the unmanned aerial vehicle, or that the area where the unmanned aerial vehicle is currently located lies outside the areas where multipath effects exist, and so on. The specific content of the landing condition may be set as required, and this specification does not limit it.
Further, in the present description, the accuracy of the first position of the unmanned aerial vehicle may be determined based on the visual positioning data, so that after determining that the unmanned aerial vehicle meets the landing condition, the unmanned aerial vehicle may determine the second position of the unmanned aerial vehicle according to the visual positioning data of the unmanned aerial vehicle itself.
Specifically, the unmanned aerial vehicle may determine, according to the timestamp of the satellite positioning data determined in step S100, the image data captured at the same moment as the satellite positioning data, together with the inertial measurement unit (IMU) data collected between the previous frame and that image data.
Then, the unmanned aerial vehicle may perform target object recognition on the image data, determine the position of the target object, and determine the relative distance between itself and the target landing point through the IMU data.
Finally, the unmanned aerial vehicle may determine its second position based on the coordinates of the target landing point and the relative distance. Of course, it may also determine its current second position based on the position of the target landing point in the image data and the IMU data. The visual positioning data includes the IMU data and the image data.
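One hedged reading of the second-position computation: the known coordinate of the target landing point combined with the visually and inertially estimated offset from the vehicle to that point. The names and the sign convention of the offset are illustrative:

```python
def second_position(target_point, relative_offset):
    """Estimate the drone's position from the target landing point.

    target_point:    known (x, y, z) coordinate of the landing point.
    relative_offset: (x, y, z) vector from the drone to the target,
                     as estimated from image and IMU data.
    """
    # Moving backwards along the offset from the target gives the drone.
    return tuple(t - o for t, o in zip(target_point, relative_offset))
```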
It should be noted that the state information of the satellite positioning data may be carried in the satellite positioning data received by the unmanned aerial vehicle, or may be determined by the unmanned aerial vehicle from the received data; for example, on receiving two paths of GPS signals, the unmanned aerial vehicle may determine that it is in an RTK floating solution state. The specific form and manner of determining the state information may be set as required, and this specification does not limit them.
S104: and determining the accuracy of the first position according to the relation between the first position and the second position.
In one or more embodiments provided in this specification, when the unmanned aerial vehicle lands, the landing method may judge, based on the visual positioning data, whether the position of the unmanned aerial vehicle is accurate, so as to avoid the resource waste and safety hazards of guiding the landing by satellite when the positioning error is large. Thus, the unmanned aerial vehicle may determine the accuracy of the first position based on the relationship between the first position and the second position.
Specifically, the unmanned aerial vehicle may determine the relative distance between itself and the target landing point according to the first position and the prestored position of the target landing point, and judge whether the difference between this relative distance and the relative distance determined based on the IMU data is greater than a preset distance threshold. If it is, the accuracy of the first position may be determined not to satisfy the preset condition; if it is not, the accuracy may be determined to satisfy the preset condition. The visual positioning data includes the IMU data and the image data.
Of course, since judging whether satellite guidance is usable based on positions is more accurate than judging based on relative distances, the unmanned aerial vehicle may also determine its visual positioning coordinates through the IMU data and the image data, then determine the distance between the visual positioning coordinates and the first position, and judge whether that distance is greater than a preset distance threshold. If it is, the accuracy of the first position may be determined not to satisfy the preset condition; if it is not, the accuracy may be determined to satisfy the preset condition.
In addition, the accuracy may include a first accuracy and a second accuracy, with the first accuracy higher than the second. The accuracy satisfying the preset condition may then mean that it is the first accuracy; correspondingly, failing the preset condition may mean that it is the second accuracy.
Further, since multipath effects and the like are caused by environmental factors, areas where multipath effects may exist can be marked in a high-precision map in advance, according to the range in which the unmanned aerial vehicle performs delivery tasks. When the unmanned aerial vehicle needs to land, the accuracy of the first position can then be determined from its current position and those areas.
Specifically, the unmanned aerial vehicle may first determine the position of each designated area from a prestored high-precision map, where each designated area is an area in which multipath effects may exist because there are too many obstacles, such as buildings, around. Each designated area may be calibrated based on the obstacles in the high-precision map; that is, an area where obstacles are too numerous or too large may be marked as a designated area. A designated area may be two-dimensional or three-dimensional; since this specification concerns a landing method, it is generally two-dimensional. Of course, a designated area may also be one with too many electromagnetic devices, where electromagnetic interference may affect the signals received by the unmanned aerial vehicle in flight. The specific rule for determining designated areas may be set as required, and this specification does not limit it.
Then, the unmanned aerial vehicle may judge, according to its first position, whether it is located in a designated area. If so, its accuracy may be determined as the second accuracy, and it cannot be controlled to land at the target landing point according to the first position. If not, its accuracy may be determined as the first accuracy, and it can be controlled to land at the target landing point according to the first position. Of course, this judgment based on the current position and the pre-marked multipath areas may also serve as the step of determining the state information of the satellite positioning data: if the position lies within a designated area, the corresponding state information may be that the satellite positioning data is inaccurate; if it lies outside, that the satellite positioning data is accurate.
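The designated-area check might look like the following sketch, under the simplifying assumption that each area is stored as an axis-aligned rectangle (real areas in a high-precision map could be arbitrary polygons):

```python
def in_designated_area(position, areas):
    """Return True if a 2D position lies in any designated area.

    position: (x, y) horizontal coordinate of the drone.
    areas:    iterable of (xmin, ymin, xmax, ymax) rectangles marking
              regions prone to multipath effects.
    """
    x, y = position
    for (xmin, ymin, xmax, ymax) in areas:
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return True
    return False
```

A drone inside any marked rectangle would be assigned the second (lower) accuracy and would not be guided by satellite positioning.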
In addition, in the field of unmanned aerial vehicle delivery, the unmanned aerial vehicle is generally controlled to land at a target landing point according to its own position and the position of the target landing point; steps such as battery swapping, unloading the delivery, and loading the delivery are then performed at the target landing point. The unmanned aerial vehicle landing method provided in this specification likewise applies to the scenario in which the unmanned aerial vehicle lands at a target landing point. Based on this, the unmanned aerial vehicle may first determine the target landing point at which to land.
In particular, the drone may determine the delivery task it is performing, and then determine the corresponding target landing point according to the execution stage of that task. For example, during the pick-up phase, the drone may determine the current target landing point as the pick-up location; during the delivery phase, as the delivery location. A delivery task in this specification is a task distributed to the unmanned aerial vehicle by the service provider, and may include not only tasks such as express delivery and takeout delivery but also scheduling tasks. The specific delivery task can be set as required, and this specification does not limit the task type.
Of course, the target landing point can also be pre-stored in the unmanned aerial vehicle; when the unmanned aerial vehicle determines that it needs to land, it obtains the pre-stored target landing point and controls itself to descend to that point.
S106: when the accuracy meets the preset condition, guiding the unmanned aerial vehicle to land at the target landing point through the satellite positioning data of the unmanned aerial vehicle, according to the first position and the position of the target landing point.
In one or more embodiments provided in this specification, when the accuracy of the first position of the unmanned aerial vehicle is determined to meet the preset condition, the unmanned aerial vehicle may be controlled to land at the target landing point through satellite-guided landing, according to the first position and the coordinates of the target landing point.
Specifically, during the descent, the unmanned aerial vehicle can acquire its own coordinates at a preset time interval, and control itself to move in the direction approaching the target landing point according to its own coordinates and the actual coordinates of the target landing point, until it lands on the target landing point. The preset time interval may be 1/24 second or 1/60 second; the specific value can be set as required, which is not limited in this specification.
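The periodic satellite-guided descent loop above can be sketched as follows. The `drone.position()` and `drone.move_toward()` calls are a hypothetical flight-control interface invented for illustration, not a real autopilot API.

```python
import math
import time

def rtk_guided_descent(drone, target, interval=1 / 24, tolerance=0.1):
    """Repeatedly read the drone's satellite-derived coordinates at a preset
    time interval and steer toward the target landing point until the
    remaining distance falls below `tolerance` metres."""
    while True:
        x, y, z = drone.position()              # coordinates from satellite data
        dx, dy, dz = target[0] - x, target[1] - y, target[2] - z
        if math.sqrt(dx * dx + dy * dy + dz * dz) < tolerance:
            break                               # reached the target landing point
        drone.move_toward(target)               # move along the approach direction
        time.sleep(interval)                    # preset interval, e.g. 1/24 s
```

A real controller would close the loop through velocity setpoints rather than a blocking sleep, but the structure of "sample position, compare to target, command motion" is the same.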
The target landing point may be a parking apron bearing a beacon such as a two-dimensional code or an apron label, or may be a landing platform corresponding to a hangar. The specific target landing point can be set as required, which is not limited in this specification.
S108: when the accuracy does not meet the preset condition, guiding the unmanned aerial vehicle to land at the target landing point through the visual positioning data of the unmanned aerial vehicle, according to the second position and the position of the target landing point.
In one or more embodiments of this specification, when the accuracy of the first position does not satisfy the preset condition, the unmanned aerial vehicle may control itself to land at the target landing point through vision-guided landing.
Specifically, in one or more embodiments provided in this specification, the unmanned aerial vehicle may acquire images of the target landing point continuously collected during the descent, and track the target landing point determined in step S102 according to the acquired images, so as to adjust its descent trajectory until it lands at the target landing point.
First, images of the target landing point can be continuously collected at a preset time interval while the unmanned aerial vehicle descends, so that the target landing point can be tracked according to the collected images. The preset time interval may be 1/24 second or 1/60 second; the specific value can be set as required, which is not limited in this specification.
Then, the unmanned aerial vehicle can control itself to move in the direction approaching the determined target landing point until it lands at the target landing point.
Of course, because the unmanned aerial vehicle may jitter during the descent, it can collect multiple frames of ground images at the preset time interval, perform target recognition on each pair of adjacent frames, determine the historical position and the current position of the target landing point in those adjacent frames, and track the position of the target landing point based on the historical position, so that the target's position is not lost when the unmanned aerial vehicle jitters.
Of course, when the target landing point cannot be tracked based on its historical position, or its historical position cannot be determined, the unmanned aerial vehicle can control itself to stop descending, i.e., hover, and redetermine the target landing point.
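The frame-to-frame tracking with its hover fallback can be sketched as follows. The `detect` callable stands in for the target-recognition step and is an assumption, as is the `max_jump` jitter threshold; real tracking would use an image detector plus a motion model.

```python
def track_landing_point(detect, frames, max_jump=50.0):
    """Track the target landing point across consecutive frames.

    `detect(frame)` returns the point's pixel position in a frame, or
    None when it cannot be found.  Returns the last tracked position, or
    None to signal that the drone should hover and redetermine the
    target landing point."""
    history = None
    for frame in frames:
        current = detect(frame)
        if current is None:
            return None                      # detection lost: hover
        if history is not None:
            jump = ((current[0] - history[0]) ** 2 +
                    (current[1] - history[1]) ** 2) ** 0.5
            if jump > max_jump:              # implausible jump, e.g. jitter
                return None                  # tracking lost: hover
        history = current                    # historical position for next frame
    return history
```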
Based on the unmanned aerial vehicle landing method shown in fig. 1, a first position of the unmanned aerial vehicle is determined through satellite positioning data; when the unmanned aerial vehicle meets the landing condition determined based on state information, a second position is determined according to its visual positioning data, and the accuracy of the first position is determined according to the relationship between the first position and the second position. When the accuracy meets the preset condition, the unmanned aerial vehicle is guided to land according to the satellite positioning data; when it does not, the unmanned aerial vehicle is guided to land according to the visual positioning data. In this method, the accuracy of the first position is determined through the state information of the satellite positioning data and through the visual positioning data, so that an appropriate landing mode can be selected according to whether the accuracy meets the preset condition, ensuring the landing safety of the unmanned aerial vehicle.
Further, during the descent, to avoid the unexpected situation in which an obstacle, such as a person or a vehicle, suddenly appears at the target landing point and the unmanned aerial vehicle continues to land without detecting it, a distance threshold, such as two meters, can be preset. During the descent, the descent distance of the unmanned aerial vehicle is acquired; when it reaches the preset distance threshold, the unmanned aerial vehicle is controlled to stop descending and collect a ground image at the current height. A target landing point is then redetermined according to that ground image, and the unmanned aerial vehicle is controlled to land according to the redetermined target landing point. The redetermination of the target landing point may be achieved by performing step S102.
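The checkpoint logic above can be sketched as follows. All drone methods (`altitude`, `descend_step`, `stop_descent`, `capture_ground_image`, `redetermine_target`) are hypothetical names standing in for the flight controller and the step-S102 redetermination.

```python
def descend_with_checkpoint(drone, threshold=2.0):
    """Descend step by step; once the drone has descended `threshold`
    metres, pause (hover), capture a ground image at the current height,
    and redetermine the target landing point from it (step S102)."""
    start_alt = drone.altitude()
    while drone.altitude() > 0:
        if start_alt - drone.altitude() >= threshold:
            drone.stop_descent()                      # hover at current height
            image = drone.capture_ground_image()
            return drone.redetermine_target(image)    # re-run step S102
        drone.descend_step()
    return None                                       # reached the ground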
Furthermore, during the descent an obstacle may suddenly enter the target landing point when the unmanned aerial vehicle is already low, so that the target landing point can no longer be determined from the collected ground images. For example, the unmanned aerial vehicle has descended to a height of two meters above the ground when a vehicle suddenly appears at the target landing point. For this case, a first height can be preset: when an obstacle suddenly appears at the target landing point, so that the unmanned aerial vehicle can neither continue to land at the target landing point nor redetermine one from the collected ground images, the unmanned aerial vehicle can fly upward to the preset first height and, after reaching it, redetermine the target landing point from newly collected ground images.
In this specification, when the unmanned aerial vehicle fails to execute both satellite-guided landing and vision-guided landing, it can redetermine a target landing point, fly to the vicinity of the redetermined target landing point, and repeat the above steps until it lands accurately.
In addition, in one or more embodiments provided in this specification, environmental factors may also affect the accuracy and efficiency of the landing; for example, the unmanned aerial vehicle may not be able to fly in force-5 winds. Therefore, the unmanned aerial vehicle may further obtain environment information and determine, according to it, whether to perform the landing operation or the accurate-landing operation.
Further, before determining the second position in step S104, the unmanned aerial vehicle may also judge whether its own visual positioning data is accurate, so as to ensure an accurate landing.
Specifically, the unmanned aerial vehicle can first determine a ground image it has collected, and perform target recognition on the ground image to obtain a recognition result.
Then, the unmanned aerial vehicle can judge, according to the recognition result, whether the target landing point exists in the ground image.
If so, the visual positioning data is correct, and the unmanned aerial vehicle can determine its second position according to the visual positioning data.
If not, the unmanned aerial vehicle can redetermine the target landing point, control itself to fly above the redetermined target landing point, and perform steps S100-S104 described above.
Furthermore, when the accuracy is determined not to meet the preset condition, the unmanned aerial vehicle can be considered unable to perform satellite-guided landing, and can instead be controlled to land at the target landing point through vision guidance, according to the collected image data and IMU data.
In addition, during the landing, the coordinates of the target landing point themselves may be inaccurate. Therefore, the unmanned aerial vehicle can also determine a first coordinate of the target landing point according to the collected image data, the IMU data, and the determined second position, determine a second coordinate of the target landing point according to the pre-stored position information of the target landing point, and then judge whether the distance between the first and second coordinates is smaller than a preset distance threshold. If it is, the pre-stored position information of the target landing point can be considered accurate, and the unmanned aerial vehicle can be controlled to land based on it. If it is not, the pre-stored position information can be considered inaccurate, and the unmanned aerial vehicle can land using vision guidance, so that it still lands at the target landing point.
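The consistency check above reduces to a distance comparison between two coordinate estimates of the same landing point. A minimal sketch, in which the function name and the one-metre default threshold are assumptions:

```python
import math

def landing_point_reliable(visual_coord, stored_coord, threshold=1.0):
    """Compare the landing-point coordinate estimated from image/IMU data
    with the pre-stored coordinate; trust the stored position only when
    the two agree to within `threshold` metres."""
    return math.dist(visual_coord, stored_coord) < threshold
```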
It should be noted that, when the first position determined according to the satellite positioning data is invalid, i.e., inaccurate, the position information of the target landing point is inaccurate, and landing according to the visual positioning data also fails, i.e., the position of the target landing point cannot be identified, the unmanned aerial vehicle may hover, redetermine the target landing point, and descend according to the redetermined target landing point; or it may return along the route of the task being executed. The specific policy may be set as needed, which is not limited in this specification.
The present disclosure also provides an embodiment of a method for landing an unmanned aerial vehicle, specifically as shown in fig. 2.
Fig. 2 is a schematic view of a landing scenario of an unmanned aerial vehicle provided in this specification, including a target landing point, a server, and the unmanned aerial vehicle. The landing environment parameters around the target landing point can be determined in advance and uploaded to the server. The landing environment parameters may include whether the target landing point is within a predetermined designated area.
The server can determine and store the correspondence between the identifier of each target landing point and its landing environment parameters according to the received parameters. When the unmanned aerial vehicle needs to land, the server returns the landing environment parameters and the coordinates of the target landing point to the unmanned aerial vehicle, according to the identifier of the target landing point carried in the acquisition request sent by the unmanned aerial vehicle.
The unmanned aerial vehicle can judge, according to the landing environment parameters returned by the server, whether its target landing point is outside the designated area.
If so, the unmanned aerial vehicle can determine its own coordinates from its onboard RTK device and the collected satellite positioning data, and land accurately at the target landing point through satellite guidance, according to the determined coordinates and the received coordinates of the target landing point.
If not, the unmanned aerial vehicle can collect images below itself through its onboard image recognition equipment and identify the position of the visual beacon in the images, so as to control itself, together with IMU data, to land at the target landing point according to the position of the visual beacon. The visual beacon is a mark placed at the target landing point that enables the unmanned aerial vehicle to recognize the position of the target landing point in the images it collects.
In addition, since the unmanned aerial vehicle generally descends vertically, the landing environment parameters it receives describe the surroundings of the target landing point. In a scenario where the unmanned aerial vehicle flies diagonally downward toward the target landing point, determining only the environment parameters around the target landing point is obviously not enough. The unmanned aerial vehicle can therefore determine a flight path from its current position to the target landing point according to the received coordinates of the target landing point and its own coordinates, and judge, according to a pre-stored high-precision map on which the designated areas are marked, whether any designated area lies on the flight path. Of course, this flight-path determination may also be performed by the server.
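The flight-path check above can be approximated by sampling points along the straight segment between the drone and the target landing point. This is a simplified sketch: the function name, the rectangle representation of the designated areas as `(min_x, min_y, max_x, max_y)` tuples, and the sample count are all assumptions; exact segment/rectangle clipping could replace the sampling.

```python
def path_crosses_designated_area(start, end, areas, samples=100):
    """Return True when the straight-line path from `start` to `end`
    passes through any designated area, each given as an axis-aligned
    rectangle (min_x, min_y, max_x, max_y)."""
    for i in range(samples + 1):
        t = i / samples
        # Interpolate a point along the segment.
        x = start[0] + t * (end[0] - start[0])
        y = start[1] + t * (end[1] - start[1])
        for (x0, y0, x1, y1) in areas:
            if x0 <= x <= x1 and y0 <= y <= y1:
                return True
    return False
```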
The specific content of the landing environment parameters, and how to determine whether RTK is usable based on them, can be set as desired, which is not limited in this specification.
Based on the flowchart of the unmanned aerial vehicle landing method shown in fig. 1, the present specification also provides a detailed flowchart of the unmanned aerial vehicle landing method shown in fig. 3.
Fig. 3 is a schematic flow chart of the unmanned aerial vehicle landing method provided in this specification. As shown in the figure, when the unmanned aerial vehicle needs to land, it can first determine the target landing point and be controlled to descend toward it, determining its first position according to the received satellite positioning data.
After determining its first position, the unmanned aerial vehicle can judge whether it meets the landing condition, i.e., whether the RTK has a fixed solution. The landing condition is determined based on the state information of the satellite positioning data.
If so, the RTK state can be considered good and the unmanned aerial vehicle meets the landing condition; it can then be controlled to land according to the coordinates of the target landing point and its first position, i.e., RTK landing is adopted.
If not, the RTK state can be considered poor and the unmanned aerial vehicle does not meet the landing condition, so it can descend using vision guidance according to the collected image data and IMU data.
Further, when the RTK state is good, the unmanned aerial vehicle can judge whether the coordinates of the landing point are reliable. Specifically, it can judge whether it is located outside the predetermined designated area, and thereby whether its first position is accurate. The designated area is a predetermined area where a multipath effect may exist.
When the first position is outside the designated area, the accuracy of the first position of the drone may be considered high, i.e., the coordinates of the landing point are reliable, and the drone may continue to perform RTK landings.
When the first position is located in the designated area, the first position may be considered insufficiently accurate under the influence of the multipath effect, i.e., the coordinates of the landing point are unreliable, so the unmanned aerial vehicle can descend through vision-guided landing.
Furthermore, when the unmanned aerial vehicle lands using RTK, the pre-marked designated areas may not cover all locations where the satellite positioning data is inaccurate. The unmanned aerial vehicle can therefore judge, according to the visual navigation data, whether its first position is accurate, i.e., whether the longitude, latitude, and altitude of the landing point are accurate. To do so, it can determine its second position according to the visual positioning data and judge whether the difference between the first position and the second position is smaller than a preset difference threshold.
If the unmanned aerial vehicle determines that the difference between the first position and the second position is smaller than the preset difference threshold, then, on the premise that the second position determined from the visual positioning data is considered more accurate, the accuracy of the first position is high. That is, the coordinates of the landing point determined from the visual positioning data and the satellite positioning data are reliable; therefore, the unmanned aerial vehicle can determine that the accuracy of the first position satisfies the preset condition, and continue to land with RTK.
If the difference between the first position and the second position is greater than the preset difference threshold, it can be determined that the accuracy of the first position does not meet the preset condition. That is, the coordinates of the landing point determined from the visual positioning data and the satellite positioning data are not reliable; therefore, the unmanned aerial vehicle can descend through vision-guided landing.
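The decision between RTK and vision-guided landing described above can be sketched as a single comparison. The function name and the half-metre default threshold are assumptions for illustration.

```python
import math

def choose_guidance(first_pos, second_pos, diff_threshold=0.5):
    """Choose between RTK (satellite) and vision-guided landing by
    comparing the satellite-derived first position with the visually
    derived second position."""
    if math.dist(first_pos, second_pos) < diff_threshold:
        return "rtk"      # positions agree: accuracy meets the condition
    return "vision"       # positions disagree: fall back to vision guidance
```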
In addition, when the unmanned aerial vehicle lands using vision guidance, the target landing point may be occluded so that it cannot be determined. Therefore, during the descent the unmanned aerial vehicle can continuously judge whether the target landing point can still be recognized, i.e., whether the vision-guided landing is proceeding normally.
If so, the drone may continue to perform vision guided descent.
If not, the unmanned aerial vehicle can hover and redetermine the target landing point, or return.
Furthermore, before descending, the unmanned aerial vehicle can obtain the surrounding environment information, such as weather conditions and wind level, as well as information on whether the target landing point is operating normally. If the environment is bad or the target landing point cannot work normally, the unmanned aerial vehicle can redetermine the target landing point or return.
It should be noted that the above description takes RTK-guided landing as an example of satellite-guided landing, but satellite-guided landing also covers satellite positioning technologies such as dual-frequency satellite positioning and differential satellite positioning. The specific satellite positioning data used to guide the unmanned aerial vehicle to the target landing point can be set as needed, which is not limited in this specification.
Based on the same concept as the unmanned aerial vehicle landing method provided by one or more embodiments of this specification, this specification further provides a corresponding unmanned aerial vehicle landing device, as shown in fig. 4.
Fig. 4 is a landing device of an unmanned aerial vehicle provided in the present specification, including:
a first determining module 200, configured to determine a first position of the unmanned aerial vehicle according to satellite positioning data of the unmanned aerial vehicle.
And a second determining module 202, configured to determine, when the unmanned aerial vehicle meets a landing condition, a second position of the unmanned aerial vehicle according to the visual positioning data of the unmanned aerial vehicle, where the landing condition is determined by state information of the satellite positioning data.
An accuracy determining module 204, configured to determine an accuracy of the first position according to a relationship between the first position and the second position.
And the first landing module 206 is configured to guide, when the accuracy meets a preset condition, the unmanned aerial vehicle to land to the target landing point according to the first position and the position of the target landing point and through satellite positioning data of the unmanned aerial vehicle.
And a second landing module 208, configured to, when the accuracy does not meet the preset condition, guide the unmanned aerial vehicle to land to the target landing point according to the second position and the position of the target landing point and through the visual positioning data of the unmanned aerial vehicle.
The device further comprises:
And a third landing module 210, configured to acquire a ground image collected by the unmanned aerial vehicle when the unmanned aerial vehicle does not meet the landing condition, determine the position of the target landing point according to the ground image, and guide the unmanned aerial vehicle to land at the target landing point according to the position of the target landing point and inertial navigation data.
Optionally, the first determining module 200 is configured to determine a position to be corrected of the unmanned aerial vehicle according to the satellite positioning data of the unmanned aerial vehicle, determine a positioning error according to the satellite positioning data of a base station communicating with the unmanned aerial vehicle and the position of the base station, and calibrate the position to be corrected according to the positioning error to obtain the first position.
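The calibration described above can be sketched as a simplified differential correction: the base station's measurement error (measured position minus known surveyed position) is subtracted from the drone's raw fix. This is an illustrative sketch only, not a full RTK carrier-phase solver, and all names are invented.

```python
def rtk_corrected_position(raw_position, base_measured, base_known):
    """Calibrate the drone's raw satellite fix (position to be corrected)
    using the error observed at a base station whose true position is
    known, yielding the first position."""
    # Positioning error observed at the base station.
    error = tuple(m - k for m, k in zip(base_measured, base_known))
    # Apply the same correction to the drone's raw fix.
    return tuple(p - e for p, e in zip(raw_position, error))
```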
Optionally, the accuracy determining module 204 is configured to determine a distance between the first position and the second position, determine the accuracy to be the first accuracy when the distance is not greater than a preset distance threshold, and determine the accuracy to be the second accuracy when the distance is greater than the distance threshold, the second accuracy being unequal to the first accuracy. Correspondingly, the accuracy satisfying the preset condition is specifically: the accuracy is the first accuracy.
Optionally, the second determining module 202 is configured to acquire a ground image acquired by the unmanned aerial vehicle, identify a target object of the ground image, and determine that the target landing point exists according to a result of identifying the target object.
Optionally, the third landing module 210 is configured to collect multiple frames of ground images at a preset time interval during the descent process of the unmanned aerial vehicle, respectively identify objects in two adjacent frames of images in the multiple frames of ground images, determine a historical position of a target landing point in the two adjacent frames of images, and track the position of the target landing point based on the historical position.
Optionally, the third landing module 210 is configured to control the unmanned aerial vehicle to stop descending and redetermine the target landing point when the historical position cannot be determined through target recognition.
The present specification also provides a computer readable storage medium storing a computer program operable to perform the unmanned aerial vehicle landing method provided in fig. 1 above.
The specification also provides a schematic block diagram of the unmanned aerial vehicle shown in fig. 5. As shown in fig. 5, the unmanned aerial vehicle includes a processor, an internal bus, a network interface, a memory and a nonvolatile memory, and may include hardware required by other services. The processor reads the corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to realize the unmanned aerial vehicle landing method described in fig. 1. Of course, other implementations, such as logic devices or combinations of hardware and software, are not excluded from the present description, that is, the execution subject of the following processing flows is not limited to each logic unit, but may be hardware or logic devices.
In the 1990s, an improvement to a technology could clearly be distinguished as an improvement in hardware (e.g., an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). However, with the development of technology, many improvements of method flows today can be regarded as direct improvements of hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD) (e.g., a field programmable gate array (Field Programmable Gate Array, FPGA)) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a PLD without requiring the chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, nowadays, instead of manually manufacturing integrated circuit chips, such programming is mostly implemented using "logic compiler" software, which is similar to the software compiler used in program development; the original code before compiling must also be written in a specific programming language, called a hardware description language (Hardware Description Language, HDL). There is not just one HDL but many kinds, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing the logic method flow can be readily obtained by merely slightly programming the method flow into an integrated circuit using several of the hardware description languages described above.
The controller may be implemented in any suitable manner; for example, the controller may take the form of a microprocessor or processor and a computer readable medium storing computer readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller. Examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art will also appreciate that, in addition to implementing the controller in pure computer readable program code, it is entirely possible to implement the same functionality by logically programming the method steps such that the controller takes the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may thus be regarded as a kind of hardware component, and the means included therein for performing various functions may also be regarded as structures within the hardware component. Or even the means for achieving the various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module, or unit set forth in the above embodiments may be implemented by a computer chip or entity, or by a product having a certain function. A typical implementation is a computer. Specifically, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described in terms of their functions, divided into various units. Of course, when implementing this specification, the functions of the units may be implemented in one or more pieces of software and/or hardware.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include computer-readable media in the form of volatile memory, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, Phase-Change Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
This specification may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. This specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including memory storage devices.
In this specification, the embodiments are described in a progressive manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, since the system embodiments are substantially similar to the method embodiments, their description is relatively brief; for relevant details, reference may be made to the description of the method embodiments.
The foregoing is merely an embodiment of this specification and is not intended to limit it. Various modifications and variations of this specification will be apparent to those skilled in the art. Any modification, equivalent substitution, improvement, or the like made within the spirit and principles of this specification shall be included within the scope of the claims of this specification.

Claims (10)

1. A method of unmanned aerial vehicle landing, comprising:
determining a first position of the unmanned aerial vehicle according to satellite positioning data of the unmanned aerial vehicle;
when the unmanned aerial vehicle meets a landing condition, determining a second position of the unmanned aerial vehicle according to the visual positioning data of the unmanned aerial vehicle, wherein the landing condition is determined by the state information of the satellite positioning data;
determining the accuracy of the first position according to the relationship between the first position and the second position;
when the accuracy meets a preset condition, guiding, through the satellite positioning data of the unmanned aerial vehicle, the unmanned aerial vehicle to land at the target landing point according to the first position and the position of the target landing point;
and when the accuracy does not meet the preset condition, guiding, through the visual positioning data of the unmanned aerial vehicle, the unmanned aerial vehicle to land at the target landing point according to the second position and the position of the target landing point.
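Claims 1 and 3 together describe a mode-selection rule: compare the satellite-derived first position against the vision-derived second position and pick the guidance source accordingly. A minimal illustrative sketch, interpreting "the accuracy meets the preset condition" as the two fixes agreeing within the distance threshold; the threshold value, coordinate representation, and function names are assumptions, not taken from the patent:

```python
import math

DIST_THRESHOLD_M = 1.5  # hypothetical preset distance threshold, in metres


def distance(p, q):
    """Euclidean distance between two (x, y, z) positions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))


def choose_guidance(first_pos, second_pos):
    """Return which positioning source should guide the landing.

    first_pos:  position derived from satellite positioning data
    second_pos: position derived from visual positioning data
    The satellite fix is used only when its accuracy, judged by its
    agreement with the visual fix, meets the preset condition.
    """
    if distance(first_pos, second_pos) <= DIST_THRESHOLD_M:
        return "satellite"  # accuracy meets the preset condition
    return "vision"         # fall back to visual positioning
```

The same comparison realises claim 3's two accuracy levels: one level when the distance exceeds the threshold, the other when it does not.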
2. The method according to claim 1, wherein determining the first position of the unmanned aerial vehicle according to the satellite positioning data of the unmanned aerial vehicle specifically comprises:
determining the position to be corrected of the unmanned aerial vehicle according to satellite positioning data of the unmanned aerial vehicle;
determining a positioning error according to satellite positioning data of a base station communicating with the unmanned aerial vehicle and the position of the base station;
and calibrating the position to be corrected according to the positioning error to obtain the first position.
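Claim 2 describes a standard differential-correction step: the error of the satellite solution observed at a base station with a known surveyed position is removed from the vehicle's raw fix. An illustrative sketch under that reading; the 2-D tuples and function names are assumptions for illustration:

```python
def positioning_error(base_satellite_fix, base_known_position):
    """Error vector of the satellite solution, observed at the base station."""
    return tuple(m - t for m, t in zip(base_satellite_fix, base_known_position))


def correct_position(raw_fix, error):
    """Apply the base-station error to the vehicle's position to be corrected."""
    return tuple(r - e for r, e in zip(raw_fix, error))


# The base station's satellite fix reads 0.5 m east of its surveyed position,
# so the same offset is removed from the vehicle's raw fix.
err = positioning_error((100.5, 50.0), (100.0, 50.0))
first_position = correct_position((220.5, 310.0), err)  # -> (220.0, 310.0)
```

This assumes the base station is close enough to the vehicle that both observe essentially the same satellite error.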
3. The method of claim 1, wherein determining the accuracy of the first position according to the relationship between the first position and the second position specifically comprises:
determining a distance between the first position and the second position;
when the distance is greater than a preset distance threshold, determining the accuracy as a first accuracy;
determining the accuracy as a second accuracy when the distance is not greater than the distance threshold, the second accuracy not being equal to the first accuracy;
correspondingly, that the accuracy meets the preset condition comprises:
the accuracy being the first accuracy.
4. The method of claim 1, wherein, before determining the second position of the unmanned aerial vehicle according to the visual positioning data of the unmanned aerial vehicle, the method further comprises:
acquiring a ground image collected by the unmanned aerial vehicle and performing target object recognition on the ground image;
and determining, according to the target object recognition result, that the target landing point exists.
5. The method of claim 1, wherein the method further comprises:
when the unmanned aerial vehicle does not meet the landing condition, acquiring a ground image acquired by the unmanned aerial vehicle;
determining the position of the target landing point from the ground image;
and according to the position of the target landing point, guiding the unmanned aerial vehicle through inertial navigation data to land at the target landing point.
6. The method of claim 5, wherein acquiring the ground image collected by the unmanned aerial vehicle specifically comprises:
during the descent of the unmanned aerial vehicle, acquiring multiple frames of ground images at preset time intervals;
correspondingly, determining the position of the target landing point through the ground image specifically comprises:
and performing target object recognition on each two adjacent frames among the multiple frames of ground images, determining the historical position of the target landing point in the two adjacent frames, and performing target tracking on the position of the target landing point based on the historical position.
7. The method of claim 6, wherein, during the descent of the unmanned aerial vehicle, the method further comprises:
and when the historical position cannot be determined through target object recognition, controlling the unmanned aerial vehicle to stop descending and re-determining the target landing point.
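Claims 5 to 7 describe vision-and-inertial guidance when the satellite landing condition is not met: the landing point is recognised in successive ground images, its position is tracked from the history, and the descent aborts when recognition fails. A schematic sketch of that loop; the detector is a stand-in, and all names are illustrative rather than from the patent:

```python
def track_landing_point(frames, detect):
    """Track the target landing point across successive ground images.

    frames: iterable of ground images captured at preset time intervals
    detect: stand-in recognition function returning the landing point's
            position in a frame, or None when recognition fails
    Returns the list of historical positions; raises RuntimeError to
    signal that the descent should stop and the landing point be
    re-determined, as in claim 7.
    """
    history = []
    for frame in frames:
        pos = detect(frame)
        if pos is None:
            raise RuntimeError("recognition failed: stop descent, "
                               "re-determine the target landing point")
        history.append(pos)
    return history


def predict_next(history):
    """Constant-velocity prediction from the last two tracked positions."""
    if len(history) < 2:
        return history[-1] if history else None
    (x0, y0), (x1, y1) = history[-2], history[-1]
    return (2 * x1 - x0, 2 * y1 - y0)
```

In practice the prediction step would be replaced by a proper tracker (e.g. a Kalman filter); the constant-velocity extrapolation here only illustrates tracking from the positions in two adjacent frames.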
8. An unmanned aerial vehicle landing device, comprising:
the first determining module is used for determining a first position of the unmanned aerial vehicle according to satellite positioning data of the unmanned aerial vehicle;
the second determining module is used for determining a second position of the unmanned aerial vehicle according to the visual positioning data of the unmanned aerial vehicle when the unmanned aerial vehicle meets the landing condition, wherein the landing condition is determined by the state information of the satellite positioning data;
the accuracy determining module is used for determining the accuracy of the first position according to the relationship between the first position and the second position;
the first landing module is used for guiding, through the satellite positioning data of the unmanned aerial vehicle, the unmanned aerial vehicle to land at the target landing point according to the first position and the position of the target landing point when the accuracy meets the preset condition;
and the second landing module is used for guiding, through the visual positioning data of the unmanned aerial vehicle, the unmanned aerial vehicle to land at the target landing point according to the second position and the position of the target landing point when the accuracy does not meet the preset condition.
9. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any one of claims 1 to 7.
10. An unmanned aerial vehicle comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 7 when executing the program.
CN202111370004.5A 2021-11-18 2021-11-18 Unmanned aerial vehicle landing method and device Pending CN116136695A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111370004.5A CN116136695A (en) 2021-11-18 2021-11-18 Unmanned aerial vehicle landing method and device


Publications (1)

Publication Number Publication Date
CN116136695A true CN116136695A (en) 2023-05-19

Family

ID=86333183

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111370004.5A Pending CN116136695A (en) 2021-11-18 2021-11-18 Unmanned aerial vehicle landing method and device

Country Status (1)

Country Link
CN (1) CN116136695A (en)

Similar Documents

Publication Publication Date Title
US20190137274A1 (en) Device, method, and program
CN109823552B (en) Vision-based unmanned aerial vehicle accurate landing method, storage medium, device and system
CN111142559A (en) Aircraft autonomous navigation method and system and aircraft
US8930130B2 (en) Method for constructing a trajectory of an aircraft by state vector
EP3454016B1 (en) Automatic flight control systems and methods
CN108681335A (en) Imitative ground flying method and device of the plant protection drone in hillside fields
CN111309053B (en) Unmanned aerial vehicle control method, unmanned aerial vehicle return control method, unmanned aerial vehicle, medium and control system
CN113050664A (en) Unmanned aerial vehicle landing method and device
CN111238450A (en) Visual positioning method and device
CN114295119A (en) Map construction method and device
CN111024084A (en) Automatic driving method, device, equipment and storage medium for automatic driving vehicle
CN107764258B (en) Navigation management method of flight management system
CN110567467A (en) map construction method and device based on multiple sensors and storage medium
CN112051857A (en) Switching method of positioning system in dynamic recovery of vehicle-mounted unmanned aerial vehicle
CN112119428A (en) Method, device, unmanned aerial vehicle, system and storage medium for acquiring landing position
CN116136695A (en) Unmanned aerial vehicle landing method and device
CN112883871B (en) Model training and unmanned vehicle motion strategy determining method and device
CN116300842A (en) Unmanned equipment control method and device, storage medium and electronic equipment
CN113031656B (en) Unmanned aerial vehicle control method and device
CN114440902A (en) Method and device for constructing elevation map
CA3175666A1 (en) Systems and methods for mobile aerial flight planning and image capturing based on structure footprints
CN205353775U (en) Unmanned aerial vehicle
CN112631336A (en) Control method, system and device of unmanned aerial vehicle
CN116184466A (en) Method and device for determining landing point of unmanned aerial vehicle
CN114322987B (en) Method and device for constructing high-precision map

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination