CN112884900A - Landing positioning method and device for unmanned aerial vehicle, storage medium and unmanned aerial vehicle nest


Info

Publication number
CN112884900A
Authority
CN
China
Prior art keywords: unmanned aerial vehicle, attitude, laser radar, attitude correction
Prior art date: 2021-02-10
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110185350.XA
Other languages
Chinese (zh)
Inventor
左欢金 (Zuo Huanjin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Golden Starlight Intelligent Technology Co ltd
Original Assignee
Guangdong Golden Starlight Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2021-02-10
Filing date: 2021-02-10
Publication date: 2021-06-01
Application filed by Guangdong Golden Starlight Intelligent Technology Co ltd
Priority to CN202110185350.XA
Publication of CN112884900A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/20 Finite element generation, e.g. wire-frame surface description, tessellation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C 21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The embodiments of the present application disclose a landing positioning method and device for an unmanned aerial vehicle, a storage medium, and an unmanned aerial vehicle nest. The method comprises: during the landing of the unmanned aerial vehicle, when it enters the scanning range of a laser radar provided on the unmanned aerial vehicle nest, scanning the unmanned aerial vehicle with the laser radar to obtain a point cloud image; estimating the current attitude of the unmanned aerial vehicle from the point cloud data; and computing an attitude correction parameter between the current attitude and a target attitude, whereupon the nest instructs the unmanned aerial vehicle to correct its attitude according to the attitude correction parameter. The unmanned aerial vehicle can thereby be controlled to land smoothly on the nest without an on-board inertial measurement unit, reducing its hardware cost and weight.

Description

Landing positioning method and device for unmanned aerial vehicle, storage medium and unmanned aerial vehicle nest
Technical Field
The present application relates to the field of positioning, and in particular to a landing positioning method and device for an unmanned aerial vehicle, a storage medium and an unmanned aerial vehicle nest.
Background
With the wide application of unmanned aerial vehicles across industries, and because different industries impose different requirements, unmanned aerial vehicles need better performance in terms of flight speed, energy consumption, positioning accuracy and the like.
An unmanned aerial vehicle nest is the platform from which an unmanned aerial vehicle takes off and lands. To land precisely on the nest, most unmanned aerial vehicles integrate an inertial measurement unit, such as a gyroscope, to assist an accurate and smooth landing. However, as unmanned aerial vehicles develop towards miniaturization, their battery capacity and size are limited, and improving endurance while reducing hardware cost is a problem to be solved urgently.
Disclosure of Invention
The embodiments of the present application provide a landing positioning method and device for an unmanned aerial vehicle, a storage medium and an unmanned aerial vehicle nest, which can solve the problems of high hardware cost and structural complexity caused by unmanned aerial vehicles in the related art relying on an inertial measurement unit to assist landing. The technical scheme is as follows:
In a first aspect, an embodiment of the present application provides a landing positioning method for an unmanned aerial vehicle, the method comprising:
detecting, during the landing of the unmanned aerial vehicle, whether the unmanned aerial vehicle has entered the scanning range of a laser radar;
if so, scanning the unmanned aerial vehicle with the laser radar to obtain a point cloud image;
determining the current attitude of the unmanned aerial vehicle in a laser radar coordinate system according to the point cloud image, and calculating an attitude correction parameter between the current attitude and a target attitude; and
sending an attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle, wherein the attitude correction instruction is used for instructing the unmanned aerial vehicle to carry out attitude correction according to the attitude correction parameters.
In a second aspect, an embodiment of the present application provides a landing positioning device for an unmanned aerial vehicle, comprising:
a detection unit, configured to detect, during the landing of the unmanned aerial vehicle, whether the unmanned aerial vehicle has entered the scanning range of the laser radar;
a scanning unit, configured to scan the unmanned aerial vehicle with the laser radar to obtain a point cloud image if the unmanned aerial vehicle is detected within the scanning range;
a calculation unit, configured to determine the current attitude of the unmanned aerial vehicle in a laser radar coordinate system according to the point cloud image and to calculate an attitude correction parameter between the current attitude and a target attitude; and
a sending unit, configured to send an attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle, wherein the attitude correction instruction is used for instructing the unmanned aerial vehicle to carry out attitude correction according to the attitude correction parameters.
In a third aspect, an embodiment of the present application provides a computer storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the above method steps.
In a fourth aspect, an embodiment of the present application provides an unmanned aerial vehicle nest, which may include: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor to perform the above method steps.
The beneficial effects brought by the technical solutions provided in some embodiments of the present application include at least the following:
during the landing of the unmanned aerial vehicle, when it enters the scanning range of the laser radar provided on the unmanned aerial vehicle nest, the laser radar scans the unmanned aerial vehicle to obtain a point cloud image, the current attitude of the unmanned aerial vehicle is estimated from the point cloud data, and the attitude correction parameter between the current attitude and the target attitude is computed; the nest then instructs the unmanned aerial vehicle to carry out attitude correction according to the attitude correction parameter. Attitude correction is thus achieved without providing an inertial measurement unit in the unmanned aerial vehicle, ensuring that it lands smoothly on the nest, freeing the space the inertial measurement unit would occupy, avoiding the power the unit would consume, lightening the unmanned aerial vehicle, and reducing its hardware cost.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a network structure diagram provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of a landing positioning method for an unmanned aerial vehicle according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a landing positioning device of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an apparatus according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application clearer, the embodiments of the present application are described in further detail below with reference to the accompanying drawings.
When the following description refers to the accompanying drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
In the description of the present application, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meaning of these terms in the present application can be understood by those of ordinary skill in the art on a case-by-case basis. Further, in the description of the present application, "a plurality" means two or more unless otherwise specified. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. The character "/" generally indicates an "or" relationship between the preceding and following associated objects.
Referring to fig. 1, the network architecture provided by an embodiment of the present application includes a mobile terminal 11, an unmanned aerial vehicle nest 12 and unmanned aerial vehicles 13. There may be a plurality of unmanned aerial vehicles 13, which can form a cluster that cooperates to complete tasks.
The unmanned aerial vehicle nest 12 provides a parking platform for the unmanned aerial vehicle 13 and can control its takeoff and landing. After the unmanned aerial vehicle 13 has taken off successfully, the mobile terminal 11 can take over control of it and govern its flight speed and direction, or the unmanned aerial vehicle 13 can start an automatic flight mode and control such parameters itself.
WIFI communication can be adopted between the mobile terminal 11 and the unmanned aerial vehicle nest 12, and likewise between the nest 12 and the unmanned aerial vehicle 13. The nest 12 can also serve as a relay node, forwarding image data collected by the unmanned aerial vehicle 13 to the mobile terminal 11 to extend the communication distance. The mobile terminal 11 may be a mobile phone, a tablet computer, a notebook computer, or the like.
Based on the network architecture of fig. 1, please refer to fig. 2, which is a schematic flowchart of a landing positioning method for an unmanned aerial vehicle according to an embodiment of the present application. As shown in fig. 2, the method of this embodiment may include the following steps:
S201: during the landing of the unmanned aerial vehicle, detect whether the unmanned aerial vehicle has entered the scanning range of the laser radar.
The mobile terminal can send a landing instruction to the unmanned aerial vehicle through the unmanned aerial vehicle nest, the nest serving as a relay node for communication between the mobile terminal and the unmanned aerial vehicle. The landing instruction carries the position information of the nest, and in response the unmanned aerial vehicle moves towards the nest according to that position information. The nest is provided with a laser radar that has a certain scanning range composed of a horizontal scanning range and a vertical scanning range. For example, the laser radar can be set at the landing mark of the nest and emit laser beams towards the sky, so that its scanning range is conical. When the nest detects that the unmanned aerial vehicle has entered the scanning range of the laser radar, it starts the laser radar for periodic scanning; the laser radar thus does not run while no unmanned aerial vehicle is detected, which reduces its power consumption.
The nest detects whether the unmanned aerial vehicle has entered the scanning range of the laser radar as follows: the nest pre-stores or is provided with a three-dimensional coordinate set representing the scanning range of the laser radar, and it acquires the three-dimensional coordinates of the unmanned aerial vehicle in real time. The unmanned aerial vehicle and the nest express position information in the same laser radar coordinate system, whose origin can be the position of the landing mark, with the xy plane horizontal and the z axis perpendicular to the horizontal plane. When the nest detects that the three-dimensional coordinate of the unmanned aerial vehicle falls within the three-dimensional coordinate set of the laser radar, it determines that the unmanned aerial vehicle has entered the scanning range of the laser radar.
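A minimal Python sketch of this membership test follows; it replaces the pre-stored coordinate set with an analytic test against an upward-pointing cone, and the half-angle and maximum range are assumed values for illustration, not parameters disclosed by the application:

    import math

    # Assumed cone parameters: the laser radar sits at the landing mark
    # (the origin), points straight up along the z axis, and sweeps a cone.
    CONE_HALF_ANGLE_DEG = 30.0  # vertical scanning range (assumed)
    MAX_RANGE_M = 40.0          # maximum radar range (assumed)

    def in_scanning_range(x: float, y: float, z: float) -> bool:
        """True if the drone's laser-radar-frame coordinate lies in the scan cone."""
        if z <= 0.0:
            return False  # the drone must be above the horizontal xy plane
        dist = math.sqrt(x * x + y * y + z * z)
        if dist > MAX_RANGE_M:
            return False
        # Angle between the drone's position vector and the vertical z axis.
        off_axis_deg = math.degrees(math.acos(z / dist))
        return off_axis_deg <= CONE_HALF_ANGLE_DEG

    # Example: a drone 10 m up and 3 m off-centre lies inside the cone.
    print(in_scanning_range(3.0, 0.0, 10.0))  # True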
S202: if so, scan the unmanned aerial vehicle with the laser radar to obtain a point cloud image.
If the detection result of S201 is positive, the unmanned aerial vehicle is scanned by the laser radar. The scanning duration is one frame, and the unmanned aerial vehicle nest can scan the unmanned aerial vehicle several times within one frame to obtain several point cloud images.
In one or more embodiments, the point cloud image may contain obstacles other than the unmanned aerial vehicle. The point cloud data collected by the laser radar is therefore filtered, and the region of the unmanned aerial vehicle is segmented from the point cloud image. The segmentation can use a neural network method, identifying the region of the unmanned aerial vehicle in the point cloud image with a pre-trained background segmentation model, which improves the accuracy of the subsequent attitude estimation.
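The application relies on a pre-trained background segmentation model for this step; as a hedged illustration only, the sketch below substitutes a simple geometric crop plus a statistical outlier filter, with all thresholds assumed:

    import numpy as np

    def segment_drone(points: np.ndarray) -> np.ndarray:
        """points: (N, 3) laser radar returns in the radar frame; returns the
        subset of points attributed to the drone."""
        # Keep returns above the pad and within a loose horizontal window,
        # discarding the ground and nearby structures (bounds are assumed).
        mask = (points[:, 2] > 0.5) \
            & (np.abs(points[:, 0]) < 5.0) \
            & (np.abs(points[:, 1]) < 5.0)
        cand = points[mask]
        if cand.shape[0] == 0:
            return cand
        # Statistical outlier removal: drop points far from the centroid of
        # the candidate cluster (2-sigma cut, assumed).
        centroid = cand.mean(axis=0)
        dist = np.linalg.norm(cand - centroid, axis=1)
        return cand[dist < dist.mean() + 2.0 * dist.std()]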
S203: determine the current attitude of the unmanned aerial vehicle in the laser radar coordinate system according to the point cloud image, and calculate the attitude correction parameter between the current attitude and the target attitude.
The attitude of the unmanned aerial vehicle represents its orientation in the laser radar coordinate system. Methods for estimating the current attitude of the unmanned aerial vehicle include model-based attitude estimation and learning-based attitude estimation.
A model-based attitude estimation method estimates the attitude from the geometric relationships or feature points of the unmanned aerial vehicle. The basic idea is to represent the structure and shape of the unmanned aerial vehicle with a geometric model, establish a correspondence between the model and the image by extracting object features, and then estimate the spatial attitude of the object geometrically or by other means.
A learning-based attitude estimation method uses machine learning to learn, from training samples collected in advance at different attitudes, the correspondence between two-dimensional observations and three-dimensional attitudes; the learned decision rule or regression function is then applied to a sample, and the result serves as an attitude classifier for estimating the sample's attitude. Learning-based methods generally adopt global observation features, need not detect or identify local features of the object, and are robust. The unmanned aerial vehicle nest extracts image features from the point cloud image and identifies the current attitude of the unmanned aerial vehicle with a pre-trained attitude classifier.
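As an illustrative stand-in for such a pre-trained attitude classifier (the application does not disclose its model, features or training data, so everything below is assumed), global shape features can be mapped to discretized attitude classes, for example with a k-nearest-neighbours classifier:

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def global_features(points: np.ndarray) -> np.ndarray:
        """Crude global descriptor of a point cloud: bounding-box extents plus
        the eigenvalues of the point covariance (orientation-sensitive)."""
        extents = points.max(axis=0) - points.min(axis=0)
        eigvals = np.linalg.eigvalsh(np.cov(points.T))
        return np.concatenate([extents, eigvals])

    def fit_attitude_classifier(train_clouds, train_labels):
        """train_clouds: list of (N_i, 3) arrays captured at known attitudes;
        train_labels: the discretized attitude class of each cloud."""
        X = np.stack([global_features(p) for p in train_clouds])
        return KNeighborsClassifier(n_neighbors=3).fit(X, train_labels)

    # At landing time: predicted = clf.predict([global_features(drone_points)])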
The target attitude is the standard attitude in which the unmanned aerial vehicle parks at the landing mark of the unmanned aerial vehicle nest. The attitude offset between the current attitude and the target attitude is determined, and the attitude correction parameters, comprising a translation amount and a rotation amount, are calculated from this offset so that the current attitude of the unmanned aerial vehicle can be brought into agreement with the target attitude. The attitude of the unmanned aerial vehicle may be represented as a six-degree-of-freedom vector.
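Under this six-degree-of-freedom representation, the correction-parameter computation can be sketched as follows; the vector layout [x, y, z, roll, pitch, yaw] and the "xyz" Euler convention are assumptions of the sketch, not the application's:

    import numpy as np
    from scipy.spatial.transform import Rotation

    def attitude_correction(current: np.ndarray, target: np.ndarray):
        """Both poses are [x, y, z, roll, pitch, yaw] in the radar frame."""
        translation = target[:3] - current[:3]            # translation amount
        r_cur = Rotation.from_euler("xyz", current[3:])
        r_tgt = Rotation.from_euler("xyz", target[3:])
        # Relative rotation taking the current orientation to the target.
        rotation = (r_tgt * r_cur.inv()).as_euler("xyz")  # rotation amount
        return translation, rotation

    # Example: drone hovering slightly off-centre and yawed 0.3 rad.
    current = np.array([0.4, -0.2, 3.0, 0.02, -0.01, 0.30])
    target = np.array([0.0, 0.0, 3.0, 0.0, 0.0, 0.0])
    print(attitude_correction(current, target))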
S204: send an attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle.
The attitude correction instruction instructs the unmanned aerial vehicle to carry out attitude correction according to the attitude correction parameters, controlling its current attitude to remain consistent with the target attitude in real time so that the unmanned aerial vehicle lands smoothly on the unmanned aerial vehicle nest. The nest can send the attitude correction instruction to the unmanned aerial vehicle over a WIFI link. The nest is provided with a plurality of antennas forming an antenna array; while the nest communicates with the unmanned aerial vehicle, it acquires in real time the angle of the unmanned aerial vehicle relative to the antenna array and performs beamforming by adjusting the weighting coefficient of each antenna, so that the WIFI signal is transmitted in the direction of the unmanned aerial vehicle, which improves the communication quality of the link.
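A sketch of this beam-steering step, assuming a uniform linear array with half-wavelength element spacing (the application does not specify the array geometry):

    import numpy as np

    def steering_weights(theta_rad: float, n_antennas: int = 4) -> np.ndarray:
        """Phase weights that point a half-wavelength-spaced linear array
        toward angle theta (measured from broadside)."""
        n = np.arange(n_antennas)
        # Progressive phase shift per element; normalized for unit power.
        return np.exp(-1j * np.pi * n * np.sin(theta_rad)) / np.sqrt(n_antennas)

    # Example: steer the WIFI beam 20 degrees off broadside toward the drone.
    weights = steering_weights(np.deg2rad(20.0))
    print(np.round(weights, 3))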
In this embodiment, during the landing of the unmanned aerial vehicle, when it enters the scanning range of the laser radar provided on the unmanned aerial vehicle nest, the laser radar scans the unmanned aerial vehicle to obtain a point cloud image, the current attitude of the unmanned aerial vehicle is estimated from the point cloud data, and the attitude correction parameter between the current attitude and the target attitude is computed; the nest then instructs the unmanned aerial vehicle to carry out attitude correction according to the attitude correction parameter. Attitude correction is thus achieved without providing an inertial measurement unit in the unmanned aerial vehicle, ensuring a smooth landing on the nest, freeing the space the inertial measurement unit would occupy, avoiding the power the unit would consume, lightening the unmanned aerial vehicle, and reducing its hardware cost.
The following are device embodiments of the present application, which can be used to perform the method embodiments of the present application. For details not disclosed in the device embodiments, refer to the method embodiments of the present application.
Please refer to fig. 3, which shows a schematic structural diagram of a landing positioning device for an unmanned aerial vehicle according to an exemplary embodiment of the present application. The landing positioning device can be implemented as all or part of the unmanned aerial vehicle nest in software, hardware, or a combination of the two. The device 3 comprises a detection unit 31, a scanning unit 32, a calculation unit 33 and a sending unit 34.
The detection unit 31 is configured to detect, during the landing of the unmanned aerial vehicle, whether the unmanned aerial vehicle has entered the scanning range of the laser radar;
the scanning unit 32 is configured to scan the unmanned aerial vehicle with the laser radar to obtain a point cloud image if the unmanned aerial vehicle is detected within the scanning range;
the calculation unit 33 is configured to determine the current attitude of the unmanned aerial vehicle in a laser radar coordinate system according to the point cloud image, and to calculate an attitude correction parameter between the current attitude and a target attitude;
the sending unit 34 is configured to send an attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle, wherein the attitude correction instruction is used for instructing the unmanned aerial vehicle to carry out attitude correction according to the attitude correction parameters.
In one or more embodiments, detecting whether the unmanned aerial vehicle enters the scanning range of the laser radar during the landing of the unmanned aerial vehicle includes:
acquiring a three-dimensional coordinate of the unmanned aerial vehicle in the laser radar coordinate system;
judging whether the three-dimensional coordinates are within a preset three-dimensional coordinate set;
if so, determining that the unmanned aerial vehicle enters the scanning range of the laser radar;
if not, determining that the unmanned aerial vehicle does not enter the scanning range of the laser radar.
In one or more embodiments, determining the current attitude of the unmanned aerial vehicle in the laser radar coordinate system according to the point cloud image includes:
performing feature extraction on the point cloud image to obtain image features;
and processing the image features based on an attitude classifier to obtain the current attitude of the unmanned aerial vehicle in the laser radar coordinate system.
In one or more embodiments, the scanning unit 32 is further configured to:
filter the point cloud data according to a deep learning network.
In one or more embodiments, the unmanned aerial vehicle nest is provided with a landing mark located at the origin of the laser radar coordinate system, and the attitude correction parameters include a translation amount and a rotation amount.
In one or more embodiments, sending the attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle includes:
and sending an attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle through a WIFI link.
In one or more embodiments, the unmanned aerial vehicle nest is provided with an antenna array comprising a plurality of antennas;
the device 3 further comprises an adjusting unit, configured to: acquire the position information of the unmanned aerial vehicle;
determine the angle of the unmanned aerial vehicle relative to the antenna array from the position information;
and adjust the weighting coefficient of each antenna in the antenna array according to the angle to generate a WIFI signal directed at the unmanned aerial vehicle.
It should be noted that when the landing positioning device provided in the above embodiment performs the landing positioning method for an unmanned aerial vehicle, the division into the functional modules described above is merely illustrative; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the landing positioning device and the landing positioning method embodiments provided above belong to the same concept; for details of the implementation process, refer to the method embodiments, which are not repeated here.
The serial numbers of the above embodiments of the present application are merely for description and do not represent the merits of the embodiments.
An embodiment of the present application further provides a computer storage medium that can store a plurality of instructions suitable for being loaded by a processor to execute the method steps of the embodiment shown in fig. 2; for the specific execution process, refer to the description of the embodiment shown in fig. 2, which is not repeated here.
Fig. 4 is a schematic structural diagram of an apparatus according to an embodiment of the present application. As shown in fig. 4, the apparatus may be the unmanned aerial vehicle nest of fig. 1, and the unmanned aerial vehicle nest 1000 may include: at least one processor 1001, at least one network interface 1004, a user interface 1003, a memory 1005 and at least one communication bus 1002.
The communication bus 1002 is used to realize connection and communication between these components.
The user interface 1003 may include a display screen (Display) and a camera (Camera); optionally, the user interface 1003 may further include a standard wired interface and a wireless interface.
The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface).
The processor 1001 may include one or more processing cores. Connecting the various parts of the unmanned aerial vehicle nest 1000 through various interfaces and lines, the processor 1001 executes the various functions of the nest 1000 and processes data by running or executing the instructions, programs, code sets or instruction sets stored in the memory 1005 and by invoking the data stored in the memory 1005. Optionally, the processor 1001 may be implemented in at least one of the hardware forms of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 1001 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like, where the CPU mainly handles the operating system, the user interface, application programs and the like; the GPU renders and draws the content to be displayed on the display screen; and the modem handles wireless communication. Alternatively, the modem may not be integrated into the processor 1001 and may instead be implemented by a separate chip.
The memory 1005 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 1005 includes a non-transitory computer-readable medium. The memory 1005 may be used to store instructions, programs, code, code sets or instruction sets. The memory 1005 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function or an image playing function), instructions for implementing the above method embodiments, and the like; the data storage area may store the data involved in the above method embodiments. Optionally, the memory 1005 may also be at least one storage device located remotely from the processor 1001. As shown in fig. 4, as a computer storage medium, the memory 1005 may include an operating system, a network communication module, a user interface module and an application program.
In the unmanned aerial vehicle nest 1000 shown in fig. 4, the user interface 1003 is mainly used to provide an input interface for the user and to obtain the data input by the user, while the processor 1001 may be configured to call the application program stored in the memory 1005 and specifically perform the following operations:
detecting, during the landing of the unmanned aerial vehicle, whether the unmanned aerial vehicle has entered the scanning range of the laser radar;
if so, scanning the unmanned aerial vehicle with the laser radar to obtain a point cloud image;
determining the current attitude of the unmanned aerial vehicle in a laser radar coordinate system according to the point cloud image, and calculating an attitude correction parameter between the current attitude and a target attitude; and
sending an attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle, wherein the attitude correction instruction is used for instructing the unmanned aerial vehicle to carry out attitude correction according to the attitude correction parameters.
In one or more embodiments, the processor 1001 performing the detection of whether the unmanned aerial vehicle has entered the scanning range of the laser radar during the landing of the unmanned aerial vehicle includes:
acquiring a three-dimensional coordinate of the unmanned aerial vehicle in the laser radar coordinate system;
judging whether the three-dimensional coordinates are within a preset three-dimensional coordinate set;
if so, determining that the unmanned aerial vehicle enters the scanning range of the laser radar;
if not, determining that the unmanned aerial vehicle does not enter the scanning range of the laser radar.
In one or more embodiments, the processor 1001 performing the determination of the current attitude of the unmanned aerial vehicle in the laser radar coordinate system according to the point cloud image includes:
performing feature extraction on the point cloud image to obtain image features;
and processing the image features based on an attitude classifier to obtain the current attitude of the unmanned aerial vehicle in the laser radar coordinate system.
In one or more embodiments, the processor 1001 performing the scanning of the unmanned aerial vehicle with the laser radar to obtain a point cloud image further includes:
and filtering the point cloud data according to a deep learning network.
In one or more embodiments, the unmanned aerial vehicle nest is provided with a landing mark located at the origin of the laser radar coordinate system, and the attitude correction parameters include a translation amount and a rotation amount.
In one or more embodiments, sending the attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle includes:
and sending an attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle through a WIFI link.
In one or more embodiments, the unmanned aerial vehicle nest is provided with an antenna array comprising a plurality of antennas;
the processor 1001 is further configured to perform:
acquiring the position information of the unmanned aerial vehicle;
determining the angle of the unmanned aerial vehicle relative to the antenna array from the position information;
and adjusting the weighting coefficient of each antenna in the antenna array according to the angle to generate a WIFI signal directed at the unmanned aerial vehicle.
The concept of this embodiment is the same as that of the method embodiment of fig. 2, and the technical effects brought by it are also the same; for the specific process, refer to the description of the embodiment of fig. 2, which is not repeated here.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed, can include the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory, a random access memory, or the like.
The above disclosure is only a preferred embodiment of the present application and is not intended to limit its scope; all equivalent variations and modifications made in accordance with the present application still fall within its scope.

Claims (10)

1. A landing positioning method for an unmanned aerial vehicle, characterized by comprising the following steps:
detecting, during the landing of the unmanned aerial vehicle, whether the unmanned aerial vehicle has entered the scanning range of a laser radar;
if so, scanning the unmanned aerial vehicle with the laser radar to obtain a point cloud image;
determining the current attitude of the unmanned aerial vehicle in a laser radar coordinate system according to the point cloud image, and calculating an attitude correction parameter between the current attitude and a target attitude; and
sending an attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle, wherein the attitude correction instruction is used for instructing the unmanned aerial vehicle to carry out attitude correction according to the attitude correction parameters.
2. The method of claim 1, wherein detecting whether the unmanned aerial vehicle enters the scanning range of the laser radar during the landing of the unmanned aerial vehicle comprises:
acquiring a three-dimensional coordinate of the unmanned aerial vehicle in the laser radar coordinate system;
judging whether the three-dimensional coordinates are within a preset three-dimensional coordinate set;
if so, determining that the unmanned aerial vehicle enters the scanning range of the laser radar;
if not, determining that the unmanned aerial vehicle does not enter the scanning range of the laser radar.
3. The method of claim 2, wherein determining the current attitude of the unmanned aerial vehicle in the laser radar coordinate system according to the point cloud image comprises:
performing feature extraction on the point cloud image to obtain image features;
and processing the image features based on an attitude classifier to obtain the current attitude of the unmanned aerial vehicle in the laser radar coordinate system.
4. The method of claim 1, wherein scanning the unmanned aerial vehicle with the laser radar to obtain a point cloud image further comprises:
and filtering the point cloud data according to a deep learning network.
5. The method of claim 1, wherein the unmanned aerial vehicle nest is provided with a landing mark located at an origin of the laser radar coordinate system, and the attitude correction parameters comprise: a translation amount and a rotation amount.
6. The method of claim 1, wherein sending the attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle comprises:
and sending an attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle through a WIFI link.
7. The method of claim 6, wherein the unmanned aerial vehicle nest is provided with an antenna array comprising a plurality of antennas;
the method further comprises the following steps:
acquiring the position information of the unmanned aerial vehicle;
determining the angle of the unmanned aerial vehicle relative to the antenna array from the position information;
and adjusting the weighting coefficient of each antenna in the antenna array according to the angle to generate a WIFI signal pointing to the unmanned aerial vehicle.
8. A landing positioning device for an unmanned aerial vehicle, characterized by comprising:
a detection unit, configured to detect, during the landing of the unmanned aerial vehicle, whether the unmanned aerial vehicle has entered the scanning range of the laser radar;
a scanning unit, configured to scan the unmanned aerial vehicle with the laser radar to obtain a point cloud image if the unmanned aerial vehicle is detected within the scanning range;
a calculation unit, configured to determine the current attitude of the unmanned aerial vehicle in a laser radar coordinate system according to the point cloud image and to calculate an attitude correction parameter between the current attitude and a target attitude; and
a sending unit, configured to send an attitude correction instruction carrying the attitude correction parameters to the unmanned aerial vehicle, wherein the attitude correction instruction is used for instructing the unmanned aerial vehicle to carry out attitude correction according to the attitude correction parameters.
9. A computer storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor and to carry out the method steps according to any one of claims 1 to 7.
10. An unmanned aerial vehicle nest, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps of any of claims 1 to 7.
CN202110185350.XA 2021-02-10 2021-02-10 Landing positioning method and device for unmanned aerial vehicle, storage medium and unmanned aerial vehicle nest Pending CN112884900A (en)

Priority Applications (1)

Application Number: CN202110185350.XA
Priority Date: 2021-02-10
Filing Date: 2021-02-10
Title: Landing positioning method and device for unmanned aerial vehicle, storage medium and unmanned aerial vehicle nest

Applications Claiming Priority (1)

Application Number: CN202110185350.XA
Priority Date: 2021-02-10
Filing Date: 2021-02-10
Title: Landing positioning method and device for unmanned aerial vehicle, storage medium and unmanned aerial vehicle nest

Publications (1)

Publication Number: CN112884900A
Publication Date: 2021-06-01

Family

ID=76056478

Family Applications (1)

Application Number: CN202110185350.XA (Pending)
Priority Date: 2021-02-10
Filing Date: 2021-02-10
Title: Landing positioning method and device for unmanned aerial vehicle, storage medium and unmanned aerial vehicle nest

Country Status (1)

Country Link
CN (1) CN112884900A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150057844A1 (en) * 2012-03-30 2015-02-26 Parrot Method for controlling a multi-rotor rotary-wing drone, with cross wind and accelerometer bias estimation and compensation
CN107226206A (en) * 2016-03-24 2017-10-03 深圳市创翼睿翔天空科技有限公司 multi-rotor unmanned aerial vehicle safe landing system and method
CN108932736A (en) * 2018-05-30 2018-12-04 南昌大学 Two-dimensional laser radar Processing Method of Point-clouds and dynamic robot pose calibration method
CN111273128A (en) * 2020-02-28 2020-06-12 广东工业大学 Pipeline robot for detecting underground cable fault
CN111522022A (en) * 2020-04-20 2020-08-11 西安电子科技大学 Dynamic target detection method of robot based on laser radar

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113608542A (en) * 2021-08-12 2021-11-05 山东信通电子股份有限公司 Control method and equipment for automatic landing of unmanned aerial vehicle
CN113608542B (en) * 2021-08-12 2024-04-12 山东信通电子股份有限公司 Control method and equipment for automatic landing of unmanned aerial vehicle
CN114089783A (en) * 2021-11-17 2022-02-25 湖南精飞智能科技有限公司 Unmanned aerial vehicle intelligent terminal based on 5g big dipper technique
CN114089783B (en) * 2021-11-17 2022-06-07 湖南精飞智能科技有限公司 Unmanned aerial vehicle intelligent terminal based on 5g big dipper technique
CN114355975A (en) * 2021-12-30 2022-04-15 达闼机器人有限公司 Method, system, processing device and medium for homing of flight device
CN114355975B (en) * 2021-12-30 2024-03-05 达闼机器人股份有限公司 Method, system, processing equipment and medium for returning flying equipment to nest

Similar Documents

Publication Publication Date Title
US10884433B2 (en) Aerial drone utilizing pose estimation
CN112884900A (en) Landing positioning method and device for unmanned aerial vehicle, storage medium and unmanned aerial vehicle nest
CN113485453B (en) Method and device for generating inspection flight path of marine unmanned aerial vehicle and unmanned aerial vehicle
CN108419446B (en) System and method for laser depth map sampling
US11829141B2 (en) Determining a three-dimensional model of a scan target
CN113031631A (en) Unmanned aerial vehicle landing method and device, storage medium and unmanned aerial vehicle nest
CN109683699B (en) Method and device for realizing augmented reality based on deep learning and mobile terminal
JP2021534481A (en) Obstacle or ground recognition and flight control methods, devices, equipment and storage media
CN113359782B (en) Unmanned aerial vehicle autonomous addressing landing method integrating LIDAR point cloud and image data
CN111123964B (en) Unmanned aerial vehicle landing method and device and computer readable medium
CN112987764B (en) Landing method, landing device, unmanned aerial vehicle and computer-readable storage medium
US11069080B1 (en) Collaborative airborne object tracking systems and methods
CN111126209B (en) Lane line detection method and related equipment
CN113189989B (en) Vehicle intention prediction method, device, equipment and storage medium
CN113887400B (en) Obstacle detection method, model training method and device and automatic driving vehicle
US20200380727A1 (en) Control method and device for mobile device, and storage device
CN112995890A (en) Unmanned aerial vehicle positioning method and device, storage medium and unmanned aerial vehicle nest
US20210181769A1 (en) Movable platform control method, movable platform, terminal device, and system
CN113671523A (en) Robot positioning method, device, storage medium and robot
CN112037255A (en) Target tracking method and device
CN115610694A (en) Unmanned aerial vehicle accurate landing method and system based on target detection
CN114721419A (en) Tower inspection method, device and equipment for power transmission line based on unmanned aerial vehicle
CN113064441A (en) Unmanned aerial vehicle parking method and device, storage medium and unmanned aerial vehicle nest
CN113168532A (en) Target detection method and device, unmanned aerial vehicle and computer readable storage medium
CN116148883B (en) SLAM method, device, terminal equipment and medium based on sparse depth image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination