WO2021016875A1 - Aircraft landing method, unmanned aerial vehicle, and computer-readable storage medium - Google Patents
Aircraft landing method, unmanned aerial vehicle, and computer-readable storage medium
- Publication number
- WO2021016875A1 (application PCT/CN2019/098411; CN2019098411W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- aerial vehicle
- unmanned aerial
- palm
- distance data
- uav
- Prior art date
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Definitions
- This application relates to the technical field of aircraft control, and in particular to an aircraft landing method, an unmanned aircraft, and a computer-readable storage medium.
- For an unmanned aerial vehicle, especially a small unmanned aerial vehicle, the user is usually required to operate a control terminal to control the unmanned aerial vehicle to land on flat ground at a distance from the user.
- Small unmanned aerial vehicles are usually small in size, with relatively short landing legs and a low chassis.
- When the control terminal is used to control the unmanned aerial vehicle for landing, it is difficult to achieve precise control, the requirements on the landing area are relatively high, and the unmanned aerial vehicle needs to land on relatively flat ground.
- Otherwise, the unmanned aerial vehicle is likely to tip over, or the sensors on its bottom may be scratched during landing, which reduces the safety of the landing.
- Moreover, when the control terminal is used to control the landing of an unmanned aerial vehicle, the interaction is complicated and requires the user to have relatively rich operating experience, which is not friendly to ordinary users. Therefore, how to provide a safe and convenient landing method is an urgent problem to be solved.
- the present application provides an aircraft landing method, an unmanned aircraft, and a computer-readable storage medium, aiming to improve the safety and convenience of the unmanned aircraft landing.
- this application provides a method for landing an aircraft, including:
- When the unmanned aerial vehicle is in a hovering state, acquiring distance data between the unmanned aerial vehicle and a reference target below the unmanned aerial vehicle, and acquiring an image below the unmanned aerial vehicle; and when it is determined that a change in the distance data satisfies a preset distance reduction condition and a palm appears below the unmanned aerial vehicle, controlling the unmanned aerial vehicle to land on the palm.
- this application provides an unmanned aerial vehicle, including a memory and a processor
- the memory is used to store computer programs
- the processor is configured to execute the computer program, and when executing the computer program, implement the following steps:
- When the unmanned aerial vehicle is in a hovering state, acquiring distance data between the unmanned aerial vehicle and a reference target below the unmanned aerial vehicle, and acquiring an image below the unmanned aerial vehicle; and when it is determined that a change in the distance data satisfies a preset distance reduction condition and a palm appears below the unmanned aerial vehicle, controlling the unmanned aerial vehicle to land on the palm.
- the present application provides a computer-readable storage medium that stores a computer program, and when the computer program is executed by a processor, the processor implements the aforementioned aircraft landing method.
- the embodiments of the present application provide an aircraft landing method, an unmanned aerial vehicle, and a computer-readable storage medium.
- When the unmanned aerial vehicle is in a hovering state, the method obtains distance data between the unmanned aerial vehicle and the reference target below the unmanned aerial vehicle and obtains the image below the UAV; when it is determined that the change in the distance data satisfies the preset distance reduction condition and a palm appears under the UAV, the UAV is controlled to land on the palm.
- With this landing method, the unmanned aerial vehicle can land directly on the palm of the user, effectively reducing the landing-condition requirements of prior-art unmanned aerial vehicles and improving landing safety. At the same time, there is no need to use a control terminal such as a remote controller or a mobile phone to control the landing; the landing operation is simple, which improves convenience, enables ordinary users to get started quickly, and improves the user experience.
- Fig. 1 is a schematic block diagram of an unmanned aerial system provided by an embodiment of the present application
- FIG. 2 is a schematic flowchart of a method for landing an aircraft according to an embodiment of the present application
- FIG. 3 is a schematic diagram when a palm appears under the unmanned aerial vehicle according to an embodiment of the present application
- FIG. 4 is a schematic diagram of an unmanned aerial vehicle provided by an embodiment of the present application when no palm appears underneath;
- FIG. 5 is a schematic flowchart of a method for landing an aircraft according to an embodiment of the present application
- Fig. 6 is a schematic block diagram of an unmanned aerial vehicle provided by an embodiment of the present application.
- the embodiment of the present application provides an aircraft landing method that can be applied to an unmanned aircraft to control the landing process of the unmanned aircraft.
- the unmanned aerial vehicle may be a rotary-wing type unmanned aerial vehicle, such as a single-rotor aircraft, a double-rotor aircraft, a three-rotor aircraft, a four-rotor aircraft, a six-rotor aircraft, an eight-rotor aircraft, a ten-rotor aircraft, and a twelve-rotor aircraft.
- the unmanned aerial vehicle may also be other types of unmanned aerial vehicles or movable devices, and the embodiments of the present application are not limited thereto.
- FIG. 1 is a schematic block diagram of an unmanned aerial system according to an embodiment of the present application.
- a rotor-type unmanned aerial vehicle is taken as an example.
- the unmanned aerial system 100 may include an unmanned aerial vehicle 110 and a control terminal 120.
- The UAV 110 includes a control device 111, a distance measuring device 112, and a gimbal 113.
- the control device 111 is used to control the flight of the unmanned aerial vehicle. It is understandable that the control device can control the unmanned aerial vehicle according to pre-programmed program instructions, and can also control the unmanned aerial vehicle by responding to one or more control instructions of the control terminal 120. Wherein, the control device 111 may include a flight controller.
- The distance measuring device 112 is mounted under the unmanned aerial vehicle to measure the distance between the unmanned aerial vehicle and a reference target under the unmanned aerial vehicle. It is understood that the reference target can be any object below the UAV and within the sensing range of the distance measuring device 112. When the user places a hand under the UAV, the user's palm enters the sensing range of the distance measuring device; at this time, the reference target is the user's palm.
- The control device 111 can obtain the sensor data output by the distance measuring device 112 and determine the distance data to the reference target under the UAV according to the sensor data.
- the distance measuring device includes at least one of the following: Time of Flight (TOF) ranging detection equipment, radar, ultrasonic detection equipment, laser detection equipment, and vision detection equipment.
- The gimbal 113 may include an electronic speed controller (referred to as an ESC) 1131 and a motor 1132.
- The gimbal is used to carry the camera 1133.
- The control device can control the movement of the gimbal through the ESC and the motor.
- The gimbal may also include a controller, through which the ESC and the motor are controlled to control the movement of the gimbal.
- The gimbal can be independent of the unmanned aerial vehicle, or a part of the unmanned aerial vehicle.
- the motor can be a DC motor or an AC motor.
- the motor may be a brushless motor or a brushed motor, and the embodiment of the present application is not limited thereto.
- the shooting device 1133 communicates with the control device, and shoots a video under the control of the control device.
- the captured video contains multiple frames of real-time images, and the user's palm position is determined by analyzing the real-time images.
- The photographing device 1133 can be any device that can obtain image information, such as an RGB camera, a monocular camera, a binocular camera, or a Time of Flight (TOF) camera.
- The image information can be an RGB image, a grayscale image, a depth image, etc.
- the photographing device 1133 may also be arranged in other suitable positions of the unmanned aerial vehicle, such as the nose of the unmanned aerial vehicle, and the embodiment of the present application is not limited thereto.
- The control terminal 120 is located at the ground end of the unmanned aerial system, and can communicate with the unmanned aerial vehicle wirelessly for remote control of the unmanned aerial vehicle.
- The control terminal may include at least one of a remote controller and a user terminal installed with an application program for controlling the unmanned aerial vehicle.
- the user terminal can be a control terminal such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a personal digital assistant, and a wearable device.
- Receiving the flight control instructions input by the user through the control terminal can refer to controlling the unmanned aerial vehicle through input devices such as the joystick, dial, keys, and buttons on the remote controller, or through the user interface (UI) on the user terminal.
- FIG. 2 is a schematic flowchart of a method for landing an aircraft according to an embodiment of the present application.
- the landing method of the aircraft may be executed by an unmanned aerial vehicle, specifically, may be executed by a control device of the unmanned aerial vehicle.
- the control device of the unmanned aerial vehicle may be the control device 111 shown in FIG. 1.
- Hereinafter, the method is schematically described by taking its execution by the control device of the unmanned aerial vehicle as an example.
- the landing method of the aircraft in this embodiment includes step S210 to step S230.
- In S210, the control device detects the flight status of the unmanned aerial vehicle; when it determines that the unmanned aerial vehicle is in the hovering state, it measures the distance data between the unmanned aerial vehicle and the reference target below the unmanned aerial vehicle by controlling the distance measuring device mounted under the unmanned aerial vehicle, and obtains the image below the unmanned aircraft by controlling the camera on the unmanned aircraft to capture images.
- The flight status of the unmanned aerial vehicle may include: the motion state, the hovering state, and the state of staying on the ground.
- The motion state further includes the climbing state, the descending state, and the horizontal flight state.
- S220 Determine whether the change in the distance data satisfies a preset distance reduction condition, and determine whether a palm appears under the UAV according to the image.
- Specifically, the control device determines whether the change in the distance data satisfies a preset distance reduction condition, and determines, based on the image, whether a palm appears under the UAV.
- It should be noted that the step of determining whether the change in the distance data meets the preset distance reduction condition may be performed before or after the step of determining whether a palm appears under the UAV according to the image, and the two steps can also be performed at the same time. The embodiment of the present application is not limited to this.
- the preset distance reduction conditions can be set according to actual needs.
- the distance data includes distance data at the current moment and distance data at multiple historical moments.
- the determining whether the change of the distance data satisfies a preset distance reduction condition, that is, step S220 specifically includes sub-steps S221 to S223.
- In S221, a plurality of distance data within a preset time are acquired, the distance data being the distance data between the unmanned aerial vehicle and the reference target under the unmanned aerial vehicle measured by the distance measuring device.
- the preset time can be set according to actual needs.
- the distance data in the preset time includes the distance data at the current time and the distance data at multiple historical time points.
- Exemplarily, the distance data at the multiple historical moments are arranged in ascending or descending order to obtain a data sequence, and the distance data located in the middle of the data sequence is taken as the median of the distance data at the multiple historical moments.
- For example, if the distance data of the multiple historical moments include the five values 5.0, 5.2, 5.4, 5.6, and 5.8, the median of the distance data at the multiple historical moments is 5.4.
- In other embodiments, the distance data of the historical moments may also be arithmetically averaged, that is, the sum of the distance data of all historical moments is divided by the number of historical distance data, so as to obtain the average of the distance data at the multiple historical moments.
- S222 Determine the difference between the distance data at the current moment and the median value or average value.
- Exemplarily, the distance data at the current moment is subtracted from the median value, or the distance data at the current moment is subtracted from the average value, thereby determining the difference between the distance data at the current moment and the median or average value.
- the determining the difference between the distance data at the current moment and the median value or average value specifically includes: determining the distance data at the current moment and the median value based on a distance reduction formula Or the difference between the averages.
- The distance reduction formula can be written as:

  $$\Delta H(t) = \operatorname{median}\big(H_{ToF}(t-m), \ldots, H_{ToF}(t-1)\big) - H_{ToF}(t)$$

  where the median may equivalently be replaced by the arithmetic average of the same m historical values.
- $H_{ToF}(t)$ is the distance data between the UAV and the reference target under the UAV at time t measured by the time-of-flight ranging detection device, that is, the distance data at the current moment.
- $H_{ToF}(i)$ is the distance data between the UAV and the reference target under the UAV at time i measured by the time-of-flight ranging detection device, that is, the distance data at a historical moment.
- m is a positive integer and m < t. Exemplarily, m = 5; m can also be any other suitable value, and the embodiment of the present application is not limited thereto.
- After determining the difference between the distance data at the current moment and the median or average value, it is determined whether the difference satisfies the preset distance reduction condition: when the difference is greater than or equal to a preset distance threshold, it is determined that the change in the distance data satisfies the distance reduction condition; when the difference is less than the preset distance threshold, it is determined that the change in the distance data does not satisfy the distance reduction condition.
- the preset distance threshold can be designed according to actual applications, and the embodiment of the present application is not limited to this.
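- As a concrete illustration, a minimal sketch of this check in Python (the function name, the list-based history, and the 0.10 threshold are illustrative assumptions, not values from the patent):

```python
from statistics import median

def distance_reduced(history, current, m=5, threshold=0.10):
    """Return True when the current distance reading has dropped by at
    least `threshold` relative to the median of the last m readings.

    history -- distance data H_ToF(i) at historical moments
    current -- H_ToF(t), the distance data at the current moment
    m       -- number of historical readings used (patent example: m = 5)
    """
    if len(history) < m:
        return False                     # not enough data in the preset time
    reference = median(history[-m:])     # median of the m most recent readings
    return (reference - current) >= threshold
```

- For example, with historical values 5.0, 5.2, 5.4, 5.6, and 5.8, the reference median is 5.4; a current reading of 5.0 yields a difference of 0.4, which satisfies the illustrative 0.10 threshold.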
- As shown in FIG. 3, when the user extends a palm under the UAV, the distance data measured by the distance measuring device is the height H2 of the UAV 110 above the user's palm 20, and H3 is the height of the user's palm 20 above the ground 30.
- As shown in FIG. 4, when no palm appears under the UAV, that is, when the user retracts the palm from under the UAV, the distance data detected by the distance measuring device is the height H1 of the UAV 110 above the ground.
- When the palm is extended under the UAV again, the distance measuring device detects that the distance data has changed, and the magnitude of the change is H3.
- If the magnitude of the change is greater than the preset distance threshold, it is determined that the change of the distance data meets the distance reduction condition.
- the control device controls the UAV to land on the palm of the user.
- When these conditions are met, the unmanned aerial vehicle is controlled to land on the palm, thereby effectively reducing the landing-condition requirements of prior-art unmanned aerial vehicles and improving landing safety. At the same time, there is no need to use a control terminal such as a remote controller or a mobile phone to control the landing; the landing operation is simple and easy to perform, which improves convenience, enables ordinary users to get started quickly, and improves the user experience.
- the landing method of the aircraft further includes:
- S201 Determine the position of the palm in the image, and determine the position of the palm in space according to the position of the palm in the image.
- controlling the unmanned aerial vehicle to land on the palm specifically includes: controlling the unmanned aerial vehicle to land on the palm according to the position of the palm in space.
- the unmanned aerial vehicle adjusts the landing direction according to the position of the palm in the space, so that the unmanned aerial vehicle can accurately land on the palm of the user, ensuring the safety of landing and making it convenient for the user to grasp.
- In some embodiments, before controlling the UAV to land on the palm, the position of the palm in the image is first determined, and the position of the palm in space is determined according to the position of the palm in the image.
- the position of the palm in the image may be represented by image coordinates of the palm in the image coordinate system.
- the upper left corner of the image can be used as the origin (0, 0), and the length and width of the image are both represented by the number of pixels, based on which the pixel coordinates of the palm in the image can be determined.
- Exemplarily, after acquiring the image taken by the camera, the palm can be recognized through a neural network that has been trained to recognize palms. Specifically, the neural network can return the position of the palm in the image; in some cases, the neural network returns the coordinates of the upper-left and lower-right corners of the image area corresponding to the palm in the image.
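- For illustration, such a bounding box can be reduced to a single pixel position for the palm; a minimal sketch (names are illustrative, not from the patent):

```python
def palm_pixel_center(top_left, bottom_right):
    """Centre of the palm's image region, computed from the upper-left
    and lower-right corner coordinates returned by the palm detector."""
    (x1, y1), (x2, y2) = top_left, bottom_right
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
```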
- the determining the position of the palm in space according to the position of the palm in the image specifically includes:
- S2011: Obtain a preset camera internal parameter matrix, rotation matrix, and displacement matrix, where the camera internal parameter matrix includes the internal parameter values of the camera, the rotation matrix includes the rotation parameter values of the image coordinate system relative to the origin of the spatial coordinate system, and the displacement matrix includes the displacement parameter values of the image coordinate system relative to the origin of the spatial coordinate system;
- S2012: According to the camera internal parameter matrix, the rotation matrix, and the displacement matrix, transform the image coordinates of the palm in the image coordinate system into the spatial coordinate system to obtain the space coordinates of the palm in the spatial coordinate system, so as to determine the position of the palm in space.
- Exemplarily, step S2012 specifically includes: based on the coordinate conversion formula, converting the image coordinates of the palm in the image coordinate system into the spatial coordinate system according to the camera internal parameter matrix, the rotation matrix, and the displacement matrix, to obtain the space coordinates of the palm in the spatial coordinate system.
- The coordinate conversion formula can be written as:

  $$h \, [u, v, 1]^{T} = K \left( R \, [x_w, y_w, z_w]^{T} + T \right)$$

- where $[u, v, 1]^{T}$ is the image coordinate of the palm in the image coordinate system; $[x_w, y_w, z_w]^{T}$ is the space coordinate of the palm in the spatial coordinate system; h is the distance between the unmanned aircraft and the reference target below the UAV; K is the camera internal parameter matrix; R is the rotation matrix of the image coordinate system relative to the origin of the spatial coordinate system; and T is the displacement matrix of the image coordinate system relative to the origin of the spatial coordinate system.
- In the camera internal parameter matrix

  $$K = \begin{bmatrix} \alpha_x & 0 & c_x \\ 0 & \alpha_y & c_y \\ 0 & 0 & 1 \end{bmatrix}, \qquad \alpha_x = f \, m_x, \quad \alpha_y = f \, m_y,$$

- f is the focal length, $m_x$ is the number of pixels per unit distance in the x direction (scale factor), $m_y$ is the number of pixels per unit distance in the y direction, and $c_x$ and $c_y$ are the coordinates of the principal point (optical center).
- the space coordinate system may be a world coordinate system, of course, it may also be any other suitable coordinate system, and the embodiment of the present application is not limited thereto.
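- As a concrete illustration, the conversion can be inverted numerically; the following is a minimal sketch assuming the formula above, with NumPy and all names being illustrative rather than taken from the patent:

```python
import numpy as np

def palm_world_position(u, v, h, K, R, T):
    """Back-project the palm's image coordinates into the spatial
    coordinate system, following h * [u, v, 1]^T = K (R X + T).

    u, v -- image coordinates of the palm (pixels)
    h    -- measured distance between the UAV and the reference target
    K    -- 3x3 camera internal parameter matrix
    R    -- 3x3 rotation matrix (image frame relative to spatial origin)
    T    -- displacement vector, shape (3,) or (3, 1)
    """
    uv1 = np.array([u, v, 1.0])
    cam = h * np.linalg.solve(K, uv1)              # K^{-1} * h * [u, v, 1]^T
    world = np.linalg.solve(R, cam - np.ravel(T))  # R^{-1} (cam - T)
    return world                                   # [x_w, y_w, z_w]
```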
- the landing method of the aircraft further includes: determining the area of the palm in the image.
- Exemplarily, the control device can input the image into a pre-trained neural network model, which outputs the positions in the image of the pixels corresponding to the palm; based on these positions, the number of pixels corresponding to the palm can be determined, and this number can represent the area of the palm in the image.
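- A minimal sketch of this area measure, assuming the model's output is available as a per-pixel palm mask (an assumed output format; the patent only states that the model returns the positions of the palm pixels):

```python
import numpy as np

def palm_area_pixels(palm_mask):
    """Count the pixels labelled as palm; the count represents the
    area of the palm in the image."""
    return int(np.count_nonzero(palm_mask))
```

- The count can then be compared against the preset area threshold described below.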
- Accordingly, controlling the UAV to land on the palm includes: when it is determined that the change in the distance data satisfies the preset distance reduction condition, a palm appears under the UAV, and the area of the palm in the image is greater than or equal to a preset area threshold, controlling the UAV to land on the palm.
- If any one of these three conditions is not met, the UAV is controlled to continue hovering, or the flight state of the UAV is detected to determine whether the UAV is in the hovering state.
- This can ensure that the palm offers a sufficient area for the unmanned aerial vehicle to land, so that the unmanned aerial vehicle can land on the palm safely and steadily.
- the preset area threshold can be designed according to actual applications, and the embodiment of the present application is not limited to this.
- the method further includes: determining the preset area threshold according to distance data between the unmanned aerial vehicle and a reference target under the unmanned aerial vehicle.
- the preset area threshold of the palm is determined based on the distance data between the palm and the UAV, that is, the preset area threshold is different if the distance data is different. It is understandable that when a palm appears under the UAV, the reference target is the palm. At this time, the smaller the distance between the palm and the UAV, the greater the preset area threshold should be, and the greater the distance between the palm and the UAV, the smaller the preset area threshold should be. Therefore, the preset area threshold can be determined according to the distance data between the UAV and the reference target under the UAV.
- the calculation formula of the preset area threshold S is:
- f is the focal length of the camera
- h is the distance data between the UAV and the reference target below the UAV
- $w_{hand}$ is a preset statistical average width of the human palm (for example, 8 cm, 9 cm, or 10 cm)
- $c_x$ and $c_y$ are the coordinates of the optical center of the camera lens.
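- The formula itself is not legible in this text. A plausible form, assuming a pinhole projection in which a palm of physical width $w_{hand}$ at distance h spans roughly $f \cdot w_{hand} / h$ pixels, is the following sketch rather than the patent's verbatim formula:

```latex
% Assumed reconstruction: the palm's projected width is about
% f * w_hand / h pixels, so the pixel-area threshold scales as its square.
S \approx \left( \frac{f \, w_{\mathrm{hand}}}{h} \right)^{2}
```

- This form matches the stated behaviour: the smaller the distance h, the larger the preset area threshold S, and vice versa.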
- In the above embodiment, when the unmanned aerial vehicle is in a hovering state, the distance data between the unmanned aerial vehicle and a reference target below it is acquired, and the image below the unmanned aerial vehicle is acquired; when it is determined that the change in the distance data meets the preset distance reduction condition and a palm appears under the UAV, the UAV is controlled to land on the palm. This can effectively reduce the landing-condition requirements of prior-art unmanned aerial vehicles and improve landing safety, without soiling the UAV. At the same time, there is no need to use a control terminal such as a remote controller or a mobile phone to control the landing; the landing operation is simple and easy to perform, which improves convenience, enables ordinary users to get started quickly, and improves the user experience.
- FIG. 5 is a schematic flowchart of a method for landing an aircraft according to an embodiment of the present application.
- the landing method of an aircraft in this embodiment includes step S310 to step S340.
- In S310, when the unmanned aerial vehicle is preparing to land, the flight status of the unmanned aerial vehicle is detected; the distance data between the unmanned aerial vehicle and the reference target below it is measured by controlling the distance measuring device mounted under the unmanned aerial vehicle, and the image below the UAV is acquired by controlling the camera on the UAV.
- several detection trigger conditions can be set to trigger the detection of the flight status of the UAV.
- Exemplarily, when the palm landing instruction sent by the control terminal is received, the flight status of the unmanned aerial vehicle is detected.
- the user can send an instruction to enter the landing mode to the UAV through the control terminal.
- Specifically, the sensing system of the UAV collects data information, and the flight control system of the UAV can detect the flight status of the UAV based on the collected data information.
- the sensing system is used to measure the attitude information of the unmanned aerial vehicle, that is, the position information and state information of the unmanned aerial vehicle in space, such as three-dimensional position, three-dimensional angle, three-dimensional speed, three-dimensional acceleration and other information.
- the sensing system may include, for example, at least one of a gyroscope, an electronic compass, an IMU (Inertial Measurement Unit), a global navigation satellite system, and a vision sensor.
- the control terminal may be, for example, the control terminal in FIG. 1, of course, it may also be different from the control terminal, and the embodiment of the present application is not limited thereto.
- In other embodiments, when a user is detected in the image information acquired by the image sensor, the flight state of the unmanned aerial vehicle is detected.
- image sensors can be mounted on unmanned aerial vehicles.
- the image sensor can be installed on the nose of the unmanned aerial vehicle, or on the gimbal of the unmanned aerial vehicle. In this embodiment, the image sensor is located on the nose of the unmanned aerial vehicle for schematic description.
- the image sensor can collect image information in the flight space in real time.
- When the user appears within the shooting range, the image sensor can obtain image information including the user, and the control device of the unmanned aerial vehicle can obtain the image data containing the user information.
- The detected user is determined based on the image data; for example, it is determined that the user is located in front of the unmanned aerial vehicle. In this case, the flight status of the UAV can be detected.
- The image sensor can be any device that can obtain image information, such as an RGB camera, a monocular camera, a binocular camera, or a Time of Flight (TOF) camera, and the image information can be an RGB image, a grayscale image, a depth image, etc.
- the image sensor is the same as or different from the aforementioned camera.
- the detecting the flight status of the unmanned aerial vehicle includes: obtaining the flight speed of the unmanned aerial vehicle; and determining the flight status of the unmanned aerial vehicle according to the flight speed.
- If the flight speed is less than a preset speed threshold, it is determined that the flight state of the UAV is the hovering state; if the flight speed is greater than or equal to the speed threshold, it is determined that the flight state of the UAV is a non-hovering state. The non-hovering state includes the moving state and the state of staying on the ground. It is understandable that the preset speed threshold can be designed according to actual application scenarios, and the embodiment of the present application is not limited to this.
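- A minimal sketch of this classification in Python (the 0.2 m/s threshold and all names are illustrative assumptions, not values from the patent):

```python
def flight_status(flight_speed, on_ground, speed_threshold=0.2):
    """Classify the UAV flight status from its flight speed.

    flight_speed    -- magnitude of the UAV velocity, in m/s
    on_ground       -- True if the UAV is resting on the ground
    speed_threshold -- preset speed threshold (illustrative value)
    """
    if on_ground:
        return "on_ground"   # staying-on-the-ground state
    if flight_speed < speed_threshold:
        return "hovering"    # hovering state
    return "moving"          # non-hovering motion state
```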
- S320 Determine whether there is a water surface under the UAV according to the image.
- the determining whether there is a water surface under the UAV according to the image specifically includes: inputting the image into a first neural network model to determine whether there is a water surface under the UAV.
- the image is input into the pre-trained first neural network model to determine whether there is a water surface below the unmanned aerial vehicle.
- the first neural network model may return the result that there is a water surface under the UAV or the result that there is no water surface under the UAV.
- the first neural network model is a model obtained by performing model training on a large number of sample images containing the water surface based on the initial neural network.
- the initial neural network may be Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), Long/Short Term Memory (LSTM) networks, etc.
- the preset distance reduction conditions can be set according to actual needs.
- the distance data includes distance data at the current moment and distance data at multiple historical moments.
- In some embodiments, the determining whether the change in the distance data satisfies a preset distance reduction condition includes: acquiring the median or average value of the distance data at the multiple historical moments; determining the difference between the distance data at the current moment and the median or average value; and when the difference is greater than or equal to a preset distance threshold, determining that the preset distance reduction condition is satisfied.
- In some embodiments, the determining whether a palm appears under the UAV according to the image includes: when it is determined that the change in the distance data satisfies the preset distance reduction condition, inputting the image into the second neural network model to determine whether a palm appears under the UAV.
- Exemplarily, the image is input into the pre-trained second neural network model to determine whether there is a palm under the unmanned aerial vehicle.
- the second neural network model may return the result that a palm appears under the UAV or the result that no palm appears under the UAV.
- the second neural network model is a model obtained by performing model training on a large number of sample images containing palms based on a preset neural network.
- the initial neural network may be Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), Long/Short Term Memory (LSTM) network, etc.
- The embodiments of the present application are not limited to this.
- The second neural network model can be the same as, or different from, the neural network model used above to determine the area of the palm.
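- For illustration, both detectors can sit behind the same thin interface; a `predict` method returning a presence probability is an assumed interface, not an API defined by the patent:

```python
import numpy as np

def water_surface_below(water_model, image):
    """Query the first neural network model: is there a water surface
    below the UAV in this image?"""
    return float(water_model.predict(image[np.newaxis, ...])[0]) > 0.5

def palm_below(palm_model, image):
    """Query the second neural network model: does a palm appear
    below the UAV in this image?"""
    return float(palm_model.predict(image[np.newaxis, ...])[0]) > 0.5
```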
- If it is determined that no palm appears under the UAV, the unmanned aerial vehicle is controlled to continue hovering, or the flight state of the unmanned aerial vehicle is detected to determine whether the unmanned aerial vehicle is in the hovering state.
- It should be noted that the step of determining whether a palm appears under the UAV according to the image may be performed before or after the step of determining whether the change in the distance data meets the preset distance reduction condition, or the two steps may be executed simultaneously; the embodiment of the present application is not limited to this.
- In S340, the unmanned aerial vehicle is controlled to land on the palm.
- When any one of the three conditions (the change in the distance data satisfies the preset distance reduction condition, a palm appears under the UAV, and there is no water surface under the UAV) is not met, the unmanned aircraft is controlled to continue hovering, or the flight status of the unmanned aircraft is detected to determine whether the unmanned aircraft is in the hovering state.
- In some embodiments, when it is determined that there is a water surface below the UAV, the UAV is controlled to continue hovering, that is, the UAV is not controlled to land, and a prompt message is generated to prompt the user that the current area is not suitable for landing.
- the control device sends the prompt information to the control terminal to prompt the user that the current area is not suitable for landing.
- the user sends a flight control instruction to the control device through the control terminal, so that the UAV executes the flight task corresponding to the flight control instruction, for example, activates the flight mode to fly away from the water surface area.
- In some embodiments, in the process of controlling the UAV to land on the palm, the position of the palm in the image is determined in real time, and the UAV is controlled to follow the palm according to the position of the palm in the image, so as to land on the palm.
- If, during the process of controlling the UAV to land on the palm, it is determined that no palm appears under the UAV, for example because the user suddenly retracts the palm, the UAV is controlled to continue hovering.
- In some embodiments, the controlling the unmanned aerial vehicle to land on the palm includes: in the process of controlling the unmanned aerial vehicle to land on the palm, determining whether a flight control instruction sent by a control terminal is received; and when it is determined that the flight control instruction is not received, controlling the unmanned aerial vehicle to continue to land on the palm.
- the control terminal generates flight control instructions according to actual needs, and the flight control instructions include information used to notify the unmanned aerial vehicle that it needs to perform flight tasks.
- Specifically, in the process of controlling the unmanned aerial vehicle to land on the palm, it is determined whether a flight control instruction sent by the control terminal is received. If the flight control instruction is not received, the unmanned aerial vehicle is controlled to continue landing on the palm; if it is determined that the flight control instruction is received, the flight task corresponding to the flight control instruction is executed.
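- Putting the pieces of FIG. 5 together, one iteration of the landing logic might look as follows; `uav` and its methods are assumed helper wrappers around the steps described above, not an API defined by the patent:

```python
def palm_landing_step(uav):
    """One iteration of the FIG. 5 landing logic (illustrative sketch)."""
    if uav.flight_status() != "hovering":
        return "waiting"                    # S310: only proceed while hovering
    image = uav.capture_image_below()
    if uav.water_surface_below(image):      # S320: never land over water
        uav.keep_hovering()
        uav.prompt_user("current area is not suitable for landing")
        return "hovering"
    if not (uav.distance_reduction_satisfied() and uav.palm_below(image)):
        uav.keep_hovering()                 # landing conditions not met yet
        return "hovering"
    if uav.flight_control_instruction_received():
        return "executing_instruction"      # a user instruction pre-empts landing
    uav.descend_towards_palm(image)         # S340: follow the palm and land
    return "landing"
```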
- In the above embodiment, when it is determined that the change in the distance data satisfies the preset distance reduction condition, a palm appears under the UAV, and there is no water surface under the UAV, the unmanned aerial vehicle is controlled to land on the palm. This prevents the unmanned aerial vehicle from landing on a water-surface environment, effectively reduces the landing-condition requirements of existing unmanned aerial vehicles, improves landing safety, and will not soil the unmanned aerial vehicle. At the same time, there is no need to use a control terminal such as a remote controller or a mobile phone to control the landing; the landing operation is simple and easy to perform, which improves convenience, enables ordinary users to get started quickly, and improves the user experience.
- FIG. 6 is a schematic block diagram of an unmanned aerial vehicle 400 according to an embodiment of the present application.
- the unmanned aerial vehicle 400 includes a processor 401 and a memory 402, and the processor 401 and the memory 402 are connected by a bus 403, which is, for example, an I2C (Inter-integrated Circuit) bus.
- The processor 401 may be a micro-controller unit (MCU), a central processing unit (CPU), a digital signal processor (DSP), or the like.
- The memory 402 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash disk, a removable hard disk, or the like.
- the processor 401 is configured to run a computer program stored in the memory 402, and implement the aforementioned aircraft landing method when executing the computer program.
- the processor 401 is configured to execute a computer program stored in the memory 402, and implement the following steps when executing the computer program:
- When the unmanned aerial vehicle is in a hovering state, acquiring distance data between the unmanned aerial vehicle and a reference target below the unmanned aerial vehicle, and acquiring an image below the unmanned aerial vehicle; and when it is determined that a change in the distance data satisfies a preset distance reduction condition and a palm appears below the unmanned aerial vehicle, controlling the unmanned aerial vehicle to land on the palm.
- the processor 401 is further configured to: determine the position of the palm in the image, and determine the position of the palm in space according to the position of the palm in the image.
- When the processor 401 controls the unmanned aerial vehicle to land on the palm, it is configured to: control the unmanned aerial vehicle to land on the palm according to the position of the palm in space.
- the processor 401 is further configured to: determine the area of the palm in the image.
- When the processor 401 controls the UAV to land on the palm upon determining that the change in the distance data satisfies a preset distance reduction condition and a palm appears under the UAV, it is configured to: when it is determined that the change in the distance data satisfies the preset distance reduction condition, a palm appears under the UAV, and the area of the palm in the image is greater than or equal to the preset area threshold, control the unmanned aerial vehicle to land on the palm.
- the processor 401 is further configured to: determine whether there is a water surface under the UAV according to the image.
- When the processor 401 controls the UAV to land on the palm upon determining that the change in the distance data satisfies a preset distance reduction condition and a palm appears under the UAV, it is configured to: when it is determined that the change in the distance data satisfies the preset distance reduction condition, a palm appears under the unmanned aerial vehicle, and there is no water surface under the unmanned aerial vehicle, control the unmanned aerial vehicle to land on the palm.
- the processor 401 is further configured to: when it is determined that there is a water surface below the UAV, control the UAV to continue to hover.
- When the processor 401 determines whether the change in the distance data satisfies a preset distance reduction condition, it is configured to: when it is determined that there is no water surface below the UAV, determine whether the change in the distance data satisfies the preset distance reduction condition.
- When the processor 401 determines whether there is a water surface under the UAV according to the image, it is configured to: input the image into the first neural network model to determine whether there is a water surface under the UAV.
- When the processor 401 determines whether a palm appears under the UAV based on the image, it is configured to: when it is determined that the change in the distance data satisfies a preset distance reduction condition, input the image into the second neural network model to determine whether a palm appears under the UAV.
- When the processor 401 controls the unmanned aerial vehicle to land on the palm, it is configured to: in the process of controlling the unmanned aerial vehicle to land on the palm, determine whether a flight control instruction sent by the control terminal is received; and when it is determined that the flight control instruction is not received, control the unmanned aerial vehicle to continue to land on the palm.
- the processor 401 is further configured to: when it is determined that the flight control instruction is received, execute the flight task corresponding to the flight control instruction.
- the distance data includes the distance data at the current moment and the distance data at multiple historical moments
- When the processor 401 determines whether the change in the distance data satisfies a preset distance reduction condition, it is configured to: obtain the median or average value of the distance data at the multiple historical moments; determine the difference between the distance data at the current moment and the median or average value; and when the difference is greater than or equal to the preset distance threshold, determine that the preset distance reduction condition is met.
- In the above embodiment, when the unmanned aerial vehicle is in a hovering state, the distance data between the unmanned aerial vehicle and the reference target below it is obtained, and the image below the unmanned aerial vehicle is obtained; when the change in the distance data meets the preset distance reduction condition and a palm appears under the UAV, the UAV is controlled to land on the palm.
- This landing method allows the UAV to land directly on the palm of the user, effectively reducing the landing-condition requirements of prior-art unmanned aerial vehicles and improving landing safety. At the same time, there is no need to use a control terminal such as a remote controller or a mobile phone to control the landing; the landing operation is simple and easy to use, which improves convenience, enables ordinary users to get started quickly, and improves the user experience.
- The embodiments of the present application also provide a computer-readable storage medium storing a computer program; the computer program includes program instructions, and a processor executes the program instructions to implement the steps of the aircraft landing method provided by the foregoing embodiments.
- the computer-readable storage medium may be the internal storage unit of the unmanned aerial vehicle described in any of the foregoing embodiments, such as the hard disk or memory of the unmanned aerial vehicle.
- The computer-readable storage medium may also be an external storage device of the UAV, such as a plug-in hard disk equipped on the UAV, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card (Flash Card), etc.
Abstract
The present invention relates to an aerial vehicle landing method, comprising the steps of: when an unmanned aerial vehicle is in a hovering state, acquiring distance data, and acquiring an image below the unmanned aerial vehicle (S210); determining whether a change in the distance data satisfies a preset distance reduction condition, and determining, according to the image, whether a palm appears below the unmanned aerial vehicle (S220); and when both conditions are satisfied, controlling the unmanned aerial vehicle to land on the palm (S230). The present invention further relates to an unmanned aerial vehicle and a computer-readable storage medium.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/098411 WO2021016875A1 (fr) | 2019-07-30 | 2019-07-30 | Aircraft landing method, unmanned aerial vehicle, and computer-readable storage medium |
CN201980033547.2A CN112154390A (zh) | 2019-07-30 | 2019-07-30 | Aircraft landing method, unmanned aerial vehicle, and computer-readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/098411 WO2021016875A1 (fr) | 2019-07-30 | 2019-07-30 | Aircraft landing method, unmanned aerial vehicle, and computer-readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021016875A1 true WO2021016875A1 (fr) | 2021-02-04 |
Family
ID=73891960
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/098411 WO2021016875A1 (fr) | 2019-07-30 | 2019-07-30 | Procédé d'atterrissage de véhicule aérien, véhicule aérien sans pilote, et support d'informations lisible par ordinateur |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112154390A (fr) |
WO (1) | WO2021016875A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11435743B2 (en) * | 2017-09-12 | 2022-09-06 | SZ DJI Technology Co., Ltd. | Throwable unmanned aerial vehicle and method of operation |
WO2023211690A1 (fr) * | 2022-04-27 | 2023-11-02 | Snap Inc. | Landing an autonomous drone with gestures |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103744429A (zh) * | 2013-02-07 | 2014-04-23 | 山东英特力光通信开发有限公司 | Flight control system for a small unmanned helicopter |
US20170045894A1 (en) * | 2015-08-12 | 2017-02-16 | Qualcomm Incorporated | Autonomous Landing and Control |
CN106444824A (zh) * | 2016-05-23 | 2017-02-22 | 重庆零度智控智能科技有限公司 | Unmanned aerial vehicle, and unmanned aerial vehicle landing control device and method |
CN108475070A (zh) * | 2017-04-28 | 2018-08-31 | 深圳市大疆创新科技有限公司 | Control method and control device for palm landing of unmanned aerial vehicle, and unmanned aerial vehicle |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107636550A (zh) * | 2016-11-10 | 2018-01-26 | 深圳市大疆创新科技有限公司 | Flight control method and apparatus, and aircraft |
CN108121350B (zh) * | 2016-11-29 | 2021-08-24 | 腾讯科技(深圳)有限公司 | Method for controlling landing of an aircraft and related apparatus |
CN106672216A (zh) * | 2016-12-13 | 2017-05-17 | 深圳市元征科技股份有限公司 | Method for controlling landing of an unmanned aerial vehicle, and unmanned aerial vehicle |
CN109074168B (zh) * | 2018-01-23 | 2022-06-17 | 深圳市大疆创新科技有限公司 | Control method and device for unmanned aerial vehicle, and unmanned aerial vehicle |
- 2019-07-30 CN CN201980033547.2A patent/CN112154390A/zh active Pending
- 2019-07-30 WO PCT/CN2019/098411 patent/WO2021016875A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN112154390A (zh) | 2020-12-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10379545B2 (en) | Detecting optical discrepancies in captured images | |
- WO2022001748A1 (fr) | Target tracking method and apparatus, electronic device, and mobile carrier | |
- CN109074168B (zh) | Control method and device for unmanned aerial vehicle, and unmanned aerial vehicle | |
- JP2018526641A (ja) | System and method for laser depth map sampling | |
- CN106094861B (zh) | Unmanned aerial vehicle, and unmanned aerial vehicle control method and apparatus | |
- WO2022021027A1 (fr) | Target tracking method and apparatus, unmanned aerial vehicle, system, and readable storage medium therefor | |
- CN110622091A (zh) | Gimbal control method, apparatus and system, computer storage medium, and unmanned aerial vehicle | |
- WO2021081774A1 (fr) | Parameter optimization method and apparatus, control device, and aircraft | |
- WO2017181513A1 (fr) | Flight control method and device for unmanned aerial vehicle | |
US11449076B2 (en) | Method for controlling palm landing of unmanned aerial vehicle, control device, and unmanned aerial vehicle | |
- WO2021016875A1 (fr) | Aircraft landing method, unmanned aerial vehicle, and computer-readable storage medium | |
- CN109754420B (zh) | Target distance estimation method and apparatus, and unmanned aerial vehicle | |
- WO2020237422A1 (fr) | Aerial surveying method, aircraft, and storage medium | |
- WO2020048365A1 (fr) | Flight control method and device for aircraft, terminal device, and flight control system | |
- CN115238018A (zh) | Method for managing 3D flight paths and related system | |
- WO2020024182A1 (fr) | Parameter processing method and apparatus, camera device, and aircraft | |
- CN112560769A (zh) | Method for detecting obstacle, electronic device, roadside device, and cloud control platform | |
- WO2019183789A1 (fr) | Method and apparatus for controlling unmanned aerial vehicle, and unmanned aerial vehicle | |
- WO2022082440A1 (fr) | Method, apparatus and system for determining target tracking strategy, and device and storage medium | |
- WO2020019175A1 (fr) | Image processing method and device, photographing device, and unmanned aerial vehicle | |
US20210181769A1 (en) | Movable platform control method, movable platform, terminal device, and system | |
- WO2021217450A1 (fr) | Target tracking method and device, and storage medium | |
- CN112154480B (zh) | Positioning method and apparatus for movable platform, movable platform, and storage medium | |
- WO2022021028A1 (fr) | Target detection method and device, unmanned aerial vehicle, and computer-readable storage medium | |
US20210256732A1 (en) | Image processing method and unmanned aerial vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19939663 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 19939663 Country of ref document: EP Kind code of ref document: A1 |