WO2021078264A1 - Landing control method, aircraft and storage medium - Google Patents
Landing control method, aircraft and storage medium
- Publication number
- WO2021078264A1 (application PCT/CN2020/123311)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- aircraft
- template image
- preset
- landing
- Prior art date
- 2019-10-25
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U70/00—Launching, take-off or landing arrangements
- B64U70/60—Take-off or landing of UAVs from a runway using their own power
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/04—Control of altitude or depth
- G05D1/06—Rate of change of altitude or depth
- G05D1/0607—Rate of change of altitude or depth specially adapted for aircraft
- G05D1/0653—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
- G05D1/0676—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/50—Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/46—Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the invention relates to the technical field of aircraft control, in particular to a landing control method, an aircraft and a storage medium.
- Aircraft with flight function can land according to the landing instructions sent by the terminal.
- some smart aircraft have the function of automatic return to home or one-key landing.
- the commonly used landing method for aircraft is GPS (Global Positioning System) assisted landing: the GPS coordinates of the target area are sent to the aircraft, and the aircraft's control system controls the aircraft to land in the target area and corrects its landing position according to those GPS coordinates.
- the main purpose of the present invention is to provide a landing control method, an aircraft and a storage medium, aiming to better realize landing control of the aircraft.
- the present invention provides a landing control method applied to an aircraft, and the method includes:
- the landing of the aircraft is controlled according to the position information, so that the aircraft lands at the preset take-off and landing point.
- the acquiring the first image of the preset take-off and landing point collected by the aircraft during takeoff and the first pose information of the aircraft when the first image is collected includes:
- the first image of the preset take-off and landing point collected by the aircraft and the first pose information of the aircraft when the first image is collected are acquired at each interval of the preset flying height.
- when the current flying height of the aircraft is greater than a preset height, collection of the first image and the first pose information is stopped.
- the adjusting the first image according to the first pose information to obtain the first template image of the preset take-off and landing point includes:
- the adjusting the current frame image according to the first template image and the second pose information to obtain the current frame template image includes:
- the image features include gradient histogram features and color features, and using the image features to train the position filter and the scale filter of the first template image includes:
- the position filter and the scale filter of the first template image are trained by using the gradient histogram features and the color features.
- the predicting the position information of the first template image in the next frame image of the preset take-off and landing point according to the position filter and the scale filter includes:
- the position information of the first template image in the next frame of image is determined according to the maximum response position and the maximum response scale.
- the controlling the landing of the aircraft according to the position information includes:
- the aircraft is controlled to land.
- the present invention also provides an aircraft, which includes:
- the memory is used to store a landing control program executable by the computer
- the processor is used to call the landing control program executable by the computer to implement the aforementioned landing control method.
- the present invention also provides a storage medium that stores a landing control program executable by a computer, and the processor can execute the aforementioned landing control method when calling the landing control program
- the present invention collects the first image of the preset take-off and landing point at the first preset height and the first pose information of the aircraft when the first image is collected, and uses the first image and the first pose information to acquire the first template image of the preset take-off and landing point, which is then used as the tracking object.
- when the aircraft needs to land, the current frame image of the preset take-off and landing point is collected at the second preset height, together with the second pose information of the aircraft when the current frame image is collected, where the second preset height is greater than the first preset height.
- the current frame image is adjusted according to the first template image and the second pose information to obtain the current frame template image.
- the position information is used to correct the landing of the aircraft so that the aircraft can dynamically track the preset take-off and landing points for landing, thereby achieving accurate landing of the aircraft.
- FIG. 1 is a schematic diagram of a three-dimensional structure of an aircraft provided by an embodiment of the present invention
- FIG. 2 is a flowchart of steps of a landing control method provided by an embodiment of the present invention.
- FIG. 3 is a schematic diagram of obtaining images of preset take-off and landing points when the aircraft reaches a preset altitude during take-off and landing;
- FIG. 4A is a flowchart of an embodiment of step S11 in FIG. 2;
- FIG. 4B is a schematic diagram of the image changes after the aircraft adjusts an acquired image;
- FIG. 5 is a step flow chart of an embodiment of step S13 in FIG. 2;
- Fig. 6 is a step flow chart of an embodiment of step S16 in Fig. 2;
- FIG. 7 is a schematic diagram of a block diagram structure of an aircraft provided by an embodiment of the present invention.
- the present invention provides a landing control method, an aircraft, and a storage medium, wherein the landing control method is applied to an aircraft, and the method includes: acquiring a first image of a preset take-off and landing point collected at a first preset altitude during takeoff of the aircraft, and first pose information of the aircraft when the first image is collected; adjusting the first image according to the first pose information to obtain a first template image of the preset take-off and landing point; acquiring a current frame image of the preset take-off and landing point collected at a second preset altitude during landing of the aircraft, and second pose information of the aircraft when the current frame image is collected, wherein the second preset altitude is greater than the first preset altitude; adjusting the current frame image according to the first template image and the second pose information to obtain a current frame template image; extracting the image features of the first template image from the current frame template image; using the image features to train a position filter and a scale filter of the first template image; predicting the position information of the first template image in the next frame image of the preset take-off and landing point according to the position filter and the scale filter; and controlling the aircraft to land according to the position information, so that the aircraft lands at the preset take-off and landing point.
- the present invention acquires the first template image of the preset take-off and landing point by collecting the first image of the preset take-off and landing point at the first preset height and the first pose information of the aircraft when the first image is collected, and uses the first template image as the tracking object.
- when the aircraft needs to land, the current frame image of the preset take-off and landing point is collected at the second preset height, together with the second pose information of the aircraft when the current frame image is collected, where the second preset height is greater than the first preset height.
- the current frame image is adjusted according to the first template image and the second pose information to obtain the current frame template image.
- the position information is used to correct the landing of the aircraft so that the aircraft can dynamically track the preset take-off and landing points for landing, thereby achieving accurate landing of the aircraft.
- FIG. 1 is an aircraft 10 provided by the present invention, and the aircraft 10 is in communication connection with a terminal device 20.
- the aircraft 10 may be a rotary-wing aircraft, such as a quad-rotor or hexa-rotor aircraft, or it may be a fixed-wing aircraft.
- the terminal device 20 is used to send control instructions to the aircraft 10 to control the aircraft 10, and the terminal device 20 may be a remote controller or a smartphone.
- the aircraft 10 includes a fuselage 101, an arm 102, a power assembly 103, a control assembly (not shown), and a camera 104.
- the control assembly can receive instructions sent by the terminal device 20 and generate corresponding control signals according to them to control the aircraft 10 to perform corresponding operations, such as controlling the power assembly 103 to adjust the flight attitude or controlling the camera 104 to capture images.
- the arm 102 is connected to the fuselage 101, and the power assembly 103 is arranged on the arm 102 and electrically connected to the control assembly for providing flight power for the aircraft 10.
- the camera device 104 is disposed on the body 101 and is electrically connected to the control component for acquiring images or video information.
- FIG. 2 is a landing control method provided by the present invention. The method is applied to an aircraft 10, and the method includes:
- Step S10 Acquire a first image of a preset take-off and landing point collected at a first preset altitude during a takeoff process of the aircraft, and first pose information of the aircraft when the first image is collected.
- when the aircraft 10 takes off to the first preset height H1, the first image of the preset take-off and landing point Q collected by the camera 104 is acquired, together with the first pose information of the aircraft 10 when the first image is collected, where the first pose information includes first attitude information and first position information.
- the acquiring the first image of the preset take-off and landing point acquired by the aircraft during takeoff and the first pose information of the aircraft when the first image is acquired includes:
- during takeoff, the first image of the preset take-off and landing point collected by the aircraft and the first pose information of the aircraft when the first image is collected are acquired at every preset flying-height interval, where the preset flying height can be set as required.
- the method further includes: when the current flying height of the aircraft is greater than a preset height, stopping the collection of the first image and the first pose information.
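A minimal sketch of the two points above, triggering one capture every preset height interval during ascent and stopping for good once the preset ceiling is exceeded. The helpers capture_frame() and read_pose(), the 2 m interval and the 15 m ceiling are hypothetical placeholders, not values or APIs given in this document.

```python
def maybe_collect_sample(altitude, capture_frame, read_pose, state,
                         height_step=2.0, max_height=15.0):
    """Call on every altitude update during takeoff.

    Returns a dict with the first image, the first pose information and the
    altitude each time the aircraft climbs past another height_step, and None
    otherwise; once the altitude exceeds max_height, collection stops.
    """
    if altitude > max_height or altitude < state["next_trigger"]:
        return None
    state["next_trigger"] += height_step
    return {"image": capture_frame(), "pose": read_pose(), "altitude": altitude}

# usage sketch: state = {"next_trigger": 2.0}, then call maybe_collect_sample()
# from the flight-control loop whenever a new altitude estimate arrives.
```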
- Step S11 Adjust the first image according to the first pose information to obtain a first template image of the preset take-off and landing point.
- step S11 includes:
- Step S111 Generate a corresponding first adjustment angle according to the first pose information
- Step S112 Adjust the first image according to the first adjustment angle to obtain a first adjustment image
- Step S113 Acquire a first template image of the preset take-off and landing point according to the first adjustment image, where the first template image is an image of a preset area within the first adjustment image, cropped with the optical center point of the first adjustment image as the center point.
- for example, when the aircraft 10 acquires the first image a, the image of the preset take-off and landing point Q is not a strictly horizontal image because of the flight attitude angle. To obtain a more accurate horizontal image of the take-off and landing point Q, the attitude angle of the aircraft 10 at the time of shooting is obtained from the first pose information recorded when the first image a was acquired, and this attitude angle is used as the first adjustment angle to perform angle compensation on the first image a, yielding the first adjustment image b.
- the image within the preset range is intercepted with the optical center point of the first adjustment image b as the midpoint to obtain the first template image Ti.
- the first template image Ti is the image of the preset take-off and landing point Q acquired by the aircraft 10 at the first preset height H1.
- when the height of the aircraft 10 is greater than H1 and the aircraft needs to land precisely at the preset take-off and landing point Q, the first template image Ti can be used as the tracking target, images of the preset take-off and landing point Q can be acquired in real time, and the position of the first template image Ti can be tracked in those images to achieve an accurate landing of the aircraft.
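A minimal sketch of the adjustment just described, assuming OpenCV is available: it compensates a single in-plane attitude angle with a 2-D rotation about the image centre and then crops a fixed square around that centre. A full implementation would instead warp the frame by a homography built from the complete attitude (roll, pitch, yaw) and the camera intrinsics; the crop size used here is an arbitrary illustrative value.

```python
import cv2

def make_template(image, attitude_deg, crop_size=200):
    """Angle-compensate the raw frame and crop around the optical centre
    (approximated here by the image centre) to obtain a template image."""
    h, w = image.shape[:2]
    center = (w / 2.0, h / 2.0)
    rot = cv2.getRotationMatrix2D(center, attitude_deg, 1.0)
    levelled = cv2.warpAffine(image, rot, (w, h))            # "adjustment image"
    half = crop_size // 2
    x0, y0 = int(center[0]) - half, int(center[1]) - half
    return levelled[y0:y0 + crop_size, x0:x0 + crop_size]    # template image
```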
- Step S12 Obtain the current frame image of the preset take-off and landing point collected at the second preset altitude during the landing process of the aircraft and the second pose information corresponding to the aircraft when the current frame image is collected, wherein The second preset height is greater than the first preset height.
- during the landing of the aircraft 10, the current frame image of the preset take-off and landing point is collected at the second preset height H2, together with the second pose information of the aircraft when the current frame image is collected, where the second preset height H2 is greater than the first preset height H1.
- for example, H1 can be 15 m and H2 25 m; or H1 can be 6 m and H2 15 m; or H1 can be 1 m and H2 6 m.
- Step S13 Adjust the current frame image according to the first template image and the second pose information to obtain a current frame template image.
- step S13 includes:
- Step S131 Generate a corresponding second adjustment angle according to the second pose information
- Step S132 Adjust the current frame image according to the second adjustment angle to obtain the current frame adjusted image
- Step S133 Acquire a second template image from the current frame adjustment image according to the first template image, where the second template image is cropped from the current frame adjustment image, centered on the optical center point of the current frame adjustment image, and in a preset ratio to the size of the first template image.
- for example, when the aircraft 10 acquires the current frame image of the preset take-off and landing point Q, the current frame image is not a strictly horizontal image because of the flight attitude angle. To obtain a more accurate horizontal image, the attitude angle of the aircraft 10 at the time of shooting is obtained from the second pose information recorded when the current frame image was acquired, and this attitude angle is used as the second adjustment angle to perform angle compensation on the current frame image, yielding the current frame adjustment image. An image centered on the optical center point of the current frame adjustment image and in a preset ratio to the size of the first template image Ti is then cropped from the current frame adjustment image and used as the current frame template image It.
- Step S14 Extract the image feature of the current frame template image.
- multiple sample images are collected around the current frame template image and at multiple different sizes, and image features are extracted from the collected sample images, where the image features include Histogram of Oriented Gradients (HOG) features and Color Name (CN) features.
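A rough sketch of this feature-extraction step using scikit-image for the HOG part; the Color Name (CN) part is approximated here by a coarse RGB histogram, because a faithful CN descriptor needs the 11-dimensional colour-name lookup table of van de Weijer et al., which is not reproduced in this document. Parameter values are illustrative only.

```python
import numpy as np
from skimage.color import rgb2gray
from skimage.feature import hog

def extract_features(patch_rgb):
    """Return a single feature vector: HOG of the grey patch plus a coarse
    colour histogram standing in for the Color Name (CN) features."""
    hog_vec = hog(rgb2gray(patch_rgb), orientations=9,
                  pixels_per_cell=(4, 4), cells_per_block=(2, 2),
                  feature_vector=True)
    color_hist, _ = np.histogramdd(patch_rgb.reshape(-1, 3).astype(float),
                                   bins=(4, 4, 4), range=((0, 256),) * 3)
    color_vec = color_hist.ravel() / max(color_hist.sum(), 1.0)
    return np.concatenate([hog_vec, color_vec])
```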
- Step S15 Use the image feature to train the position filter and the scale filter of the first template image.
- Step S16 Predict the position information of the first template image in the next frame image of the preset take-off and landing point according to the position filter and the scale filter.
- step S16 includes:
- Step S161 Obtain the next frame image of the preset take-off and landing point, and adjust the next frame image to obtain the next frame template image;
- Step S162 Calculate the maximum response position of the first template image in the next frame of template image according to the position filter
- Step S163 Calculate the maximum response scale of the first template image in the next frame of template image according to the scale filter
- Step S164 Determine the position information of the first template image in the next frame of image according to the maximum response position and the maximum response scale.
- the adjusting the next frame image to obtain the next frame template image includes:
- the next frame image is adjusted according to the third pose information to obtain the next frame template image.
- the maximum response position of the first template image in the next frame of template image is calculated according to the position filter. Taking the maximum response position as a center, pictures of different scales are collected, and a scale filter is used to calculate the maximum response scale of the first template image in the next frame of template image.
- the maximum response position and the maximum response scale are used to determine the position information of the first template image in the next frame of image, where the position information is the coordinates of the optical center point of the first template image in the next frame of image.
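The patent does not spell out the filter formulation, but the position/scale filter pair and maximum-response search described here match widely used discriminative correlation filter (MOSSE/DSST-style) trackers. The sketch below, stated only as an assumption and not as code from the patent, shows the idea for a single feature channel: train the filter in the Fourier domain against a Gaussian target response, then take the argmax of the correlation response in the next frame; the same one-dimensional formulation applied over a pyramid of rescaled patches gives the maximum response scale.

```python
import numpy as np

def train_correlation_filter(feature_map, target_response, lam=1e-2):
    """Closed-form single-channel filter (MOSSE/DSST style) in the Fourier
    domain; target_response is typically a 2-D Gaussian peaked on the
    template centre."""
    F = np.fft.fft2(feature_map)
    G = np.fft.fft2(target_response)
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def max_response_position(filt, next_feature_map):
    """Correlate the trained filter with the next frame's features and return
    the (row, col) of the maximum response, i.e. the predicted position."""
    response = np.real(np.fft.ifft2(filt * np.fft.fft2(next_feature_map)))
    return np.unravel_index(np.argmax(response), response.shape)
```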
- Step S17 Control the aircraft to land according to the position information, so as to make the aircraft land to the preset take-off and landing point.
- step S17 includes: calculating the three-dimensional coordinates of the position information in the world coordinate system, and controlling the aircraft to land according to the three-dimensional coordinates.
- the aircraft 10 further includes a memory 105, a processor 106, and a bus 107.
- the memory 105 is electrically connected to the processor 106 through the bus 107.
- the memory 105 includes at least one type of readable storage medium, and the readable storage medium includes flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory, etc.), magnetic memory, magnetic disk, optical disk, and the like.
- the memory 105 may be an internal storage unit of the aircraft 10 in some embodiments, such as a hard disk of the aircraft 10. In other embodiments, the memory 105 may also be an external storage device of the aircraft 10, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the aircraft 10.
- the memory 105 can be used not only to store application software and various data installed in the aircraft 10, such as computer-readable code of a landing control method program, etc., but also to temporarily store data that has been output or will be output.
- the processor 106 may be a central processing unit (CPU), a controller, a microcontroller, a microprocessor, or another data processing chip, and the processor 106 may call program code stored in the memory 105 or process data to execute the aforementioned landing control method.
- the present invention also provides a storage medium that stores a landing control program executable by a computer, and the processor can execute the aforementioned landing control method when calling the landing control program.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automation & Control Theory (AREA)
- Radar, Positioning & Navigation (AREA)
- Databases & Information Systems (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- General Health & Medical Sciences (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Image Analysis (AREA)
Abstract
A landing control method, an aircraft, and a storage medium. The method is applied to an aircraft and includes: acquiring a first image of a preset take-off and landing point collected at a first preset height during takeoff of the aircraft, and first pose information of the aircraft when the first image is collected (S10); adjusting the first image according to the first pose information to obtain a first template image of the preset take-off and landing point (S11); acquiring a current frame image of the preset take-off and landing point collected at a second preset height during landing of the aircraft, and second pose information of the aircraft when the current frame image is collected (S12); adjusting the current frame image according to the first template image and the second pose information to obtain a current frame template image (S13); extracting image features of the current frame template image (S14); training a position filter and a scale filter of the first template image with the image features (S15); predicting position information of the first template image in the next frame image according to the position filter and the scale filter (S16); and controlling the aircraft to land according to the position information (S17).
Description
This application claims priority to Chinese patent application No. 201911025804.6, entitled "Landing control method, aircraft and storage medium", filed with the Chinese Patent Office on October 25, 2019, the entire contents of which are incorporated herein by reference.
The present invention relates to the technical field of aircraft control, and in particular to a landing control method, an aircraft, and a storage medium.
With the continuous development of aircraft, the fields in which aircraft are applied have become increasingly broad. An aircraft with flight capability can land according to a landing instruction sent by a terminal, and some smart aircraft have an automatic return-to-home or one-key landing function. At present, the commonly used landing method for aircraft is GPS (Global Positioning System) assisted landing: the GPS coordinates of a target area are sent to the aircraft, and the aircraft's control system controls the aircraft to land in the target area and corrects its landing position according to those GPS coordinates.
However, because of the limited accuracy of GPS coordinates and problems such as aircraft drift, it is difficult for an aircraft landing in this way to land accurately on the target. The positioning accuracy of GPS directly determines the accuracy of a precision landing, and when the GPS signal is poor the landing error of the aircraft is large.
Therefore, how to control the precise landing of an aircraft has become a hot research topic.
Summary of the Invention
The main purpose of the present invention is to provide a landing control method, an aircraft, and a storage medium, aiming to better achieve landing control of the aircraft.
To achieve the above purpose, the present invention provides a landing control method applied to an aircraft, the method comprising:
acquiring a first image of a preset take-off and landing point collected at a first preset height during takeoff of the aircraft, and first pose information of the aircraft when the first image is collected;
adjusting the first image according to the first pose information to obtain a first template image of the preset take-off and landing point;
acquiring a current frame image of the preset take-off and landing point collected at a second preset height during landing of the aircraft, and second pose information of the aircraft when the current frame image is collected, wherein the second preset height is greater than the first preset height;
adjusting the current frame image according to the first template image and the second pose information to obtain a current frame template image;
extracting image features of the current frame template image;
training a position filter and a scale filter of the first template image with the image features;
predicting position information of the first template image in the next frame image of the preset take-off and landing point according to the position filter and the scale filter;
controlling the aircraft to land according to the position information, so that the aircraft lands at the preset take-off and landing point.
Preferably, acquiring the first image of the preset take-off and landing point collected during takeoff of the aircraft and the first pose information of the aircraft when the first image is collected comprises:
during takeoff, at every preset flight-height interval, acquiring a first image of the preset take-off and landing point collected by the aircraft and the first pose information of the aircraft when the first image is collected.
Preferably, when the current flight height of the aircraft is greater than a preset height, collection of the first image and the first pose information is stopped.
Preferably, adjusting the first image according to the first pose information to obtain the first template image of the preset take-off and landing point comprises:
generating a corresponding first adjustment angle according to the first pose information;
adjusting the first image according to the first adjustment angle to obtain a first adjusted image;
obtaining the first template image of the preset take-off and landing point according to the first adjusted image, wherein the first template image is an image of a preset area within the first adjusted image, cropped with the optical center point of the first adjusted image as the center point.
Preferably, adjusting the current frame image according to the first template image and the second pose information to obtain the current frame template image comprises:
generating a corresponding second adjustment angle according to the second pose information;
adjusting the current frame image according to the second adjustment angle to obtain a current frame adjusted image;
obtaining the current frame template image from the current frame adjusted image according to the first template image, wherein the current frame template image is an image cropped from the current frame adjusted image, centered on the optical center point of the current frame adjusted image, and in a preset ratio to the size of the first template image.
Preferably, the image features include gradient histogram features and color features, and training the position filter and the scale filter of the first template image with the image features comprises:
training the position filter and the scale filter of the first template image with the gradient histogram features and the color features.
Preferably, predicting the position information of the first template image in the next frame image of the preset take-off and landing point according to the position filter and the scale filter comprises:
acquiring the next frame image of the preset take-off and landing point, and adjusting the next frame image to obtain a next frame template image;
calculating the maximum response position of the first template image in the next frame template image according to the position filter;
calculating the maximum response scale of the first template image in the next frame template image according to the scale filter;
determining the position information of the first template image in the next frame image according to the maximum response position and the maximum response scale.
Preferably, controlling the aircraft to land according to the position information comprises:
calculating the three-dimensional coordinates of the position information in the world coordinate system;
controlling the aircraft to land according to the three-dimensional coordinates.
The present invention further provides an aircraft, the aircraft comprising:
a memory and a processor;
wherein the memory is configured to store a computer-executable landing control program;
and the processor is configured to call the computer-executable landing control program to implement the aforementioned landing control method.
The present invention further provides a storage medium storing a computer-executable landing control program; a processor, when calling the landing control program, can execute the aforementioned landing control method.
Compared with the prior art, the present invention collects a first image of a preset take-off and landing point at a first preset height together with the first pose information of the aircraft when the first image is collected, uses the first image and the first pose to obtain a first template image of the preset take-off and landing point, and uses the first template image as the tracking object. When the aircraft needs to land, a current frame image of the preset take-off and landing point is collected at a second preset height, together with the second pose information of the aircraft when the current frame image is collected, the second preset height being greater than the first preset height. The current frame image is adjusted according to the first template image and the second pose information to obtain a current frame template image.
That is, image features are extracted from the current frame template image, the extracted image features are used to train the position filter and the scale filter of the first template image, the position filter and the scale filter are used to predict the position information of the first template image in the next frame image, and this position information is used to correct the landing of the aircraft, so that the aircraft can dynamically track the preset take-off and landing point during landing, thereby achieving a precise landing of the aircraft.
FIG. 1 is a schematic perspective view of an aircraft provided by an embodiment of the present invention;
FIG. 2 is a flowchart of the steps of a landing control method provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram of acquiring an image of a preset take-off and landing point when the aircraft reaches a preset height during take-off and landing;
FIG. 4A is a flowchart of an embodiment of step S11 in FIG. 2;
FIG. 4B is a schematic diagram of the image changes after the aircraft adjusts an acquired image;
FIG. 5 is a flowchart of an embodiment of step S13 in FIG. 2;
FIG. 6 is a flowchart of an embodiment of step S16 in FIG. 2;
FIG. 7 is a schematic block diagram of an aircraft provided by an embodiment of the present invention.
To make the purpose, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit it. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
The terms "first", "second", "third", "fourth", etc. (if present) in the specification, claims, and drawings of this application are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments described here can be implemented in an order other than that illustrated or described here. In addition, the terms "comprising" and "having" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that comprises a series of steps or units is not necessarily limited to the steps or units clearly listed, but may include other steps or units that are not clearly listed or that are inherent to such a process, method, product, or device.
It should be noted that descriptions involving "first", "second", etc. in the present invention are for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the embodiments may be combined with one another, provided that such combination can be implemented by a person of ordinary skill in the art; when a combination of technical solutions is contradictory or cannot be implemented, it should be considered that the combination does not exist and does not fall within the scope of protection claimed by the present invention.
The present invention provides a landing control method, an aircraft, and a storage medium, wherein the landing control method is applied to an aircraft and includes: acquiring a first image of a preset take-off and landing point collected at a first preset height during takeoff of the aircraft, and first pose information of the aircraft when the first image is collected; adjusting the first image according to the first pose information to obtain a first template image of the preset take-off and landing point; acquiring a current frame image of the preset take-off and landing point collected at a second preset height during landing of the aircraft, and second pose information of the aircraft when the current frame image is collected, wherein the second preset height is greater than the first preset height; adjusting the current frame image according to the first template image and the second pose information to obtain a current frame template image; extracting image features of the first template image from the current frame template image; training a position filter and a scale filter of the first template image with the image features; predicting position information of the first template image in the next frame image of the preset take-off and landing point according to the position filter and the scale filter; and controlling the aircraft to land according to the position information, so that the aircraft lands at the preset take-off and landing point.
The present invention collects the first image of the preset take-off and landing point at the first preset height and the first pose information of the aircraft when the first image is collected, uses the first image and the first pose to obtain the first template image of the preset take-off and landing point, and uses the first template image as the tracking object. When the aircraft needs to land, the current frame image of the preset take-off and landing point is collected at the second preset height, together with the second pose information of the aircraft when the current frame image is collected, the second preset height being greater than the first preset height. The current frame image is adjusted according to the first template image and the second pose information to obtain the current frame template image.
That is, image features are extracted from the current frame template image, the extracted image features are used to train the position filter and the scale filter of the first template image, the position filter and the scale filter are used to predict the position information of the first template image in the next frame image, and this position information is used to correct the landing of the aircraft, so that the aircraft can dynamically track the preset take-off and landing point during landing, thereby achieving a precise landing of the aircraft.
Please refer to FIG. 1. FIG. 1 shows an aircraft 10 provided by the present invention, and the aircraft 10 is communicatively connected to a terminal device 20. The aircraft 10 may be a rotary-wing aircraft, such as a quad-rotor or hexa-rotor aircraft, or it may be a fixed-wing aircraft. The terminal device 20 is used to send control instructions to the aircraft 10 to control the aircraft 10, and the terminal device 20 may be a remote controller or a smartphone.
The aircraft 10 includes a fuselage 101, an arm 102, a power assembly 103, a control assembly (not shown), and a camera 104. The control assembly can receive instructions sent by the terminal device 20 and generate corresponding control signals according to those instructions to control the aircraft 10 to perform corresponding operations, such as controlling the power assembly 103 to adjust the flight attitude or controlling the camera 104 to capture images. The arm 102 is connected to the fuselage 101, and the power assembly 103 is arranged on the arm 102 and electrically connected to the control assembly to provide flight power for the aircraft 10. The camera 104 is arranged on the fuselage 101 and is electrically connected to the control assembly to acquire image or video information.
Please refer to FIG. 2. FIG. 2 shows a landing control method provided by the present invention. The method is applied to the aircraft 10 and includes:
Step S10: Acquire a first image of a preset take-off and landing point collected at a first preset height during takeoff of the aircraft, and first pose information of the aircraft when the first image is collected.
As shown in FIG. 3, when the aircraft 10 takes off and reaches the first preset height H1, the first image of the preset take-off and landing point Q collected by the camera 104 is acquired, together with the first pose information of the aircraft 10 when the first image is collected, where the first pose information includes first attitude information and first position information.
In some embodiments, acquiring the first image of the preset take-off and landing point collected during takeoff of the aircraft and the first pose information of the aircraft when the first image is collected includes:
during takeoff, at every preset flight-height interval, acquiring a first image of the preset take-off and landing point collected by the aircraft and the first pose information of the aircraft when the first image is collected, where the preset flight-height interval can be set as required.
In some embodiments, the method further includes:
when the current flight height of the aircraft is greater than a preset height, stopping the collection of the first image and the first pose information.
Step S11: Adjust the first image according to the first pose information to obtain a first template image of the preset take-off and landing point.
Please refer to FIG. 4A. In some embodiments, step S11 includes:
Step S111: Generate a corresponding first adjustment angle according to the first pose information;
Step S112: Adjust the first image according to the first adjustment angle to obtain a first adjusted image;
Step S113: Obtain the first template image of the preset take-off and landing point according to the first adjusted image, where the first template image is an image of a preset area within the first adjusted image, cropped with the optical center point of the first adjusted image as the center point.
For example, as shown in FIG. 4B, when the aircraft 10 acquires the first image a, the image of the preset take-off and landing point Q is not a strictly horizontal image because of the flight attitude angle. To obtain a more accurate horizontal image of the preset take-off and landing point Q, the attitude angle of the aircraft 10 at the time of shooting is obtained from the first pose information recorded when the first image a was acquired, and this attitude angle is used as the first adjustment angle to perform angle compensation on the first image a, yielding the first adjusted image b. At the same time, an image within a preset range is cropped with the optical center point of the first adjusted image b as the midpoint to obtain the first template image Ti. The first template image Ti is the image of the preset take-off and landing point Q acquired by the aircraft 10 at the first preset height H1. When the height of the aircraft 10 is greater than H1 and the aircraft 10 needs to land precisely at the preset take-off and landing point Q, the first template image Ti can be used as the tracking target, images of the preset take-off and landing point Q can be acquired in real time, and the position of the first template image Ti can be tracked in the images acquired in real time, so as to achieve a precise landing of the aircraft.
Step S12: Acquire a current frame image of the preset take-off and landing point collected at a second preset height during landing of the aircraft, and second pose information of the aircraft when the current frame image is collected, wherein the second preset height is greater than the first preset height.
As shown in FIG. 3, during landing of the aircraft 10, the current frame image of the preset take-off and landing point collected at the second preset height H2 is acquired, together with the second pose information of the aircraft when the current frame image is collected, where the second preset height H2 is greater than the first preset height H1.
For example, H1 may be 15 m and H2 25 m; or H1 may be 6 m and H2 15 m; or H1 may be 1 m and H2 6 m.
Step S13: Adjust the current frame image according to the first template image and the second pose information to obtain a current frame template image.
Please refer to FIG. 5. In some embodiments, step S13 includes:
Step S131: Generate a corresponding second adjustment angle according to the second pose information;
Step S132: Adjust the current frame image according to the second adjustment angle to obtain a current frame adjusted image;
Step S133: Acquire a second template image from the current frame adjusted image according to the first template image, where the second template image is an image cropped from the current frame adjusted image, centered on the optical center point of the current frame adjusted image, and in a preset ratio to the size of the first template image.
For example, when the aircraft 10 acquires the current frame image of the preset take-off and landing point Q, the current frame image is not a strictly horizontal image because of the flight attitude angle. To obtain a more accurate horizontal image of the current frame image of the preset take-off and landing point Q, the attitude angle of the aircraft 10 at the time of shooting is obtained from the second pose information recorded when the current frame image was acquired, and this attitude angle is used as the second adjustment angle to perform angle compensation on the current frame image, yielding the current frame adjusted image. At the same time, an image centered on the optical center point of the current frame adjusted image and in a preset ratio to the size of the first template image Ti is cropped from the current frame adjusted image and used as the current frame template image It, where the preset ratio may be such that the size S1 of the first template image Ti and the size S2 of the current frame template image It satisfy S1/S2 = H1/H2.
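The preset ratio fixes the side length of the current frame template directly from the two heights. A short worked example, where the pixel size S1 = 200 px is only an illustrative number and H1 = 6 m, H2 = 15 m are taken from the example heights above:

$$\frac{S_1}{S_2}=\frac{H_1}{H_2}\quad\Longrightarrow\quad S_2=S_1\cdot\frac{H_2}{H_1}=200\ \text{px}\times\frac{15\ \text{m}}{6\ \text{m}}=500\ \text{px}.$$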
Step S14: Extract image features of the current frame template image.
Multiple sample images are collected in four mutually perpendicular directions around the current frame template image, and sample images of the current frame template image at multiple different sizes are also collected.
Image features are extracted from the collected sample images, where the image features include Histogram of Oriented Gradients (HOG) features and Color Name (CN) features.
Step S15: Use the image features to train a position filter and a scale filter of the first template image.
An objective function of the position filter and an objective function of the scale filter are set; the position filter of the first template image is built using the acquired image features and the objective function of the position filter, and the scale filter of the first template image is built using the acquired image features and the objective function of the scale filter.
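The objective functions themselves are not written out in this document. A commonly used choice for both filters, stated here only as an assumption, is the ridge-regression objective of DSST-style correlation filters over the d HOG/CN feature channels f^l, with a Gaussian target g and regularisation lambda; its closed-form solution in the Fourier domain is also shown (capital letters denote DFTs, bars denote complex conjugation). The scale filter minimises the same objective over a one-dimensional set of rescaled samples.

$$\varepsilon=\Bigl\|\,\sum_{l=1}^{d} h^{l}\star f^{l}-g\,\Bigr\|^{2}+\lambda\sum_{l=1}^{d}\bigl\|h^{l}\bigr\|^{2},\qquad H^{l}=\frac{\bar{G}\,F^{l}}{\sum_{k=1}^{d}\bar{F}^{k}F^{k}+\lambda}.$$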
Step S16: Predict position information of the first template image in the next frame image of the preset take-off and landing point according to the position filter and the scale filter.
Please refer to FIG. 6. In some embodiments, step S16 includes:
Step S161: Acquire the next frame image of the preset take-off and landing point, and adjust the next frame image to obtain a next frame template image;
Step S162: Calculate the maximum response position of the first template image in the next frame template image according to the position filter;
Step S163: Calculate the maximum response scale of the first template image in the next frame template image according to the scale filter;
Step S164: Determine the position information of the first template image in the next frame image according to the maximum response position and the maximum response scale.
Adjusting the next frame image to obtain the next frame template image includes:
acquiring third pose information of the aircraft when the next frame image of the preset take-off and landing point is collected;
adjusting the next frame image according to the third pose information to obtain the next frame template image.
For example, the next frame image of the preset take-off and landing point is acquired together with the third pose information of the aircraft 10 when this image is acquired, and the next frame image is adjusted according to the third pose information to obtain the next frame template image. The maximum response position of the first template image in the next frame template image is calculated according to the position filter. Taking this maximum response position as the center, image patches at different scales are collected, and the scale filter is used to calculate the maximum response scale of the first template image in the next frame template image. The maximum response position and the maximum response scale are used to determine the position information of the first template image in the next frame image, where the position information is the coordinates of the optical center point of the first template image in the next frame image.
Step S17: Control the aircraft to land according to the position information, so that the aircraft lands at the preset take-off and landing point.
In some embodiments, step S17 includes:
calculating the three-dimensional coordinates of the position information in the world coordinate system;
controlling the aircraft to land according to the three-dimensional coordinates.
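A minimal sketch of this final conversion, assuming a pinhole camera looking roughly straight down at flat ground, a known 3x3 intrinsic matrix K, and a known camera-to-world pose (R_wc, t_wc); the depth along the optical axis is approximated by the flight altitude. None of these modelling choices are fixed by the patent; they are one common way to turn the tracked pixel position into the world coordinates used to steer the descent.

```python
import numpy as np

def pixel_to_world(u, v, altitude, K, R_wc, t_wc):
    """Back-project the tracked pixel (u, v) to a 3-D point in the world frame."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    p_cam = np.array([(u - cx) * altitude / fx,     # X in the camera frame
                      (v - cy) * altitude / fy,     # Y in the camera frame
                      altitude])                    # Z approximated by altitude
    return R_wc @ p_cam + t_wc                      # landing target in the world frame
```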
Please refer to FIG. 7. In some embodiments, the aircraft 10 further includes a memory 105, a processor 106, and a bus 107, and the memory 105 is electrically connected to the processor 106 through the bus 107.
The memory 105 includes at least one type of readable storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments the memory 105 may be an internal storage unit of the aircraft 10, such as a hard disk of the aircraft 10; in other embodiments the memory 105 may also be an external storage device of the aircraft 10, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the aircraft 10. The memory 105 can be used not only to store application software installed on the aircraft 10 and various types of data, such as the computer-readable code of a landing control method program, but also to temporarily store data that has been output or will be output.
In some embodiments the processor 106 may be a central processing unit (CPU), a controller, a microcontroller, a microprocessor, or another data processing chip, and the processor 106 can call program code stored in the memory 105 or process data to execute the aforementioned landing control method.
The present invention also provides a storage medium storing a computer-executable landing control program; a processor, when calling the landing control program, can execute the aforementioned landing control method.
The above are only preferred embodiments of the present invention and do not therefore limit the patent scope of the present invention. Any equivalent structural or equivalent process transformation made using the contents of the specification and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.
Claims (10)
- A landing control method applied to an aircraft, characterized in that the method comprises: acquiring a first image of a preset take-off and landing point collected at a first preset height during takeoff of the aircraft, and first pose information of the aircraft when the first image is collected; adjusting the first image according to the first pose information to obtain a first template image of the preset take-off and landing point; acquiring a current frame image of the preset take-off and landing point collected at a second preset height during landing of the aircraft, and second pose information of the aircraft when the current frame image is collected, wherein the second preset height is greater than the first preset height; adjusting the current frame image according to the first template image and the second pose information to obtain a current frame template image; extracting image features of the current frame template image; training a position filter and a scale filter of the first template image with the image features; predicting position information of the first template image in the next frame image of the preset take-off and landing point according to the position filter and the scale filter; and controlling the aircraft to land according to the position information, so that the aircraft lands at the preset take-off and landing point.
- The method according to claim 1, characterized in that acquiring the first image of the preset take-off and landing point collected during takeoff of the aircraft and the first pose information of the aircraft when the first image is collected comprises: during takeoff, at every preset flight-height interval, acquiring a first image of the preset take-off and landing point collected by the aircraft and the first pose information of the aircraft when the first image is collected.
- The method according to claim 2, characterized in that the method further comprises: when the current flight height of the aircraft is greater than a preset height, stopping the collection of the first image and the first pose information.
- The method according to claim 1, characterized in that adjusting the first image according to the first pose information to obtain the first template image of the preset take-off and landing point comprises: generating a corresponding first adjustment angle according to the first pose information; adjusting the first image according to the first adjustment angle to obtain a first adjusted image; and obtaining the first template image of the preset take-off and landing point according to the first adjusted image, wherein the first template image is an image of a preset area within the first adjusted image, cropped with the optical center point of the first adjusted image as the center point.
- The method according to claim 4, characterized in that adjusting the current frame image according to the first template image and the second pose information to obtain the current frame template image comprises: generating a corresponding second adjustment angle according to the second pose information; adjusting the current frame image according to the second adjustment angle to obtain a current frame adjusted image; and obtaining the current frame template image from the current frame adjusted image according to the first template image, wherein the current frame template image is an image cropped from the current frame adjusted image, centered on the optical center point of the current frame adjusted image, and in a preset ratio to the size of the first template image.
- The method according to claim 1, characterized in that the image features include gradient histogram features and color features, and training the position filter and the scale filter of the first template image with the image features comprises: training the position filter and the scale filter of the first template image with the gradient histogram features and the color features.
- The method according to claim 6, characterized in that predicting the position information of the first template image in the next frame image of the preset take-off and landing point according to the position filter and the scale filter comprises: acquiring the next frame image of the preset take-off and landing point, and adjusting the next frame image to obtain a next frame template image; calculating the maximum response position of the first template image in the next frame template image according to the position filter; calculating the maximum response scale of the first template image in the next frame template image according to the scale filter; and determining the position information of the first template image in the next frame image according to the maximum response position and the maximum response scale.
- The method according to claim 1, characterized in that controlling the aircraft to land according to the position information comprises: calculating the three-dimensional coordinates of the position information in the world coordinate system; and controlling the aircraft to land according to the three-dimensional coordinates.
- An aircraft, characterized in that the aircraft comprises: a memory and a processor; wherein the memory is configured to store a computer-executable landing control program, and the processor is configured to call the computer-executable landing control program to implement the landing control method according to any one of claims 1 to 8.
- A storage medium, characterized in that the storage medium stores a computer-executable landing control program, and a processor, when calling the landing control program, can execute the landing control method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/660,443 US12124274B2 (en) | 2019-10-25 | 2022-04-25 | Landing control method, aircraft and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911025804.6A CN110968107A (zh) | 2019-10-25 | 2019-10-25 | 一种降落控制方法、飞行器及存储介质 |
CN201911025804.6 | 2019-10-25 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/660,443 Continuation US12124274B2 (en) | 2019-10-25 | 2022-04-25 | Landing control method, aircraft and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021078264A1 true WO2021078264A1 (zh) | 2021-04-29 |
Family
ID=70029928
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/123311 WO2021078264A1 (zh) | 2019-10-25 | 2020-10-23 | 一种降落控制方法、飞行器及存储介质 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110968107A (zh) |
WO (1) | WO2021078264A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114020006A (zh) * | 2021-09-26 | 2022-02-08 | 佛山中科云图智能科技有限公司 | 无人机辅助降落方法、装置、存储介质以及电子设备 |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110968107A (zh) * | 2019-10-25 | 2020-04-07 | 深圳市道通智能航空技术有限公司 | 一种降落控制方法、飞行器及存储介质 |
CN112987764B (zh) * | 2021-02-01 | 2024-02-20 | 鹏城实验室 | 降落方法、装置、无人机以及计算机可读存储介质 |
CN113077516B (zh) * | 2021-04-28 | 2024-02-23 | 深圳市人工智能与机器人研究院 | 一种位姿确定方法及相关设备 |
CN113608542B (zh) * | 2021-08-12 | 2024-04-12 | 山东信通电子股份有限公司 | 一种无人机自动降落的控制方法以及设备 |
CN115329932A (zh) * | 2022-08-05 | 2022-11-11 | 中国民用航空飞行学院 | 基于数字孪生的飞机着陆姿态监视方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6157876A (en) * | 1999-10-12 | 2000-12-05 | Honeywell International Inc. | Method and apparatus for navigating an aircraft from an image of the runway |
CN1670479A (zh) * | 2004-03-15 | 2005-09-21 | 清华大学 | 基于视频图像测量飞行器飞行高度的方法 |
CN104006790A (zh) * | 2013-02-21 | 2014-08-27 | 成都海存艾匹科技有限公司 | 基于视觉的飞机降落辅助装置 |
CN106774423A (zh) * | 2017-02-28 | 2017-05-31 | 亿航智能设备(广州)有限公司 | 一种无人机的降落方法及系统 |
US10191496B2 (en) * | 2016-04-21 | 2019-01-29 | Foundation Of Soongsil University-Industry Cooperation | Unmanned aerial vehicle and a landing guidance method using the same |
CN110968107A (zh) * | 2019-10-25 | 2020-04-07 | 深圳市道通智能航空技术有限公司 | 一种降落控制方法、飞行器及存储介质 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090306840A1 (en) * | 2008-04-08 | 2009-12-10 | Blenkhorn Kevin P | Vision-based automated landing system for unmanned aerial vehicles |
CN104049641B (zh) * | 2014-05-29 | 2017-08-25 | 深圳市大疆创新科技有限公司 | 一种自动降落方法、装置及飞行器 |
CN109215074B (zh) * | 2017-06-29 | 2021-08-03 | 清华大学 | 基于分层码标的无人机降落方法、装置、设备以及可读存储介质 |
CN107644430A (zh) * | 2017-07-27 | 2018-01-30 | 孙战里 | 基于自适应特征融合的目标跟踪 |
CN107820585B (zh) * | 2017-09-06 | 2021-08-13 | 深圳市道通智能航空技术股份有限公司 | 飞行器降落方法、飞行器和计算机可读存储介质 |
CN108153334B (zh) * | 2017-12-01 | 2020-09-25 | 南京航空航天大学 | 无合作目标式无人直升机视觉自主返航与着降方法及系统 |
CN110001980B (zh) * | 2019-04-19 | 2021-11-26 | 深圳市道通智能航空技术股份有限公司 | 一种飞行器降落方法及装置 |
CN111627062B (zh) * | 2020-06-08 | 2021-02-05 | 星逻人工智能技术(上海)有限公司 | 飞行器停机状态控制方法、装置及装置使用方法 |
-
2019
- 2019-10-25 CN CN201911025804.6A patent/CN110968107A/zh active Pending
-
2020
- 2020-10-23 WO PCT/CN2020/123311 patent/WO2021078264A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN110968107A (zh) | 2020-04-07 |
US20220253075A1 (en) | 2022-08-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021078264A1 (zh) | 一种降落控制方法、飞行器及存储介质 | |
US11073389B2 (en) | Hover control | |
CN110001980B (zh) | 一种飞行器降落方法及装置 | |
JP2020030204A (ja) | 距離測定方法、プログラム、距離測定システム、および可動物体 | |
WO2020107372A1 (zh) | 拍摄设备的控制方法、装置、设备及存储介质 | |
CN110633629A (zh) | 基于人工智能的电网巡检方法、装置、设备及存储介质 | |
WO2018120350A1 (zh) | 对无人机进行定位的方法及装置 | |
CN111829532B (zh) | 一种飞行器重定位系统和重定位方法 | |
WO2019015158A1 (zh) | 无人机避障方法及无人机 | |
WO2020014987A1 (zh) | 移动机器人的控制方法、装置、设备及存储介质 | |
KR101959366B1 (ko) | 무인기와 무선단말기 간의 상호 인식 방법 | |
CN109782806B (zh) | 一种无人机室内路径跟踪方法及装置 | |
KR20160070375A (ko) | 무인 비행체의 군집 비행을 이용한 항공 영상 정합 장치 및 방법 | |
CN116866719B (zh) | 一种基于图像识别的高清视频内容智能分析处理方法 | |
Pareek et al. | Person identification using autonomous drone through resource constraint devices | |
KR20220068606A (ko) | 부분 영상을 고려한 드론의 자동 착륙 알고리즘 | |
CN115883969B (zh) | 一种无人机拍摄方法、装置、设备及介质 | |
WO2021077306A1 (zh) | 无人飞行器的返航控制方法、用户终端以及无人飞行器 | |
US12124274B2 (en) | Landing control method, aircraft and storage medium | |
CN110906922A (zh) | 无人机位姿信息的确定方法及装置、存储介质、终端 | |
US20160012290A1 (en) | Photo-Optic Comparative Geolocation System | |
CN111581322B (zh) | 视频中兴趣区域在地图窗口内显示的方法和装置及设备 | |
CN110650287A (zh) | 一种拍摄控制方法、装置、飞行器及飞行系统 | |
JP2021047737A (ja) | 情報処理装置、情報処理方法および情報処理プログラム | |
US20240135685A1 (en) | Information processing apparatus and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20879337 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20879337 Country of ref document: EP Kind code of ref document: A1 |