WO2017080102A1 - Flight device, flight control system and method - Google Patents

Flight device, flight control system and method

Info

Publication number
WO2017080102A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
scene
offset
flying device
template
Prior art date
Application number
PCT/CN2016/070162
Other languages
English (en)
French (fr)
Inventor
李佐广
Original Assignee
深圳市道通智能航空技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术有限公司 filed Critical 深圳市道通智能航空技术有限公司
Publication of WO2017080102A1
Priority to US15/630,613 (granted as US10515271B2)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/40 - Image enhancement or restoration using histogram techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 - Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/758 - Involving statistics of pixels or of feature values, e.g. histogram matching
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G06V20/13 - Satellite images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G06V20/17 - Terrestrial scenes taken from planes or by drones
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 - UAVs specially adapted for particular uses or applications
    • B64U2101/30 - UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10032 - Satellite or aerial image; Remote sensing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30181 - Earth observation

Definitions

  • The present invention relates to flight control, and more particularly to a flight control method, a flight control system, and a flying device controlled by them.
  • speed detection and/or positioning control of the drone is a key technology.
  • The speed detection and/or positioning control of the drone is mostly performed by GPS (global positioning system).
  • When the GPS signal is weak or absent, however, speed detection and/or positioning control cannot be performed on the drone.
  • Moreover, the current speed detection and/or positioning control of the UAV is mostly performed by algorithms designed for a common, generic scene.
  • Because such algorithms ignore the differences between scenes, they often cause inaccurate positioning.
  • the present invention provides a flight device and a flight control method, which can perform speed detection and positioning control on a flight device without relying on GPS.
  • the present invention provides the following technical solutions.
  • The present invention provides a flight control system for controlling a flight device, the flight control system including: an acquisition module for acquiring images captured by a camera module of the flight device and acquiring the height of the flight device sensed by a distance sensor of the flight device; a scene determining module for determining the scene in which the flying device is currently located; an image offset determining module for calculating, from two adjacent frames captured by the camera module and the current scene of the flying device, an image X offset and an image Y offset of the second frame image relative to the first frame image; and an offset calibration module for acquiring the acceleration and angular velocity of the flight device in the three-dimensional directions detected by an acceleration sensor of the flight device, and compensating the image X and Y offsets accordingly to obtain an image correction offset comprising the corrected image X offset and image Y offset.
  • The flight control system further includes a speed calculation module for calculating, from the lens focal length of the camera module, the height of the flying device, and the image correction offset, the X and Y offsets in world coordinates corresponding to the image correction offset, and obtaining the speed of the flying device from the time interval between the capture of the two adjacent frames and the world-coordinate X and Y offsets.
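As an illustrative sketch only (not part of the patent text), the speed computation just described, projecting the corrected image offset onto the ground plane through the lens focal length and the sensed height and dividing by the frame interval, might look as follows under a simple pinhole-camera model; the function names and the convention of a focal length expressed in pixels are assumptions:

```python
import numpy as np

def world_offset(px_offset, height_m, focal_px):
    """Project a pixel offset onto the ground plane (pinhole model).

    px_offset : image-plane offset in pixels (X or Y direction)
    height_m  : height above ground from the distance sensor, in metres
    focal_px  : lens focal length expressed in pixels (assumption)
    """
    return px_offset * height_m / focal_px

def ground_speed(dx_px, dy_px, height_m, focal_px, dt_s):
    """Speed over ground from the corrected image offsets of two frames
    captured dt_s seconds apart."""
    dx = world_offset(dx_px, height_m, focal_px)
    dy = world_offset(dy_px, height_m, focal_px)
    return np.hypot(dx, dy) / dt_s
```

For example, a corrected offset of 3 by 4 pixels at a height of 10 m with a 100-pixel focal length over 0.5 s gives a ground speed of 1 m/s.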
  • the flight control system further includes an operational control module for positioning and/or hovering control of the flying device based on at least the speed of the flying device.
  • The scene determination module determines the scene in which the flying device is located according to at least one parameter of the images captured by the camera module, and the image offset determining module analyzes the change of parameters between the two frames with an algorithm corresponding to that scene to calculate the image X, Y offset of the second frame image of the two adjacent images relative to the first frame image.
  • the at least one parameter includes a texture
  • The scene determining module processes the image with a Sobel gradient algorithm to obtain a gradient matrix and counts the pixels of the gradient matrix whose gradient exceeds a first predetermined threshold; when the count exceeds a second threshold the current scene is judged to be texture-rich, and when the count is less than or equal to the second threshold the current scene is judged to be texture-poor.
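A minimal NumPy sketch of this texture-richness test follows; the threshold values, the function names, and the hand-rolled Sobel convolution are illustrative assumptions, since the patent does not fix them:

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
SOBEL_Y = SOBEL_X.T

def sobel_gradient(img):
    """Gradient magnitude G = sqrt(fx^2 + fy^2) over interior pixels,
    computed with the 3x3 Sobel horizontal and vertical templates."""
    h, w = img.shape
    fx = sum(SOBEL_X[r, c] * img[r:r + h - 2, c:c + w - 2]
             for r in range(3) for c in range(3))
    fy = sum(SOBEL_Y[r, c] * img[r:r + h - 2, c:c + w - 2]
             for r in range(3) for c in range(3))
    return np.hypot(fx, fy)

def is_texture_rich(img, grad_thresh, count_thresh):
    """Count pixels whose gradient exceeds grad_thresh (the first
    threshold); the scene is texture-rich if that count exceeds
    count_thresh (the second threshold)."""
    g = sobel_gradient(img.astype(float))
    return int((g > grad_thresh).sum()) > count_thresh
```

A flat image yields no above-threshold gradients, while a strong vertical edge does, so the two cases classify as texture-poor and texture-rich respectively.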
  • When the current scene is texture-rich, the image offset determining module analyzes the change of parameters between the two frames with a grey template matching algorithm to calculate the image X, Y offset of the second frame image of the two adjacent images relative to the first frame image.
  • the at least one parameter further includes a reflection
  • The scene determination module performs shadow statistics on consecutive multi-frame images to determine whether reflection is present, thereby classifying the scene as a scene with reflection or a scene without reflection.
  • When the current scene is a scene with reflection, the image offset determination module removes the shadow from the image; after removal it determines whether the scene is texture-rich, and uses the grey template matching algorithm when it is.
  • the at least one parameter further includes grayscale
  • The scene determination module converts the image into a grey histogram according to its grey values and computes a statistical value, which it compares with corresponding thresholds to classify the scene as dark, normal brightness, or bright. When the scene in which the flying device is currently located is bright or dark, the image offset determining module performs histogram equalization on the image to even out its brightness and then determines whether the scene is texture-rich.
  • When the current scene is texture-rich, a grey template matching algorithm is used to analyze the change of parameters between the two frames and calculate the image X, Y offset of the second frame image of the two adjacent images relative to the first frame image; when the current scene is texture-poor, a Sobel gradient template matching algorithm is used instead.
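The brightness classification and histogram equalization above can be sketched as follows; the specific dark/bright thresholds and the use of the histogram mean as the "statistical value" are illustrative assumptions, since the patent does not specify them:

```python
import numpy as np

def classify_brightness(img, dark_thresh=60, bright_thresh=180):
    """Classify the scene as 'dark', 'normal', or 'bright' from the mean
    of its 256-bin grey histogram (thresholds are assumptions)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    mean = (hist * np.arange(256)).sum() / hist.sum()
    if mean < dark_thresh:
        return "dark"
    if mean > bright_thresh:
        return "bright"
    return "normal"

def equalize(img):
    """Histogram equalization: remap grey levels through the normalized
    cumulative histogram so brightness becomes more uniform."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min()) * 255
    return cdf[img.astype(np.uint8)].astype(np.uint8)
```

After equalization the grey levels spread toward the full 0-255 range, which is what makes the subsequent texture test usable in over- or under-exposed frames.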
  • the at least one parameter further includes a line
  • The scene determination module performs gradient detection on the image, binarizes the result, and then applies Hough line detection to determine whether a straight line is present; if at least one straight line is found whose length spans at least 1/2 of the image width, the scene is judged to be line-rich.
  • When the scene in which the flying device is currently located is line-rich, the image offset determination module separates the detected lines into horizontal lines and vertical lines. In the previous frame image it finds the line whose angle differs least from a horizontal line detected in the current image and computes the distance between the two lines to obtain the first distance; when the first distance falls outside [-R, R], where R is the permitted movement range, it is set to 0. It likewise finds the line in the previous frame with the smallest angle difference from a vertical line detected in the current image and computes the distance between the two to obtain the second distance.
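A toy version of the Hough-based line-richness test might look like this; using the peak vote count as a proxy for line length, the 1-degree angular resolution, and the helper names are simplifying assumptions not taken from the patent:

```python
import numpy as np

def hough_peak(binary, n_theta=180):
    """Minimal Hough transform over a binary edge image.

    Returns (votes, theta, rho) of the strongest detected line, where a
    line is parameterized as rho = x*cos(theta) + y*sin(theta)."""
    h, w = binary.shape
    diag = int(np.ceil(np.hypot(h, w)))      # largest possible |rho|
    thetas = np.deg2rad(np.arange(n_theta))  # 1-degree resolution
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    ys, xs = np.nonzero(binary)
    for x, y in zip(xs, ys):                 # each point votes once per theta
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return acc[r, t], thetas[t], r - diag

def is_line_rich(binary):
    """Treat the peak vote count (number of points on the strongest line)
    as the line length and require it to cover half the image width."""
    votes, _, _ = hough_peak(binary)
    return votes >= binary.shape[1] / 2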
  • When the current scene is texture-rich, the image offset determination module applies the grey template matching algorithm as follows: with the current image width and height denoted W and H, it takes a template image T of size Mx × My; the (i, j) at which the SAD between T and the search sub-image S(i, j) is minimal gives the image X, Y offset between the two adjacent frames, where SAD refers to accumulating the absolute grey-level differences.
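A sketch of this grey template SAD matching, assuming (as the related passages in this document suggest) a template cut at position [4, 4] of the current frame and a search range of i, j in [-4, 4]; the sign convention of the returned offset is an arbitrary illustrative choice:

```python
import numpy as np

def sad_match(prev_img, cur_img, mx, my, search=4):
    """Grey template matching: cut a template T of size mx x my from the
    current frame at [search, search] and slide it over the previous frame
    for shifts i, j in [-search, search]; the shift minimizing the SAD
    (sum of absolute grey-level differences) is the frame-to-frame offset."""
    t = cur_img[search:search + my, search:search + mx].astype(float)
    best, best_ij = None, (0, 0)
    for i in range(-search, search + 1):
        for j in range(-search, search + 1):
            s = prev_img[search + i:search + i + my,
                         search + j:search + j + mx].astype(float)
            sad = np.abs(t - s).sum()
            if best is None or sad < best:
                best, best_ij = sad, (j, i)  # reported as (x, y)
    return best_ij
```

Shifting a random image by a known amount and matching the two frames recovers that shift, since the SAD at the true alignment is exactly zero.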
  • When the current scene is texture-poor, the image offset determination module applies the Sobel gradient template matching algorithm to analyze the change of parameters between the two frames and calculate the X, Y offset of the second frame image relative to the first frame image, as follows: it performs edge detection with the Sobel operator, using a two-dimensional template that comprises a horizontal template for the horizontal difference operation and a vertical template for the vertical difference operation.
  • The image offset determination module convolves the image with the horizontal and vertical templates to obtain the horizontal convolution fx and the vertical convolution fy, and computes the gradient value G as the square root of the sum of the square of fx and the square of fy. It performs this gradient operation on the two adjacent images to obtain gradient matrices A and B, where A is the Sobel gradient matrix of the previous frame image and B is the Sobel gradient matrix of the current frame image, and takes the template map T of size Mx × My from the [4, 4] position of the current frame.
  • For each (i, j), the difference matrix C between T and S(i, j) is computed, and the absolute values of the elements of C that satisfy condition 1 and condition 2 are accumulated into SS(i, j), where condition 1 is A[r, c] > T and condition 2 is B[r, c] > T, T here denoting the gradient threshold.
  • The (i, j) corresponding to the smallest value of SS(i, j) is the image X, Y offset between the two adjacent frames.
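The Sobel gradient template matching can be sketched the same way, matching on the gradient matrices A and B and accumulating differences only where both gradients exceed the threshold; skipping shifts with no qualifying pixels is an added safeguard not spelled out in the patent, and all names are illustrative:

```python
import numpy as np

def sobel_grad(img):
    """3x3 Sobel gradient magnitude over interior pixels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = img.shape
    fx = sum(kx[r, c] * img[r:r + h - 2, c:c + w - 2]
             for r in range(3) for c in range(3))
    fy = sum(ky[r, c] * img[r:r + h - 2, c:c + w - 2]
             for r in range(3) for c in range(3))
    return np.hypot(fx, fy)

def sobel_template_match(prev_img, cur_img, mx, my, grad_thresh, search=4):
    """Template-match on Sobel gradient matrices instead of raw grey
    values; only pixels whose gradient exceeds grad_thresh in BOTH frames
    contribute to the accumulated difference SS(i, j)."""
    a = sobel_grad(prev_img.astype(float))  # gradient of previous frame (A)
    b = sobel_grad(cur_img.astype(float))   # gradient of current frame (B)
    t = b[search:search + my, search:search + mx]
    best, best_ij = None, (0, 0)
    for i in range(-search, search + 1):
        for j in range(-search, search + 1):
            s = a[search + i:search + i + my, search + j:search + j + mx]
            mask = (t > grad_thresh) & (s > grad_thresh)
            if not mask.any():
                continue  # no reliable edge pixels at this shift
            ss = np.abs(t - s)[mask].sum()
            if best is None or ss < best:
                best, best_ij = ss, (j, i)  # reported as (x, y)
    return best_ij
```

Because matching runs on gradient magnitudes, it still locks onto the few edges present in an otherwise low-texture frame, which is presumably why the patent reserves it for texture-poor scenes.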
  • The present invention provides a flight control method for controlling a flight device, the flight control method comprising: acquiring images captured by a camera module of the flight device and acquiring the height of the flight device sensed by a distance sensor of the flight device; determining the scene in which the flying device is currently located; calculating, from two adjacent frames captured by the camera module and the current scene, an image X offset and an image Y offset of the second frame image relative to the first frame image; acquiring the acceleration and angular velocity of the flying device in the three-dimensional directions detected by an acceleration sensor of the flying device, and compensating the image X and Y offsets accordingly to obtain an image correction offset comprising the corrected image X offset and image Y offset; calculating, from the lens focal length of the camera module, the height of the flying device, and the image correction offset, the X, Y offset in world coordinates corresponding to the image correction offset; and obtaining the speed of the flying device from that offset and the time interval between the two frames.
  • the method further includes the step of positioning and/or hovering the flight device based at least on the speed of the flying device.
  • the step of "determining the scene in which the flying device is currently located" includes determining a scene in which the flying device is located based on at least one parameter in an image captured by the camera module.
  • The step of calculating the image X, Y offset includes: selecting an algorithm corresponding to the scene in which the flying device is currently located; and calculating, by analyzing the change of parameters between the two frames according to that algorithm, the image X, Y offset of the second frame image of the two adjacent images relative to the first frame image.
  • the at least one parameter comprises a texture
  • The step of “determining a scene in which the flying device is located according to at least one parameter of an image captured by the camera module” includes: processing the image with a Sobel gradient algorithm to obtain a gradient matrix and counting the pixels of the gradient matrix whose gradient exceeds a first predetermined threshold; when the count exceeds a second threshold the current scene is texture-rich, and when the count is less than or equal to the second threshold the current scene is texture-poor.
  • The step of calculating the image X, Y offset then uses the grey template matching algorithm when the current scene is texture-rich and the Sobel gradient template matching algorithm when it is texture-poor, in each case analyzing the change of parameters between the two frames to calculate the image X, Y offset of the second frame image relative to the first frame image.
  • the at least one parameter further comprises a reflection
  • The step of determining the scene according to at least one parameter further includes: performing shadow statistics on consecutive multi-frame images to determine whether reflection is present, thereby classifying the scene as a scene with reflection or a scene without reflection.
  • The step of calculating the image X, Y offset then includes: when the current scene is a scene with reflection, removing the shadow from the image; after removal, determining whether the scene is texture-rich; and applying the grey template matching algorithm when it is texture-rich, or the Sobel gradient template matching algorithm when it is texture-poor, to calculate the image X, Y offset of the second frame image of the two adjacent images relative to the first frame image.
  • the at least one parameter further includes grayscale
  • The step of “determining a scene in which the flying device is located according to at least one parameter of an image captured by the camera module” further includes: converting the image into a grey histogram according to its grey values, computing a statistical value, and comparing the statistical value with corresponding thresholds to classify the scene as dark, normal brightness, or bright.
  • The step of calculating the image X, Y offset then includes: when the scene in which the flying device is currently located is bright or dark, performing histogram equalization to even out the image brightness, then determining whether the scene is texture-rich, and using the grey template matching algorithm when it is.
  • the at least one parameter further includes a line
  • The step of “determining a scene in which the flying device is located according to at least one parameter of an image captured by the camera module” further includes: performing gradient detection on the image, binarizing the result, and applying Hough line detection to determine whether a straight line is present; if at least one straight line spanning at least half the image width is found, the scene is judged line-rich.
  • The step of calculating the image X, Y offset then includes: when the scene is line-rich, separating the detected lines into horizontal and vertical lines; finding, in the previous frame image, the line with the smallest angle difference from a horizontal line detected in the current image and computing the distance between the two lines to obtain the first distance; finding the line with the smallest angle difference from a vertical line detected in the current image and computing the distance between the two to obtain the second distance; and setting either distance to 0 when it falls outside [-R, R].
  • The first distance and the second distance so obtained are the image X and Y offsets between the two adjacent frames.
  • In the grey template matching algorithm, the template map T is taken from the [4, 4] position of the current frame image; the template image is superimposed on the matching image, the search sub-image covered by the template being S(i, j), where (i, j) is the position of the sub-image's top-left corner in the matching image S, i and j take values in the range [-4, 4], and S(0, 0) corresponds to the [4, 4] position.
  • The step of calculating the image X, Y offset relative to the first frame image with the Sobel gradient template matching algorithm includes: performing edge detection with the Sobel operator using a two-dimensional template comprising a horizontal template for the horizontal difference operation and a vertical template for the vertical difference operation; convolving the image with the horizontal and vertical templates to obtain the horizontal convolution fx and the vertical convolution fy; computing the gradient value G as the square root of the sum of the square of fx and the square of fy; performing the gradient operation on the two adjacent images to obtain gradient matrices A and B, where A is the Sobel gradient matrix of the previous frame image and B is the Sobel gradient matrix of the current frame image; and taking the template map T of size Mx × My.
  • The present invention further provides a flying device including a camera module, a distance sensor, and an acceleration sensor, the camera module for capturing images, the distance sensor for sensing the height of the flying device, and the acceleration sensor for detecting the acceleration and angular velocity of the flying device in the three-dimensional directions. The flying device further includes a flight control system comprising: an acquisition module for acquiring the images captured by the camera module of the flying device and the height of the flying device sensed by the distance sensor; a scene determining module for determining the scene in which the flying device is currently located; an image offset determining module for calculating, from two adjacent frames captured by the camera module and the current scene, an image X offset and an image Y offset of the second frame image relative to the first frame image; an offset calibration module for acquiring the acceleration and angular velocity detected by the acceleration sensor in the three-dimensional directions and compensating the image X and Y offsets accordingly to obtain an image correction offset; and a speed calculation module for computing the corresponding world-coordinate X, Y offset and the speed of the flying device.
  • the flight control system further includes an operational control module for positioning and/or hovering control of the flying device based on at least the speed of the flying device.
  • The scene determination module determines the scene in which the flying device is located according to at least one parameter of the images captured by the camera module, and the image offset determining module analyzes the change of parameters between the two frames with an algorithm corresponding to that scene to calculate the image X, Y offset of the second frame image of the two adjacent images relative to the first frame image.
  • the at least one parameter includes a texture
  • The scene determining module processes the image with a Sobel gradient algorithm to obtain a gradient matrix and counts the pixels of the gradient matrix whose gradient exceeds a first predetermined threshold; when the count exceeds a second threshold the current scene is texture-rich, and when the count is less than or equal to the second threshold the current scene is texture-poor.
  • When the current scene is texture-rich, the image offset determination module uses the grey template matching algorithm to analyze the change of parameters between the two frames and calculate the image X and Y offsets of the second frame image of the two adjacent images relative to the first frame image.
  • the at least one parameter further includes a reflection
  • The scene determination module performs shadow statistics on consecutive multi-frame images to determine whether reflection is present, thereby classifying the scene as a scene with reflection or a scene without reflection.
  • When the current scene is a scene with reflection, the image offset determination module removes the shadow from the image; after removal it determines whether the scene is texture-rich and applies the matching algorithm accordingly.
  • the at least one parameter further includes grayscale
  • The scene determination module converts the image into a grey histogram according to its grey values and computes a statistical value, which it compares with corresponding thresholds to classify the scene as dark, normal brightness, or bright. When the scene in which the flying device is currently located is bright or dark, the image offset determining module performs histogram equalization on the image to even out its brightness and then determines whether the scene is texture-rich.
  • When the current scene is texture-rich, the grey template matching algorithm is used to analyze the change of parameters between the two frames; when it is texture-poor, the Sobel gradient template matching algorithm is used. In either case the image X, Y offset of the second frame image of the two adjacent images relative to the first frame image is calculated from the change of parameters.
  • the at least one parameter further includes a line
  • The scene determination module performs gradient detection on the image, binarizes the result, and applies Hough line detection to determine whether a straight line is present; if at least one straight line spanning at least 1/2 of the image width is found, the scene is judged line-rich. When the scene in which the flying device is currently located is line-rich, the image offset determination module separates the detected lines into horizontal lines and vertical lines.
  • In the previous frame image it finds the line with the smallest angle difference from a horizontal line detected in the current image and computes the distance between the two lines to obtain the first distance; when the first distance falls outside [-R, R], where R is the permitted movement range, it is set to 0.
  • It likewise finds the line in the previous frame image with the smallest angle difference from a vertical line detected in the current image and computes the distance between the two to obtain the second distance; when the second distance falls outside [-R, R], it is set to 0.
  • The first distance and the second distance obtained by this line positioning are the image X and Y offsets between the two adjacent frames.
  • When the current scene is texture-rich, the image offset determination module applies the grey template matching algorithm: with the current image width and height denoted W and H, it takes a template image T of size Mx × My, and the (i, j) at which the SAD between T and the search sub-image S(i, j) is minimal gives the image X, Y offset between the two adjacent frames, where SAD refers to accumulating the absolute grey-level differences.
  • When the current scene is texture-poor, the image offset determining module applies the Sobel gradient template matching algorithm to calculate the image X, Y offset of the second frame image relative to the first frame image: it performs edge detection with the Sobel operator using a two-dimensional template comprising a horizontal template for the horizontal difference operation and a vertical template for the vertical difference operation, convolves the image with the horizontal and vertical templates to obtain the horizontal convolution fx and the vertical convolution fy, and computes the gradient value G as the square root of the sum of the square of fx and the square of fy.
  • The invention has the beneficial effects that speed detection and positioning control can still be performed when the GPS signal is weak or absent, and that precise control can be performed for different scenes.
  • FIG. 1 is a schematic diagram of a hardware architecture of a flying device according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of a flight control system in accordance with an embodiment of the present invention.
  • FIG. 3 is a schematic explanatory diagram showing the X and Y offsets in world coordinates in an embodiment of the present invention.
  • FIG. 4 is a schematic diagram showing the relationship between the X and Y offsets of the world coordinates and the image correction offset amount according to an embodiment of the present invention.
  • FIG. 5 is a flowchart of a flight control method according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram of a hardware architecture of a flying device 100 in an embodiment.
  • the flying device 100 includes a processor 10, a camera module 20, an acceleration sensor 30, and a distance sensor 40.
  • the camera module 20 is configured to acquire images every predetermined time interval, for example, to capture images every 1 second.
  • the camera module 20 can be a monocular or binocular camera, and the acquired image is a low resolution image.
  • The acceleration sensor 30 is configured to detect the acceleration and angular velocity of the flying device 100 in the three dimensions (X, Y, Z).
  • the distance sensor 40 is used to detect the distance of the flying device 100 from the ground, that is, the height of the flying device 100.
  • the acceleration sensor 30 can be a gyroscope, and the distance sensor 40 can be an ultrasonic sensor.
  • the processor 10 operates a flight control system S1.
  • the flight control system S1 includes an acquisition module 11, a scene determination module 12, an image offset determination module 13, an offset calibration module 14, a speed calculation module 15, and an operation control module 16.
  • the flight control system S1 is used to detect the speed of the flying device 100 and perform positioning control.
  • The modules of the flight control system S1 can be program instructions executed by the processor 10.
  • the modules of the flight control system S1 can also be firmware that is solidified in the processor 10.
  • the flight control system S1 can be an application software installed in the flight device 100.
  • the acquisition module 11 is configured to acquire an image acquired by the camera module 20 and a height of the flying device 100 sensed by the distance sensor 40. In the present embodiment, the acquisition module 11 acquires the image acquired by the camera module 20 and the height of the flying device 100 sensed by the distance sensor 40 in real time.
  • the scene determination module 12 is configured to determine the scene in which the flying device 100 is currently located.
  • the scene determining module 12 determines the scene in which the flying device 100 is located according to the characteristics of the image, and the specific process thereof will be described later.
  • The image offset determining module 13 is configured to calculate the image X offset and image Y offset of the second frame relative to the first frame of two adjacent images, according to the two adjacent images acquired by the acquisition module 11 from the camera module 20 and the scene in which the flying device 100 is currently located as determined by the scene determining module 12.
  • Specifically, the image offset determining module 13 selects the algorithm corresponding to the scene in which the flying device 100 is located and analyzes the change of parameters between the two frames to calculate the offset of the second frame of the two adjacent images.
  • For example, the image offset determining module 13 may determine the same feature points in the two adjacent images and calculate the X, Y offsets of those feature points between the two frames to obtain the image X, Y offset.
  • A same feature point is the imaging point of the same object in the two adjacent images, and the image X, Y offset is the offset of that same object in the X direction and the Y direction between the two frames.
  • the offset calibration module 14 is configured to acquire the acceleration and angular velocity of the flying device 100 detected by the acceleration sensor 30 in a three-dimensional direction, and compensate the offset of the image X and Y according to the acceleration and angular velocity of the flying device 100. An image correction offset is obtained.
  • The speed calculation module 15 is configured to calculate, from the lens focal length, the height of the flying device 100, and the image correction offset, the X, Y offset in world coordinates corresponding to the image correction offset, that is, the actual X, Y offset in the real world.
  • X and Y of the present invention refer to the horizontal axis direction and the vertical axis direction on a plane parallel to the ground in the three-dimensional coordinate system, respectively.
  • the X, Y offset of the world coordinates is the moving distance of the flying device 100 / camera module 20 relative to the ground in the X and Y directions.
  • Please refer to FIG. 3, a schematic diagram of the X and Y offsets in world coordinates.
  • When the camera module 20 of the flying device 100 acquires two images, the actual object A moves relative to the camera.
  • The camera module 20 includes a lens 21 and an image sensor 22, and the actual object A is imaged on the image sensor 22 through the lens 21 of the camera module 20 of the flying device 100.
  • For example, the actual object A may be offset toward the lower left as shown in FIG. 3.
  • The image correction offset has a definite correspondence with the X and Y offsets in world coordinates, so the X and Y offsets in world coordinates can be obtained from the image correction offset.
  • The camera module 20 can be a camera, a video camera, a webcam, or the like.
  • The image correction offset is the distance moved in the X direction and the Y direction by the image point A1 of the object A on the image sensor 22 between the two adjacent images P1, P2.
  • Please refer to FIG. 4, which shows the relationship between the X, Y offsets in world coordinates and the image correction offset.
  • Let the lens focal length be f, the height of the flying device 100 be H, the corrected image X offset in the image correction offset be x1, the corrected image Y offset be y1, the world-coordinate X offset be X1, and the world-coordinate Y offset be Y1.
  • By similar triangles, the ratio of the corrected image X (or Y) offset to the world-coordinate X (or Y) offset equals the ratio of the lens focal length f to the height H, i.e. x1/X1 = f/H and y1/Y1 = f/H.
  • The speed calculation module 15 determines the speed of the flying device based on the time interval t1 between the two adjacent image acquisitions and the X, Y offsets in world coordinates.
  • The X, Y offset in world coordinates is the distance the flying device moved in the X and Y directions during the time interval between the two adjacent image acquisitions.
  • Given that the camera module 20 collects the two adjacent frames at a time interval t1, the speed calculation module 15 calculates from t1 and the X, Y offsets in world coordinates that the rate of the flying device in the X direction is X1/t1 and the rate in the Y direction is Y1/t1.
  • Alternatively, the speed calculation module 15 first obtains the vector sum of the X, Y offsets in world coordinates to derive the actual displacement D1 of the flying device 100, and then calculates the actual speed of the flying device 100 as D1/t1.
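The similar-triangles relationship and the rate computation above can be sketched as follows; the numeric values in the usage note are illustrative assumptions, not values from the patent.

```python
import math

def world_offset(img_offset, focal_len, height):
    # x1 / X1 = f / H  =>  X1 = x1 * H / f (similar triangles of the pinhole model)
    return img_offset * height / focal_len

def speed(world_dx, world_dy, dt):
    # per-axis rates, plus the overall speed from the vector sum of the offsets
    vx, vy = world_dx / dt, world_dy / dt
    d1 = math.hypot(world_dx, world_dy)  # actual displacement D1
    return vx, vy, d1 / dt               # X rate X1/t1, Y rate Y1/t1, speed D1/t1
```

For example, with f = 4 mm, H = 10 m, and a corrected image offset of 1 mm, the world-coordinate offset comes out to 2.5 m.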
  • The operation control module 16 is for positioning and/or hovering control of the flying device 100 based at least on the speed of the flying device 100. For example, the operation control module 16 calculates the time required from the speed of the flying device 100 and the distance between the flying device 100 and the destination, and prepares for hovering or landing when the required time is less than a predetermined value. In an embodiment, when the operation control module 16 determines that the currently calculated speed is substantially equal in magnitude but opposite in direction to the speed calculated at the previous instant, it determines that the flight speed of the flying device 100 is close to 0 and the moving distance is very small, for example 1 cm, and controls the flying device 100 to hover at a fixed position.
  • Thereby, the speed of the flying device 100 can be calculated from the captured images, and positioning control can be performed.
  • the flying device 100 is an unmanned aerial vehicle.
  • the scene determining module 12 determines the scene in which the flying device 100 is currently located according to the image collected by the camera module 20 . In other embodiments, the scene determining module 12 may also determine that the scene selected by the user is the scene in which the flying device 100 is currently located according to the scene selected by the user.
  • the scene determining module 12 determines the type of the scene according to at least one parameter in the image collected by the camera module 20.
  • The at least one parameter includes texture.
  • The scene determining module 12 processes the image with a Sobel gradient algorithm to obtain a gradient matrix, counts the number C of pixels in the gradient matrix whose gradient exceeds a threshold T1, and considers the texture rich when C is greater than a threshold T2; otherwise the texture is considered sparse.
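A minimal sketch of this texture test, assuming the standard 3 x 3 Sobel kernels and pure-Python images (lists of rows); the threshold values used in the usage below are illustrative.

```python
import math

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal difference template
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical difference template

def sobel_gradients(img):
    # gradient magnitude G = sqrt(fx^2 + fy^2) at every interior pixel
    h, w = len(img), len(img[0])
    grads = []
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            fx = sum(SOBEL_X[i][j] * img[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            fy = sum(SOBEL_Y[i][j] * img[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            grads.append(math.sqrt(fx * fx + fy * fy))
    return grads

def is_texture_rich(img, t1, t2):
    # count pixels whose gradient exceeds t1; the scene is rich if the count exceeds t2
    return sum(1 for g in sobel_gradients(img) if g > t1) > t2
```

A flat image yields zero gradients everywhere and is classified as texture-sparse, while a striped image produces large gradients at every stripe edge and is classified as texture-rich.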
  • The at least one parameter further includes reflection.
  • the scene determining module 12 performs shadow statistics according to the continuous multi-frame image to determine whether there is a reflection, thereby determining that the scene is a scene with reflection or a scene without reflection.
  • For example, the scene determining module 12 determines whether there are alternating dark and bright regions in the consecutive frames, or performs reflection detection of the drone itself, for example by matching against the shape of the drone; if such a pattern is detected, the scene is determined to be a scene with reflection.
  • In an embodiment, the scene determining module performs the determination according to a reflection detection algorithm.
  • The reflection of the drone appears in the image as a region of high brightness in the grayscale image, so the scene determining module 12 determines whether each pixel of the gray image is greater than or equal to a threshold T, where T is set according to empirical values and may be set to 220.
  • The scene determining module 12 sets pixels with gray value greater than or equal to T to 255 and pixels with gray value smaller than T to 0, converting the image into a binary image in which 0 is the background and 255 is the foreground.
  • The connected regions are then extracted and represented by circumscribed rectangles; the scene determining module 12 judges the target size, and a target within the expected size range is considered a reflection target.
  • The target size range is obtained from measured values of reflections at different heights.
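The thresholding step can be sketched as below; the connected-region extraction and circumscribed-rectangle check are simplified here to a foreground pixel count, and the size bounds `min_px`/`max_px` are hypothetical stand-ins for the measured target size range.

```python
def binarize(gray, t=220):
    # pixels >= t become foreground (255), the rest background (0)
    return [[255 if p >= t else 0 for p in row] for row in gray]

def has_reflection_target(gray, t=220, min_px=4, max_px=64):
    # simplified size check: a plausible reflection yields a small bright blob
    n = sum(1 for row in binarize(gray, t) for p in row if p == 255)
    return min_px <= n <= max_px
```

A 3 x 3 bright blob in an otherwise dark image passes the size check, while a uniformly dark image does not.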
  • the at least one parameter further includes grayscale
  • The scene determining module 12 converts the image into a grayscale histogram according to its gray values, computes statistics, and compares them with corresponding thresholds to detect whether the scene is dark, of normal brightness, or bright.
  • For example, the scene determining module 12 applies thresholds to the average brightness L of the grayscale histogram: when L < 80 the scene is dark, when L > 170 the scene is bright, and when 80 <= L <= 170 the scene is of normal brightness.
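The brightness classification can be sketched directly from the stated thresholds (80 and 170 on the average gray level):

```python
def classify_brightness(gray):
    # average gray level L over the whole image
    pixels = [p for row in gray for p in row]
    l = sum(pixels) / len(pixels)
    if l < 80:
        return "dark"
    if l > 170:
        return "bright"
    return "normal"
```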
  • The at least one parameter may further include lines.
  • The scene determining module 12 performs gradient detection on the image, binarizes it, and then applies the conventional Hough line detection to determine whether straight lines are present; if at least one straight line is found whose length is at least 1/2 of the image width, the scene is determined to be a line-rich scene.
  • In this embodiment, the scene determining module 12 determines the type of the current scene according to one parameter and its corresponding algorithm. In other embodiments, the scene determination module may determine the scene according to several parameters and their corresponding algorithms. For example, the scene determination module 12 can determine from both texture and lines whether the current scene is simultaneously texture-rich and line-rich.
  • The image offset determining module 13 calculates the image X, Y offset of the second frame relative to the first frame of the two adjacent images according to the two adjacent images and the scene in which the flying device 100 is currently located.
  • Specifically, the image offset determining module 13 selects the algorithm corresponding to the scene in which the flying device 100 is currently located, and analyzes the two adjacent images with that algorithm to obtain the image X, Y offset of the second frame relative to the first frame.
  • When the scene is texture-rich, the image offset determination module 13 adopts a grayscale template matching algorithm.
  • Let the current image width and height be W and H, respectively.
  • A template image T of size Mx × My, with Mx = W - 8 and My = H - 8, is taken from position [4, 4] of the current frame image.
  • A matching image S of size Nx × Ny, with Nx = W and Ny = H, is taken from the previous frame image.
  • During matching, the template image is slid over the matching image; the search sub-image of the matching image covered by the template is S(i, j), where i, j is the position of the upper-left pixel of the sub-image in the matching image S, i and j take values in the range [-4, 4], and S(0, 0) corresponds to position [4, 4] of the matching image.
  • The correlation function SAD is computed to find the search sub-image most similar to the template image together with its coordinate position (i, j).
  • The (i, j) minimizing the SAD of T and S(i, j) is the best matching position, that is, the relative offset or movement of the two adjacent frames in the X and Y directions: the image X, Y offset, which lies in the range [-4, 4].
  • SAD refers to accumulating the absolute values of the pixel differences at each corresponding position of the two images; the smaller the SAD value, the better the two images match, and the minimum is taken as the best match.
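A runnable sketch of the SAD search described above, with the search radius of 4 and the template taken at position [4, 4] of the current frame; the synthetic images used in the test are illustrative assumptions.

```python
def sad(a, b):
    # sum of absolute differences over corresponding pixels
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def crop(img, top, left, h, w):
    return [row[left:left + w] for row in img[top:top + h]]

def match_offset(prev, cur, r=4):
    # template T of size (H-2r) x (W-2r) from position [r, r] of the current frame,
    # searched against sub-images S(i, j) of the previous frame, i, j in [-r, r]
    h, w = len(cur), len(cur[0])
    my, mx = h - 2 * r, w - 2 * r
    tmpl = crop(cur, r, r, my, mx)
    best = None
    for i in range(-r, r + 1):
        for j in range(-r, r + 1):
            score = sad(tmpl, crop(prev, r + i, r + j, my, mx))
            if best is None or score < best[0]:
                best = (score, i, j)
    return best[1], best[2]  # (i, j) with the smallest SAD
```

Shifting a synthetic frame by (2, 1) pixels and matching it against the original recovers the offset (2, 1).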
  • When the scene has little texture, the image offset determination module 13 employs a Sobel gradient template matching algorithm. Specifically, the image offset determination module 13 performs edge detection using the Sobel operator, which uses two-dimensional templates for the calculation: a horizontal template for the horizontal difference operation and a vertical template for the vertical difference operation.
  • The image offset determining module 13 performs a plane convolution with the horizontal and vertical templates to compute the horizontal convolution fx and the vertical convolution fy, respectively, and obtains the gradient value G as the square root of the sum of the squares of fx and fy. Gradient operations are then performed on the two adjacent images to obtain gradient matrices A and B, where A is the Sobel gradient matrix of the previous frame image and B is the Sobel gradient matrix of the current frame image.
  • A template image T of size Mx × My, with Mx = W - 8 and My = H - 8, is taken from position [4, 4] of B, and a matching image S of size Nx × Ny, with Nx = W and Ny = H, is taken from A.
  • During matching, the template image is slid over the matching image; the search sub-image of the matching image covered by the template is S(i, j), where i, j is the position of the upper-left pixel of the sub-image in the matching image S, i and j take values in the range [-4, 4], and S(0, 0) corresponds to position [4, 4] of A.
  • A difference operation is performed on the gradient matrices T and S(i, j) to obtain the difference matrix C.
  • The absolute values of the elements of C satisfying both condition 1 and condition 2 below are accumulated to obtain the sum SS(i, j):
  • Condition 1: A[r, c] > T, where A[r, c] is the gradient value at position (r, c) of the gradient matrix A;
  • Condition 2: B[r, c] > T, where B[r, c] is the gradient value at position (r, c) of the gradient matrix B;
  • with r >= 0 and r < My, c >= 0 and c < Mx, and T here the gradient threshold.
  • The (i, j) position corresponding to the smallest SS(i, j) is the best matching position, that is, the relative offset or movement of the two adjacent frames in the X and Y directions: the image X, Y offset, which lies in the range [-4, 4].
  • When the scene in which the flying device 100 is currently located is particularly bright or particularly dark, the image offset determining module 13 performs histogram equalization on the image to even out its brightness, then determines whether the scene is texture-rich and, as described above, processes the image according to whether the texture is rich or sparse.
  • When the scene in which the flying device 100 is currently located is a scene with reflection, the image offset determining module 13 removes the shadow in the image; after removing the shadow, it determines whether the scene is texture-rich and, as described above, selects the corresponding algorithm according to whether the texture is rich or sparse.
  • When the scene in which the flying device 100 is currently located is a line-rich scene, the image offset determination module 13 divides the detected straight lines into horizontal lines and vertical lines. In the previous frame image it finds the line Lph whose angle differs least from the horizontal line Lch detected in the current image, and computes the distance between the two lines to obtain j; when j is not in [-R, R] it is set to 0, where R is the configured moving range, and R is generally 4. Likewise, in the previous frame image it finds the line whose angle differs least from the vertical line Lcv detected in the current image, and computes the distance between the two lines to obtain i; when i is not in [-R, R] it is set to 0.
  • The (i, j) obtained by this line positioning is the relative offset or movement of the two adjacent frames in the X and Y directions, that is, the image X, Y offset of the two adjacent frames, which lies in the range [-4, 4].
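The clamping of the line-positioning distances can be sketched as follows; `horiz_dist` and `vert_dist` are hypothetical inputs standing for the distances to the best-matching horizontal and vertical lines, which would come from the Hough detection step.

```python
def clamp_offset(dist, r=4):
    # keep the distance only when it lies inside the moving range [-r, r]
    return dist if -r <= dist <= r else 0

def line_offset(horiz_dist, vert_dist, r=4):
    # j comes from the horizontal-line pair, i from the vertical-line pair
    j = clamp_offset(horiz_dist, r)
    i = clamp_offset(vert_dist, r)
    return i, j
```

Distances inside [-4, 4] pass through unchanged; anything outside the moving range is treated as no measurable offset and set to 0.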
  • the speed of the flying device 100 can be determined and the positioning control can be performed when there is no GPS signal, and precise control can be further performed based on different scenarios.
  • FIG. 5 is a flowchart of a flight control method according to an embodiment of the present invention.
  • the flight control method is used to detect the speed of the flying device 100 and perform positioning control.
  • The acquisition module 11 obtains the image acquired by the camera module 20 and the height of the flying device 100 sensed by the distance sensor 40 (501).
  • the scene determination module 12 determines the scene in which the flight device 100 is currently located (503).
  • The image offset determining module 13 calculates the image X, Y offset of the second frame relative to the first frame according to the two adjacent images captured by the camera module 20 and the scene in which the flying device 100 is currently located (505). The image offset determining module 13 determines the same feature points in the two adjacent frames, and calculates the X and Y offsets of those feature points between the two frames to obtain the image X, Y offset.
  • Specifically, the scene determining module 12 determines the type of the scene according to at least one parameter of the image collected by the camera module 20, and the image offset determining module 13 selects the algorithm corresponding to the current scene and analyzes the two adjacent frames with that algorithm to obtain the image X, Y offset of the second frame relative to the first frame.
  • The offset calibration module 14 acquires the acceleration and angular velocity of the flying device 100 in three dimensions detected by the acceleration sensor 30, and compensates the image X, Y offset according to them to obtain the image correction offset (507).
  • The speed calculation module 15 calculates the X, Y offset in world coordinates corresponding to the image correction offset from the lens focal length, the height of the flying device 100, and the image correction offset, and determines the speed of the flying device from the time interval between the two adjacent image acquisitions and the X, Y offset in world coordinates (509). Specifically, the speed calculation module 15 calculates the rate of the flying device in the X direction and the rate in the Y direction from the time interval t1 and the X, Y offsets in world coordinates; that is, it derives the speeds of the flying device 100 in the X and Y directions by dividing the X and Y offsets by the time interval.
  • The operation control module 16 positions and/or hovers the flying device 100 based at least on the speed of the flying device 100 (511).


Abstract

The present invention relates to a flight control method, comprising: acquiring the image captured by a camera module of a flying device and the height of the flying device sensed by a distance sensor of the flying device; determining the scene in which the flying device is currently located; calculating the image X, Y offset of two adjacent frames according to the two captured adjacent frames and the scene in which the flying device is located; acquiring the acceleration and angular velocity of the flying device detected by an acceleration sensor of the flying device, and compensating the image X, Y offset according to the acceleration and angular velocity to obtain an image correction offset; and calculating the X, Y offset in world coordinates from the lens focal length of the camera module, the height, and the image correction offset, and deriving the speed of the flying device from the time interval between the two adjacent image acquisitions and the X, Y offset in world coordinates. The present invention also provides a flight control system and a flying device, which can perform speed detection and positioning control of the flying device when there is no GPS signal.

Description

Flying Device, Flight Control System and Method

Technical Field

The present invention relates to a method, and more particularly to a flight control method, a flight control system, and a flying device for controlling a flying device.

Background

At present, flying devices such as unmanned aerial vehicles (UAVs), owing to their convenience and safety, are widely used in agricultural production, geological survey, meteorological monitoring, power line inspection, rescue and disaster relief assistance, video shooting, map construction, and other fields. In the control of a UAV, speed detection and/or positioning control of the UAV is a key technology. At present, speed detection and/or positioning control of UAVs is mostly performed by GPS (global positioning system) positioning; however, when the UAV is in an area where the GPS signal is weak or not covered, speed detection and/or positioning control of the UAV cannot be performed. In addition, speed detection and/or positioning control of UAVs is currently mostly based on an algorithm for one generic scene; when the scene in which the UAV is actually located differs significantly from that generic scene, such an algorithm often fails to position accurately.

Summary

In view of this, the present invention provides a flying device and a flight control method capable of speed detection and positioning control of the flying device without relying on GPS.

To solve the above technical problem, the present invention provides the following technical solutions.
In one aspect, the present invention provides a flight control system for controlling a flying device, the flight control system comprising: an acquisition module for acquiring the image captured by a camera module of the flying device and the height of the flying device sensed by a distance sensor of the flying device; a scene determination module for determining the scene in which the flying device is currently located; an image offset determination module for calculating, according to two adjacent frames captured by the camera module and the scene in which the flying device is currently located, the image X offset and image Y offset of the second frame relative to the first frame; an offset calibration module for acquiring the acceleration and angular velocity of the flying device in three dimensions detected by an acceleration sensor of the flying device, and compensating the image X, Y offset according to them to obtain an image correction offset comprising a corrected image X offset and a corrected image Y offset; and a speed calculation module for calculating, from the lens focal length of the camera module, the height of the flying device, and the image correction offset, the X, Y offset in world coordinates corresponding to the image correction offset, and deriving the speed of the flying device from the time interval between the two adjacent image acquisitions and the X, Y offset in world coordinates.

In some embodiments, the flight control system further comprises an operation control module for positioning and/or hovering control of the flying device based at least on the speed of the flying device.

In some embodiments, the scene determination module determines the scene in which the flying device is located according to at least one parameter of the image captured by the camera module, and the image offset determination module analyzes the change of parameters between the two frames according to the algorithm corresponding to that scene to calculate the image X, Y offset of the second frame relative to the first frame.

In some embodiments, the at least one parameter includes texture. The scene determination module processes the image with a Sobel gradient algorithm to obtain a gradient matrix, counts the number of pixels in the gradient matrix whose gradient exceeds a first predetermined threshold, and considers the current scene texture-rich when the counted number exceeds a second threshold, and texture-sparse when the counted number is less than or equal to the second threshold. The image offset determination module uses a grayscale template matching algorithm to analyze the change of parameters between the two frames and calculate the image X, Y offset of the second frame relative to the first frame when the current scene is texture-rich, and a Sobel gradient template matching algorithm when the current scene is texture-sparse.

In some embodiments, the at least one parameter further includes reflection. The scene determination module performs shadow statistics over consecutive frames to judge whether there is a reflection, thereby determining the scene to be a scene with or without reflection. When the current scene is a scene with reflection, the image offset determination module removes the shadow in the image, then determines whether the scene is texture-rich, and uses a grayscale template matching algorithm to calculate the image X, Y offset of the second frame relative to the first frame when the scene is texture-rich, or a Sobel gradient template matching algorithm when the scene is texture-sparse.

In some embodiments, the at least one parameter further includes grayscale. The scene determination module converts the image into a grayscale histogram according to its gray values, computes statistics, and compares them with corresponding thresholds to detect whether the scene is dark, of normal brightness, or bright. When the current scene is particularly bright or dark, the image offset determination module performs histogram equalization to even out the image brightness, then determines whether the scene is texture-rich, and uses a grayscale template matching algorithm to calculate the image X, Y offset of the second frame relative to the first frame when the scene is texture-rich, or a Sobel gradient template matching algorithm when the scene is texture-sparse.

In some embodiments, the at least one parameter further includes lines. The scene determination module performs gradient detection on the image, binarizes it, and then applies Hough line detection to judge whether straight lines are present; if at least one straight line is found whose length is at least 1/2 of the image width, the scene is determined to be a line-rich scene. When the current scene is line-rich, the image offset determination module divides the detected lines into horizontal and vertical ones; it finds, in the previous frame, the line whose angle differs least from the horizontal line detected in the current image and computes the distance between the two lines to obtain a first distance, which is set to 0 when outside [-R, R], where R is the configured moving range; it likewise finds, in the previous frame, the line whose angle differs least from the vertical line detected in the current image and computes the distance between the two lines to obtain a second distance, which is set to 0 when outside [-R, R]. The first and second distances obtained by this line positioning are the image X, Y offset of the two adjacent frames.

In some embodiments, when the current scene is texture-rich, the image offset determination module calculates the image X, Y offset of the second frame relative to the first frame with a grayscale template matching algorithm as follows: with the current image width and height denoted W and H, a template image T of size Mx × My, with Mx = W - 8 and My = H - 8, is taken from position [4, 4] of the current frame image; a matching image S of size Nx × Ny, with Nx = W and Ny = H, is taken from the previous frame image; during matching, the image offset determination module slides the template image over the matching image, the search sub-image of the matching image covered by the template being S(i, j), where i, j is the position of the upper-left pixel of the sub-image in the matching image S, i and j take values in [-4, 4], and S(0, 0) corresponds to position [4, 4] of the matching image; the module computes the correlation function SAD to find the search sub-image most similar to the template image and its coordinate position (i, j), and the (i, j) minimizing the SAD of T and S(i, j) is the image X, Y offset of the two adjacent frames, where SAD refers to accumulating the absolute values of the pixel differences at each corresponding position of the two images.

In some embodiments, when the current scene is texture-sparse, the image offset determination module calculates the image X, Y offset of the second frame relative to the first frame with a Sobel gradient template matching algorithm as follows: the module performs edge detection with the Sobel operator, using two-dimensional templates for the calculation, namely a horizontal template for the horizontal difference operation and a vertical template for the vertical difference operation; the module performs a plane convolution with the horizontal and vertical templates to compute the horizontal convolution fx and the vertical convolution fy, and obtains the gradient value G as the square root of the sum of the squares of fx and fy; gradient operations are then performed on the two adjacent images to obtain gradient matrices A and B, where A is the Sobel gradient matrix of the previous frame image and B is the Sobel gradient matrix of the current frame image; the module takes a template image T of size Mx × My, with Mx = W - 8 and My = H - 8, from position [4, 4] of B, and a matching image S of size Nx × Ny, with Nx = W and Ny = H, from A; during matching, the template image is slid over the matching image, the search sub-image covered by the template being S(i, j), where i, j is the position of the upper-left pixel of the sub-image in the matching image S, i and j take values in [-4, 4], and S(0, 0) corresponds to position [4, 4] of A; a difference operation on the gradient matrices T and S(i, j) yields the difference matrix C, and the absolute values of the elements of C satisfying condition 1 and condition 2 below are accumulated to obtain the sum SS(i, j), where condition 1 is A[r, c] > T and condition 2 is B[r, c] > T, A[r, c] and B[r, c] being the gradient values at position (r, c) of A and B respectively, with r >= 0 and r < My, c >= 0 and c < Mx, and T the gradient threshold; the (i, j) position corresponding to the smallest SS(i, j) is the image X, Y offset of the two adjacent frames.

In some embodiments, the speed calculation module calculates the X offset in world coordinates according to formula 1: x1/X1 = f/H, and the Y offset in world coordinates according to formula 2: y1/Y1 = f/H, where x1 is the corrected image X offset, y1 is the corrected image Y offset, f is the lens focal length, H is the height of the flying device, X1 is the X offset in world coordinates, and Y1 is the Y offset in world coordinates. The speed calculation module then calculates the rate of the flying device in the X direction and the rate in the Y direction from the time interval at which the camera module captured the two adjacent frames and the X, Y offsets in world coordinates.
In another aspect, the present invention provides a flight control method for controlling a flying device, the flight control method comprising: acquiring the image captured by a camera module of the flying device and the height of the flying device sensed by a distance sensor of the flying device; determining the scene in which the flying device is currently located; calculating, according to two adjacent frames captured by the camera module and the scene in which the flying device is currently located, the image X offset and image Y offset of the second frame relative to the first frame; acquiring the acceleration and angular velocity of the flying device in three dimensions detected by an acceleration sensor of the flying device, and compensating the image X offset and image Y offset according to them to obtain an image correction offset comprising a corrected image X offset and a corrected image Y offset; and calculating, from the lens focal length of the camera module, the height of the flying device, and the image correction offset, the X, Y offset in world coordinates corresponding to the image correction offset, and deriving the speed of the flying device from the time interval between the two adjacent image acquisitions and the X, Y offset in world coordinates.

In some embodiments, the method further comprises the step of positioning and/or hovering control of the flying device based at least on the speed of the flying device.

In some embodiments, the step of determining the scene in which the flying device is currently located comprises: determining the scene according to at least one parameter of the image captured by the camera module.

In some embodiments, the step of calculating the image X, Y offset of the second frame relative to the first frame according to the two adjacent frames captured by the camera module and the scene in which the flying device is currently located comprises: selecting the algorithm corresponding to the current scene; and analyzing the change of parameters between the two frames according to that algorithm to calculate the image X, Y offset of the second frame relative to the first frame.

In some embodiments, the at least one parameter includes texture, and the step of determining the scene according to at least one parameter comprises: processing the image with a Sobel gradient algorithm to obtain a gradient matrix, counting the number of pixels in the gradient matrix whose gradient exceeds a first predetermined threshold, and considering the current scene texture-rich when the counted number exceeds a second threshold, and texture-sparse when the counted number is less than or equal to the second threshold. The steps of selecting and applying the corresponding algorithm comprise: using a grayscale template matching algorithm to analyze the change of parameters between the two frames and calculate the image X, Y offset of the second frame relative to the first frame when the current scene is texture-rich, and a Sobel gradient template matching algorithm when the current scene is texture-sparse.

In some embodiments, the at least one parameter further includes reflection, and the step of determining the scene according to at least one parameter comprises: performing shadow statistics over consecutive frames to judge whether there is a reflection, thereby determining the scene to be a scene with or without reflection. The steps of selecting and applying the corresponding algorithm comprise: when the current scene is a scene with reflection, removing the shadow in the image, then determining whether the scene is texture-rich, and using a grayscale template matching algorithm to calculate the image X, Y offset of the second frame relative to the first frame when the scene is texture-rich, or a Sobel gradient template matching algorithm when the scene is texture-sparse.

In some embodiments, the at least one parameter further includes grayscale, and the step of determining the scene according to at least one parameter comprises: converting the image into a grayscale histogram according to its gray values, computing statistics, and comparing them with corresponding thresholds to detect whether the scene is dark, of normal brightness, or bright. The steps of selecting and applying the corresponding algorithm comprise: when the current scene is particularly bright or dark, performing histogram equalization to even out the image brightness, then determining whether the scene is texture-rich, and using a grayscale template matching algorithm to calculate the image X, Y offset of the second frame relative to the first frame when the scene is texture-rich, or a Sobel gradient template matching algorithm when the scene is texture-sparse.

In some embodiments, the at least one parameter further includes lines, and the step of determining the scene according to at least one parameter comprises: performing gradient detection on the image, binarizing it, and then applying Hough line detection to judge whether straight lines are present; if at least one straight line is found whose length is at least 1/2 of the image width, the scene is determined to be a line-rich scene. The steps of selecting and applying the corresponding algorithm comprise: when the current scene is line-rich, dividing the detected lines into horizontal and vertical ones; finding, in the previous frame, the line whose angle differs least from the horizontal line detected in the current image and computing the distance between the two lines to obtain a first distance, which is set to 0 when outside [-R, R], where R is the configured moving range; likewise finding, in the previous frame, the line whose angle differs least from the vertical line detected in the current image and computing the distance between the two lines to obtain a second distance, which is set to 0 when outside [-R, R]. The first and second distances obtained by this line positioning are the image X, Y offset of the two adjacent frames.

In some embodiments, the step of calculating the image X, Y offset with a grayscale template matching algorithm when the current scene is texture-rich comprises: with the current image width and height denoted W and H, taking a template image T of size Mx × My, with Mx = W - 8 and My = H - 8, from position [4, 4] of the current frame image; taking a matching image S of size Nx × Ny, with Nx = W and Ny = H, from the previous frame image; during matching, sliding the template image over the matching image, the search sub-image covered by the template being S(i, j), where i, j is the position of the upper-left pixel of the sub-image in the matching image S, i and j taking values in [-4, 4], and S(0, 0) corresponding to position [4, 4] of the matching image; and computing the correlation function SAD to find the search sub-image most similar to the template image and its coordinate position (i, j), the (i, j) minimizing the SAD of T and S(i, j) being the image X, Y offset of the two adjacent frames, where SAD refers to accumulating the absolute values of the pixel differences at each corresponding position of the two images.

In some embodiments, the step of calculating the image X, Y offset with a Sobel gradient template matching algorithm when the current scene is texture-sparse comprises: performing edge detection with the Sobel operator, using two-dimensional templates for the calculation, namely a horizontal template for the horizontal difference operation and a vertical template for the vertical difference operation; performing a plane convolution with the horizontal and vertical templates to compute the horizontal convolution fx and the vertical convolution fy, and obtaining the gradient value G as the square root of the sum of the squares of fx and fy; performing gradient operations on the two adjacent images to obtain gradient matrices A and B, where A is the Sobel gradient matrix of the previous frame image and B is the Sobel gradient matrix of the current frame image; taking a template image T of size Mx × My, with Mx = W - 8 and My = H - 8, from position [4, 4] of B, and a matching image S of size Nx × Ny, with Nx = W and Ny = H, from A; during matching, sliding the template image over the matching image, the search sub-image covered by the template being S(i, j), where i, j is the position of the upper-left pixel of the sub-image in the matching image S, i and j taking values in [-4, 4], and S(0, 0) corresponding to position [4, 4] of A; performing a difference operation on the gradient matrices T and S(i, j) to obtain the difference matrix C; and accumulating the absolute values of the elements of C satisfying condition 1, A[r, c] > T, and condition 2, B[r, c] > T, to obtain the sum SS(i, j), where A[r, c] and B[r, c] are the gradient values at position (r, c) of A and B respectively, r >= 0 and r < My, c >= 0 and c < Mx, and T is the gradient threshold; the (i, j) position corresponding to the smallest SS(i, j) being the image X, Y offset of the two adjacent frames.
另一方面,本发明提供一种飞行装置,包括相机模组、距离传感器以及加速度传感器,所述相机模组用于采集的图像,所述距离传感器用于以及获取飞行装置的高度,所述加速度传感器用于侦测到的飞行装置在三维方向上的加速度及角速度,所述飞行装置还包括一种飞行控制系统,该飞行控制系统包括获取模块,用于获取飞行装置的相机模组采集的图像以及获取飞行装置的距离传感器感测到的飞行装置的高度;场景确定模块,用于确定飞行装置当前所处的场景;图像偏移量确定模块,用于根据相机模组采集的两帧相邻图像以及飞行装置当前所处的场景,计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X偏移量及图像Y偏移量;偏移量校准模块,用于获取飞行装置的加速度传感器侦测到的飞行装置在三维方向上的加速度及角速度,并根据该飞行装置的加速度及角速度对该图像X、Y偏移量进行补偿而得到包括校正后的图像X偏移量和图像Y偏移量的图像校正偏移量;以及速度计算模块,用于通过相机模组的镜头焦距、飞行装置的高度以及该图像校正偏移量计算该图像校正偏移量对应的世界坐标的X、Y偏移量,并根据该两帧相邻图像采集的时间间隔以及该世界坐标的X、Y偏移量求出该飞行装置的速度。
在一些实施例中,所述飞行控制系统还包括一运行控制模块,用于至少基于该飞行装置的速度对飞行装置进行定位及/或悬停控制。
在一些实施例中,所述场景确定模块根据相机模组采集的图像中的至少一个参数确定飞行装置所处的场景,所述图像偏移量确定模块根据飞行装置所处场景对应的算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
在一些实施例中,所述至少一个参数包括纹理,所述场景确定模块采用sobel梯度算法对图像进行处理,得到梯度矩阵,统计梯度矩阵中梯度大于第一预定阈值的像素的个数,并判断所统计像素的个数大于第二阈值时,当前场 景为纹理丰富的场景,以及在判断所统计的像素的个数小于或等于第二值认为纹理比较少,当前场景为纹理较少的场景;所述图像偏移量确定模块并在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量,以及在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
在一些实施例中,所述至少一个参数还包括倒影,所述场景确定模块根据连续多帧图像进行阴影统计,判断是否有倒影,从而确定该场景为有倒影的场景或无倒影的场景,所述图像偏移量确定模块在飞行装置当前所处的场景为有倒影场景时,去除图像中的阴影,并去除阴影后,判断是否为纹理丰富的场景,并在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量,以及在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
在一些实施例中,所述至少一个参数还包括灰度,所述场景确定模块根据图像的灰度值将图像转换为灰度直方图并进行统计得到统计值,并将统计值与相应的阈值进行比较,检测出场景是否属于偏暗、普通亮度、或者偏亮的场景,所述图像偏移量确定模块在飞行装置当前所处的场景为光照偏亮或者偏暗的场景时,对图像进行直方图均衡化,将图像亮度调为均匀,然后判断是否为纹理丰富的场景,并在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量,以及在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
在一些实施例中,所述至少一个参数还包括线条,所述场景确定模块对图像做梯度检测,进行二值化处理,然后使用hough直线检测,判断是否有直线,如果判断有至少一条直线且直线长度占图像宽度至少1/2的,则判断所述场景为线条丰富场景;所述图像偏移量确定模块在飞行装置当前所处的场景为线条丰富的场景时,划分直线为水平方向的直线和垂直方向的直线,在上一帧图像 中找一条和当前图像检测到的水平方向的直线角度差最小的直线,计算两条角度差最小的直线的距离得到第一距离,当第一距离不在[-R,R]时设置为0,其中R是设定的移动范围,在上一帧图像中找一条和当前图像检测到的垂直方向的直线的角度差最小的直线,计算所述两条角度差最小的直线的距离得到第二距离,当第二距离不在[-R,R]时设置为0,直线定位得到的第一距离、第二距离就是相邻两帧图像的图像X、Y偏移量。
在一些实施例中,所述图像偏移量确定模块在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量包括:所述图像偏移量确定模块设当前图像宽高分别为W和H,设定一模板图T大小为Mx×My,Mx=W-8,My=H-8,模板图T从当前帧图像[4,4]的位置获得;设定匹配图S大小为Nx×Ny,其中Nx=W,Ny=H,匹配图S从上一帧图像获得,所述图像偏移量确定模块在匹配时将模板图叠放在匹配图上平移,模板图覆盖下的那块基准图中的搜索子图为S(i,j),i,j为这块子图的左上角像点在匹配图S中的位置,i和j在[-4,4]范围内取值,S(0,0)对应A的[4,4]位置,所述图像偏移量确定模块并在匹配时通过计算相关函数SAD来找到与模板图尽可能相似的搜索子图以及它的坐标位置i和j,T和S(i,j)的SAD最小的值为相邻两帧图像的图像X、Y偏移量,其中,SAD是指将两幅图像每个位置对应像素差值的绝对值进行累加求和的过程。
在一些实施例中，所述图像偏移量确定模块在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量包括：所述图像偏移量确定模块通过Sobel算子进行边缘检测，并使用二维模板进行计算，二维模板包括用于水平方向差分运算的水平模板和用于垂直方向差分运算的垂直模板；所述图像偏移量确定模块使用水平模板和垂直模板进行平面卷积运算，分别计算水平方向卷积fx和垂直方向卷积fy，求其梯度值G为fx的平方与fy的平方和的平方根，然后，分别对相邻两幅图像做梯度运算得到梯度值矩阵A和B，其中A为上帧图像的Sobel梯度矩阵，B为当前帧图像的Sobel梯度矩阵；所述图像偏移量确定模块并设定模板图T大小为Mx×My，Mx=W-8，My=H-8，模板图T从B的[4,4]位置获得；所述图像偏移量确定模块并设定匹配图S大小为Nx×Ny，其中Nx=W，Ny=H，匹配图S从A获得；所述图像偏移量确定模块在匹配时将模板图叠放在匹配图上平移，模板图覆盖下的那块基准图中的搜索子图为S(i,j)，i,j为这块子图的左上角像点在匹配图S中的位置，i和j在[-4,4]范围内取值，S(0,0)对应A的[4,4]位置，对梯度值矩阵T和S(i,j)做差运算，得到差值矩阵C，将C矩阵中符合下面条件1和条件2的每个元素绝对值进行累加，得到和SS(i,j)，其中，条件1为A[r,c]>T、条件2为B[r,c]>T；其中，A[r,c]是梯度值矩阵A中r,c位置的梯度值，B[r,c]是梯度值矩阵B中r、c位置的梯度值，r>=0且r<My，c>=0且c<Mx，T是梯度阈值，其中SS(i,j)最小的值对应的(i,j)位置为相邻两帧图像的图像X、Y偏移量。
在一些实施例中，所述速度计算模块根据公式1：x1/X1=f/H计算所述世界坐标的X偏移量，以及根据公式2：y1/Y1=f/H计算所述世界坐标的Y偏移量，其中，x1为校正后的图像X偏移量，y1为校正后的图像Y偏移量，f为镜头焦距，H为飞行装置的高度，X1为世界坐标的X偏移量，Y1为世界坐标的Y偏移量；所述速度计算模块并根据该相机模组采集该两帧相邻图像的时间间隔以及该世界坐标的X、Y偏移量计算得出飞行装置在X方向上的速率以及在Y方向上的速率。
本发明的有益效果在于可在GPS信号较弱或没有GPS信号时仍然能够进行速度侦测及定位控制,并可基于不同的场景进行精确的控制。
附图说明
图1为本发明一实施方式中的飞行装置的硬件架构示意图。
图2为本发明一实施方式中的飞行控制系统的模块图。
图3为本发明一实施方式中的世界坐标的X、Y偏移量的说明示意图。
图4为本发明一实施方式中的世界坐标的X、Y偏移量与图像校正偏移量的关系示意图。
图5为本发明一实施方式中的飞行控制方法的流程图。
附图标记:
飞行装置                                   100
处理器                                     10
相机模组                                   20
加速度传感器                               30
距离传感器                                 40
飞行控制系统                               S1
获取模块                                   11
场景确定模块                               12
图像偏移量确定模块                         13
偏移量校准模块                             14
速度计算模块                               15
运行控制模块                               16
镜头                                       21
图像传感器                                 22
图像                                       P1、P2
物体                                       A
成像点                                     A1
镜头焦距                                   f
高度                                       H
校正后的图像X偏移量                        x1
校正后的图像Y偏移量                        y1
世界坐标的X偏移量                          X1
世界坐标的Y偏移量                          Y1
步骤                                       501-511
具体实施方式
为了使本发明的目的、技术方案及优点更加清楚明白,以下结合附图及实施例,对本发明进行进一步详细说明。应当理解,此处所描述的具体实施例仅用以解释本发明,并不用于限定本发明。
请参阅图1,为一实施方式中的飞行装置100的硬件架构示意图。该飞行装置100包括处理器10、相机模组20、加速度传感器30及距离传感器40。
该相机模组20用于每间隔预定时间采集图像，例如每间隔1秒采集一次图像。该相机模组20可为一单目或双目相机，采集的图像为低分辨率图像。该加速度传感器30用于侦测飞行装置100在三维方向(X、Y、Z)上的加速度及角速度。该距离传感器40用于侦测飞行装置100与地面的距离，即飞行装置100的高度。其中，在一实施方式中，该加速度传感器30可为陀螺仪，该距离传感器40可为超声波传感器。
请一并参阅图2,该处理器10运行有一飞行控制系统S1。如图2所示,该飞行控制系统S1包括获取模块11、场景确定模块12、图像偏移量确定模块13、偏移量校准模块14、速度计算模块15及运行控制模块16。所述飞行控制系统S1用于侦测飞行装置100的速度并进行定位控制。其中,该飞行控制系统S1的各个模块可为程序化的指令模块,并可被处理器10调用执行,该飞行控制系统S1的各个模块也可为固化于处理器10中的固件(firmware)。在一应用中,该飞行控制系统S1可为一安装于飞行装置100中的应用软件。
该获取模块11用于获取相机模组20采集的图像以及距离传感器40感测到的飞行装置100的高度。在本实施方式中,该获取模块11实时获取相机模组20采集的图像以及距离传感器40感测到的飞行装置100的高度。
场景确定模块12用于确定飞行装置100当前所处的场景。其中,场景确定模块12根据图像的特征确定飞行装置100所处的场景,其具体的过程将在后面介绍。
该图像偏移量确定模块13用于根据获取模块11获取的相机模组20采集的两帧相邻图像以及场景确定模块12确定的飞行装置100当前所处的场景，计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X偏移量及图像Y偏移量。在一实施方式中，图像偏移量确定模块13根据飞行装置100所处场景对应的算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。在其他实施方式中，该图像偏移量确定模块13可确定该两帧相邻图像中的相同特征点，并计算该相同特征点在该两帧相邻图像中的X、Y偏移量而得到该图像X、Y偏移量。其中，该相同特征点为同一物体在该两帧相邻图像中的成像点，该图像X、Y偏移量为该同一物体在该两帧图像中成像点的X方向上与Y方向上的偏移量。
偏移量校准模块14用于获取加速度传感器30侦测到的飞行装置100在三维方向上的加速度及角速度,并根据该飞行装置100的加速度及角速度对该图像X、Y偏移量进行补偿而得到一图像校正偏移量。
速度计算模块15用于通过镜头焦距、飞行装置100的高度以及该图像校正偏移量计算该图像校正偏移量对应的世界坐标的X、Y偏移量，即对应到真实世界的实际X、Y偏移量。其中，本发明的X、Y分别指三维坐标系统中与地面平行的平面上的横轴方向和纵轴方向。世界坐标的X、Y偏移量为飞行装置100/相机模组20相对于地面在X方向和Y方向上的移动距离。
请一并参阅图3,为世界坐标的X、Y偏移量的说明示意图。其中,由于飞行装置100在拍摄两帧相邻图像P1、P2的时间间隔内会移动,因此,对于一实际的物体A而言,飞行装置100的相机模组20在采集图像时,相对该实际物体A会发生相对运动。如图3所示,相机模组20包括镜头21及图像传感器22。当该飞行装置100在拍摄两帧相邻图像P1、P2的时间间隔内向右上方移动一定距离时,导致该实际物体A通过飞行装置100的相机模组20的镜头21在图像传感器22上成像后,该实际物体A在该两帧相邻图像P1、P2中的成像点A1会发生如图3中所示的朝左下方的偏移,而形成图像X偏移量和图像Y偏移量,而该图像X偏移量和图像Y偏移量经过补偿校正后即为图像校正偏移量。因此,该图像校正偏移量与世界坐标的X、Y偏移量呈一定的对应关系,可根据该图像校正偏移量等得到世界坐标的X、Y偏移量。其中,相机模组20可为照相机、摄像机、摄像头等。该图像校正偏移量为物体A在该两帧相邻图像P1、P2中的成像点A1对应在图像传感器22上的X方向和Y方向的实际距离。
请一并参阅图4,为一世界坐标的X、Y偏移量与图像校正偏移量的关系示意图。设镜头焦距为f,飞行装置100的高度为H,图像校正偏移量中校正后的图像X偏移量为x1,校正后的图像Y偏移量为y1,世界坐标的X偏移量为X1,Y偏移量为Y1。显然,当相机模组20向下拍摄图像时,物距即为该高度H。如图4可见,世界坐标的X或Y偏移量与校正后的图像X偏移量或校正后的图像Y偏移量之比等于镜头焦距为f与高度H之比。即,世界坐标的X、Y偏移量与图像校正偏移量的关系分别满足公式1:x1/X1=f/H以及公式2:y1/Y1=f/H。由于焦距f、高度H以及图像校正偏移量中的校正后的X偏移量x1及Y偏移量y1均为已知,因此,速度计算模块15可根据镜头焦距、飞行装置100的高度以及该图像校正偏移量,通过上述的公式1和公式2分别求出世界坐标的X、Y偏移量X1、Y1。
该速度计算模块15并根据该两帧相邻图像采集的时间间隔t1以及该世界坐标的X、Y偏移量求出该飞行装置的速度。如前所述，该世界坐标的X、Y偏移量为该飞行装置在该两帧相邻图像获取的时间间隔内在X和Y方向上的移动距离，设该相机模组20采集该两帧相邻图像的时间间隔为t1，该速度计算模块15根据该时间间隔t1以及该世界坐标的X、Y偏移量计算得出飞行装置在X方向上的速率为X1/t1以及在Y方向上的速率为Y1/t1。在一实施方式中，该速度计算模块15先求出世界坐标的X、Y偏移量的矢量和得出该飞行装置100的实际位移D1，然后计算飞行装置100的实际速度为D1/t1。
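上述由公式1、公式2反求世界坐标偏移量并计算速度的过程，可以用如下Python草图示意（示意性实现，并非本发明的官方代码；示例假设校正后的图像偏移量x1、y1与焦距f采用同一物理单位，如米）：

```python
import math

def world_offsets(x1, y1, f, h):
    """根据公式1 x1/X1=f/H 与公式2 y1/Y1=f/H，
    由校正后的图像X、Y偏移量反求世界坐标的X、Y偏移量。"""
    return x1 * h / f, y1 * h / f

def flight_speed(x1, y1, f, h, t1):
    """先求世界坐标X、Y偏移量，再按时间间隔t1求各方向速率；
    实际速度为矢量和位移D1除以t1（与正文一致）。"""
    wx, wy = world_offsets(x1, y1, f, h)
    vx, vy = wx / t1, wy / t1          # X、Y方向速率 X1/t1、Y1/t1
    return vx, vy, math.hypot(wx, wy) / t1  # 实际速度 D1/t1
```

例如焦距4mm、高度10m、两帧间隔1s时，3mm、4mm的图像偏移对应7.5m/s与10m/s的分量速率。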
该运行控制模块16用于至少基于该飞行装置100的速度对飞行装置100进行定位及/或悬停控制。例如，该运行控制模块16根据该飞行装置100的速度以及飞行装置100与目的地之间的距离计算所需的时间，并在所需时间小于一预定值时进行悬停或降落前的准备。在一实施方式中，该运行控制模块16判断当前计算的速度大小与上一时刻计算的速度大小基本相等但方向相反时，确定飞行装置100的飞行速度接近为0、运动距离非常小（如1cm的运动距离），则控制飞行装置100在某一位置悬停。
因此,本发明中,在无GPS信号的情况下,通过拍摄的图片也能够计算出飞行装置100的速度,并进行定位控制。
该飞行装置100为无人飞行器。
其中,在本实施方式中,该场景确定模块12根据相机模组20采集的图像确定飞行装置100当前所处的场景。在其他实施方式中,该场景确定模块12也可根据用户选择的场景确定该用户选择的场景为飞行装置100当前所处的场景。
具体的,该场景确定模块12根据相机模组20采集的图像中的至少一个参数确定所述场景的类型。
其中，该至少一个参数包括纹理，所述场景确定模块12采用sobel梯度算法对图像进行处理，得到梯度矩阵，统计梯度矩阵中梯度大于阈值T1的像素的个数C，并在判断值C大于阈值T2时认为纹理比较丰富，否则认为纹理比较少。
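下面给出纹理丰富程度判断的一个示意性Python草图（假设实现，并非本发明的官方代码；阈值T1、T2在此取示例值）：按上文所述，先用Sobel水平、垂直模板做平面卷积求梯度值矩阵，再统计梯度大于阈值T1的像素个数C并与阈值T2比较：

```python
import numpy as np

# Sobel水平、垂直模板（与正文所述模板一致）
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
KY = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=np.float64)

def sobel_gradient(img):
    """对灰度图像做Sobel卷积，返回梯度值矩阵G = sqrt(fx^2 + fy^2)（边界置0）。"""
    img = img.astype(np.float64)
    h, w = img.shape
    fx = np.zeros((h, w))
    fy = np.zeros((h, w))
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            patch = img[r - 1:r + 2, c - 1:c + 2]
            fx[r, c] = np.sum(patch * KX)   # 水平方向卷积fx
            fy[r, c] = np.sum(patch * KY)   # 垂直方向卷积fy
    return np.sqrt(fx ** 2 + fy ** 2)

def is_texture_rich(img, t1=100.0, t2=50):
    """梯度大于阈值T1的像素个数C超过阈值T2时认为纹理丰富（T1、T2为假设的示例值）。"""
    count = int(np.sum(sobel_gradient(img) > t1))
    return count > t2
```

实际系统中T1、T2需按相机分辨率与飞行高度标定。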
该至少一个参数还包括倒影，所述场景确定模块12根据连续多帧图像进行阴影统计，判断是否有倒影，从而确定该场景为有倒影的场景或无倒影的场景。具体的，当所述场景确定模块12判断所述连续多帧图像中存在偏暗和偏亮的情况，或者进行无人机灯光倒影检测，根据无人机灯的形状进行检测，判断有符合无人机灯的形状的目标时，确定该场景为有倒影的场景。具体的，所述场景确定模块根据倒影检测算法进行判断。一般，对于容易反光的地面，无人机的圆形灯光会在图像中呈现亮度偏亮的灰度图像，因此场景确定模块12对灰度图像每个像素进行判断是否大于或等于阈值T，其中T根据经验值设定，可以设置为220；场景确定模块12并将大于等于T的灰度值像素设置为255，将小于T的灰度值像素设置为0，以此将图像转换为二值图像，其中0为背景，255为前景，然后进行连通区域提取，采用外接矩形表示；场景确定模块12并进行目标大小判断，符合目标大小范围的认为是灯光倒影目标。其中，目标大小范围是根据不同高度下倒影的测量值得到。
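下面是上述灯光倒影目标检测思路的一个示意性Python草图（假设实现，并非本发明的官方代码：阈值t取正文的经验值220，目标大小范围min_area、max_area为假设的示例值；连通区域此处用4连通洪水填充按面积提取，未另画外接矩形）：

```python
import numpy as np

def detect_light_reflection(gray, t=220, min_area=4, max_area=100):
    """灰度>=t置为前景、<t置为背景得到二值图像，
    提取4连通区域，面积落在目标大小范围内的计为灯光倒影目标，返回目标个数。"""
    fg = gray >= t
    h, w = fg.shape
    visited = np.zeros_like(fg, dtype=bool)
    targets = 0
    for r in range(h):
        for c in range(w):
            if fg[r, c] and not visited[r, c]:
                # 洪水填充统计该连通区域的面积
                stack, area = [(r, c)], 0
                visited[r, c] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and fg[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if min_area <= area <= max_area:
                    targets += 1
    return targets
```

正文指出目标大小范围应由不同高度下倒影的实测值确定，此处面积范围仅作演示。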
该至少一个参数还包括灰度，所述场景确定模块12根据图像的灰度值将图像转换为灰度直方图并进行统计，并与相应的阈值进行比较，检测出场景是否属于偏暗、普通亮度、或者偏亮。具体的，所述场景确定模块12根据灰度直方图计算平均亮度L（灰度级），当判断L<80时属于偏暗场景，L>170时属于偏亮场景，80<L<170时属于普通亮度场景。
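上述灰度分类可以用如下Python草图示意（示意性实现，并非本发明的官方代码；阈值80/170取自正文，平均亮度按灰度级0~255理解，直方图统计用numpy实现）：

```python
import numpy as np

def classify_brightness(gray, dark=80, bright=170):
    """将图像转换为灰度直方图，按平均亮度L与阈值比较，
    返回 'dark'（偏暗）、'bright'（偏亮）或 'normal'（普通亮度）。"""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    l = float(np.sum(hist * np.arange(256))) / gray.size  # 平均亮度L
    if l < dark:
        return "dark"
    if l > bright:
        return "bright"
    return "normal"
```

偏亮或偏暗场景随后可按正文所述做直方图均衡化，再进入纹理判断。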
该至少一个参数还可包括线条，所述场景确定模块12对图像做梯度检测，进行二值化处理，然后使用传统的hough直线检测，判断是否有直线，如果判断有至少一条直线且直线长度占图像宽度至少1/2的，则判断所述场景为线条丰富场景。
在一实施方式中,该场景确定模块12单独根据上述的一个参数以及相应的算法确定当前场景的类型。在其他实施方式中,该场景确定模块也可同时根据上述的多个参数以及多个相应的算法确定所述场景。例如,该场景确定模块12可同时根据纹理、线条判断当前的场景是否为纹理丰富且有线条的场景。
其中,该图像偏移量确定模块13根据两帧相邻图像以及飞行装置100当前所处的场景计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像水平偏移量包括:该图像偏移量确定模块13根据飞行装置100当前所处的场景选择一对应的算法,并根据该算法对该两帧相邻图像进行分析处理而得出第二帧图像相对于第一帧图像的图像水平偏移量。
具体的，对于纹理丰富的场景，该图像偏移量确定模块13采用灰度模板匹配算法。具体包括：设当前图像宽高分别为W和H，设一模板图T大小为Mx×My，Mx=W-8，My=H-8，模板图T从当前帧图像[4,4]的位置获得；匹配图S大小为Nx×Ny，其中Nx=W，Ny=H，匹配图S从上一帧图像获得。匹配时模板图叠放在匹配图上平移，模板图覆盖下的那块基准图中的搜索子图为S(i,j)，i,j为这块子图的左上角像点在匹配图S中的位置，i和j在[-4,4]范围内取值。S(0,0)对应上一帧图像的[4,4]位置。匹配时通过计算相关函数SAD来找到与模板图尽可能相似的搜索子图以及它的坐标位置i和j，T和S(i,j)的SAD最小的值为最佳匹配位置，也就是相邻两帧图像的X、Y方向相对偏移量或者移动量，即相邻两帧图像的图像X、Y偏移量，其中，偏移量在[-4,4]范围。SAD是指将两幅图像每个位置对应像素差值的绝对值进行累加求和的过程。SAD值越小，两个图像的匹配度越高，可以作为最佳匹配。
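上述灰度模板匹配过程可以用如下Python草图示意（示意性实现，并非本发明的官方代码；按正文约定模板取自当前帧[4,4]处、搜索范围为[-4,4]）：

```python
import numpy as np

def sad_match_offset(prev, curr, search=4):
    """灰度模板匹配：模板图T取自当前帧[4,4]处、大小(W-8)x(H-8)，
    匹配图S为上一帧整幅；在[-search,search]内平移，SAD最小处即偏移量(i, j)。"""
    h, w = curr.shape
    t = curr[search:h - search, search:w - search].astype(np.int64)
    best, best_off = None, (0, 0)
    for j in range(-search, search + 1):      # Y方向平移
        for i in range(-search, search + 1):  # X方向平移
            s = prev[search + j:h - search + j,
                     search + i:w - search + i].astype(np.int64)
            sad = int(np.abs(t - s).sum())    # 对应像素差绝对值累加求和
            if best is None or sad < best:
                best, best_off = sad, (i, j)
    return best_off
```

注意此处返回的(i,j)是模板在上一帧中的最佳匹配位置：图像内容向右下方移动时(i,j)为负值，实际系统可按自身的符号约定取反。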
对于纹理比较少的场景，该图像偏移量确定模块13采用sobel梯度模板匹配算法。具体的，该图像偏移量确定模块13通过Sobel算子进行边缘检测，使用二维模板进行计算，二维模板包括水平模板和垂直模板。如下所示，水平模板用于水平方向差分运算，垂直模板用于垂直方向差分运算。
水平模板:
-1 0 1
-2 0 2
-1 0 1
垂直模板:
-1 -2 -1
0 0 0
1 2 1
该图像偏移量确定模块13使用上述模板进行平面卷积运算，分别计算水平方向卷积fx和垂直方向卷积fy，求其梯度值G为fx的平方与fy的平方和的平方根。然后，分别对相邻两幅图像做梯度运算得到梯度值矩阵A和B。其中A为上帧图像的Sobel梯度矩阵，B为当前帧图像的Sobel梯度矩阵。设模板图T大小为Mx×My，Mx=W-8，My=H-8，模板图T从B的[4,4]位置获得；匹配图S大小为Nx×Ny，其中Nx=W，Ny=H，匹配图S从A获得。匹配时模板图叠放在匹配图上平移，模板图覆盖下的那块基准图中的搜索子图为S(i,j)，i,j为这块子图的左上角像点在匹配图S中的位置，i和j在[-4,4]范围内取值。S(0,0)对应A的[4,4]位置。对梯度值矩阵T和S(i,j)做差运算，得到差值矩阵C。将C矩阵中符合下面条件(如下条件1和条件2)的每个元素绝对值进行累加，得到和SS(i,j)。
A[r,c]>T(条件1)
B[r,c]>T(条件2)
其中A[r,c]是梯度值矩阵A中r,c位置的梯度值，B[r,c]是梯度值矩阵B中r、c位置的梯度值，r>=0且r<My，c>=0且c<Mx，T是梯度阈值。其中SS(i,j)最小的值对应的(i,j)位置为最佳匹配位置，也就是相邻两帧图像的X、Y方向相对偏移量或者移动量，即相邻两帧图像的图像X、Y偏移量，其中，偏移量在[-4,4]范围。
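上述sobel梯度模板匹配可以用如下Python草图示意（假设实现，并非本发明的官方代码：输入为已按前述方法求出的梯度值矩阵A、B，梯度阈值grad_t为示例值）：

```python
import numpy as np

def sobel_template_offset(grad_a, grad_b, grad_t=50.0, search=4):
    """A、B分别为上一帧与当前帧的Sobel梯度值矩阵，模板取自B的[4,4]处；
    仅对同时满足条件1 A[r,c]>T 与条件2 B[r,c]>T 的位置累加差值绝对值得到SS(i,j)，
    SS(i,j)最小处即相邻两帧图像的图像X、Y偏移量。"""
    h, w = grad_b.shape
    tmpl = grad_b[search:h - search, search:w - search]
    best, best_off = None, (0, 0)
    for j in range(-search, search + 1):
        for i in range(-search, search + 1):
            sub = grad_a[search + j:h - search + j,
                         search + i:w - search + i]
            mask = (sub > grad_t) & (tmpl > grad_t)   # 条件1与条件2
            # 注意：若mask为空则SS为0，工程实现可再对有效点数设下限
            ss = float(np.abs(tmpl - sub)[mask].sum())
            if best is None or ss < best:
                best, best_off = ss, (i, j)
    return best_off
```

与灰度SAD匹配相同，返回的(i,j)是模板在上一帧梯度矩阵中的匹配位置，符号约定可按需取反。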
其中,该图像偏移量确定模块13在飞行装置100当前所处的场景为光照特别亮或者特别暗的场景时,对图像进行直方图均衡化,将图像亮度调为均匀,然后判断是否为纹理丰富的场景,并根据如前所述的关系根据纹理是否丰富选择对应的算法对图像进行处理。
该图像偏移量确定模块13在飞行装置100当前所处的场景为有倒影场景时,去除图像中的阴影,并去除阴影后,判断是否为纹理丰富的场景,并根据如前所述的关系根据纹理是否丰富选择对应的算法。
该图像偏移量确定模块13在飞行装置100当前所处的场景为线条丰富的场景时，划分直线为水平方向的直线和垂直方向的直线，在上一帧图像中找一条和当前图像检测到的水平方向的直线Lch角度差最小的直线Lph。计算两条直线的距离得到j，当j不在[-R,R]时设置为0，其中R是设定的移动范围，一般R取4。在上一帧图像中找一条和当前图像检测到的垂直方向的直线Lcv角度差最小的直线Lpv。计算两条直线的距离得到i，当i不在[-R,R]时设置为0。直线定位得到的i，j就是相邻两帧图像的X、Y方向相对偏移量或者移动量，即相邻两帧图像的图像X、Y偏移量，其中，偏移量在[-4,4]范围。
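上述直线定位思路可以用如下Python草图示意（假设实现，并非本发明的官方代码：每条直线简化为(角度, 位置)二元组，实际中可由hough变换得到；R取正文所述默认值4）：

```python
def line_offset(curr_lines, prev_lines, r=4):
    """在上一帧直线集合中找与当前帧直线角度差最小的一条，
    二者位置之差即该方向的偏移量；偏移量不在[-R,R]内时置0。
    curr_lines/prev_lines: [(角度(度), 位置(像素)), ...]。"""
    if not curr_lines or not prev_lines:
        return 0
    best = None
    for ang_c, pos_c in curr_lines:
        for ang_p, pos_p in prev_lines:
            diff = abs(ang_c - ang_p)
            if best is None or diff < best[0]:
                best = (diff, pos_c - pos_p)   # 角度差最小的一对直线的距离
    d = best[1]
    return d if -r <= d <= r else 0
```

对水平方向直线集合调用一次得到j，对垂直方向直线集合再调用一次得到i，即为两个方向的偏移量。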
从而,根据本发明的飞行装置100及飞行控制系统S1,可在无GPS信号时确定飞行装置100的速度并进行定位控制,并能够进一步基于不同的场景进行精确的控制。
请参阅图5，为本发明一实施方式中的飞行控制方法的流程图。该飞行控制方法用于侦测飞行装置100的速度并进行定位控制。首先，获取模块11获取相机模组20采集的图像以及距离传感器40感测到的飞行装置100的高度(501)。
场景确定模块12确定飞行装置100当前所处的场景(503)。
图像偏移量确定模块13根据相机模组20采集的两帧相邻图像以及飞行装置100当前所处的场景，计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量(505)。其中，该图像偏移量确定模块13确定该两帧相邻图像中的相同特征点，并计算该相同特征点在该两帧相邻图像中的X、Y偏移量而得到该图像X、Y偏移量。其中，该场景确定模块12根据相机模组20采集的图像中的至少一个参数确定所述场景的类型；该图像偏移量确定模块13根据飞行装置100当前所处的场景选择一对应的算法，并根据该算法对该两帧相邻图像进行分析处理而得出第二帧图像相对于第一帧图像的图像X、Y偏移量。
偏移量校准模块14获取加速度传感器30侦测到的飞行装置100在三维方向上的加速度及角速度,并根据该飞行装置的加速度及角速度对该图像X、Y偏移量进行补偿而得到一图像校正偏移量(507)。
速度计算模块15通过镜头焦距、飞行装置100的高度以及该图像校正偏移量计算该图像校正偏移量对应的世界坐标的X、Y偏移量，并根据该两帧相邻图像获取的时间间隔以及该世界坐标的X、Y偏移量求出该飞行装置的速度(509)。具体的，该速度计算模块15根据该时间间隔t1以及该世界坐标的X、Y偏移量计算得出飞行装置在X方向上的速率以及在Y方向上的速率。更具体的，该速度计算模块15用X、Y偏移量分别除以时间间隔而分别得出飞行装置100在X方向和Y方向上的速率。
该运行控制模块16至少基于该飞行装置100的速度对飞行装置100进行定位及/或悬停控制(511)。
以上所述仅为本发明的较佳实施例而已,并不用以限制本发明,凡在本发明的精神和原则之内所作的任何修改、等同替换和改进等,均应包含在本发明的保护范围之内。

Claims (21)

  1. 一种飞行控制系统,用于控制飞行装置,其特征在于,所述飞行控制系统包括:
    获取模块,用于获取飞行装置的相机模组采集的图像以及获取飞行装置的距离传感器感测到的飞行装置的高度;
    场景确定模块,用于确定飞行装置当前所处的场景;
    图像偏移量确定模块,用于根据相机模组采集的两帧相邻图像以及飞行装置当前所处的场景,计算所述两帧相邻图像中的第二帧图像相对于第一帧图像的图像X偏移量及图像Y偏移量;
    偏移量校准模块,用于获取飞行装置的加速度传感器侦测到的飞行装置在三维方向上的加速度及角速度,并根据所述飞行装置的加速度及角速度对所述图像X、Y偏移量进行补偿而得到包括校正后的图像X偏移量和图像Y偏移量的图像校正偏移量;以及
    速度计算模块,用于通过相机模组的镜头焦距、飞行装置的高度以及所述图像校正偏移量计算所述图像校正偏移量对应的世界坐标的X、Y偏移量,并根据所述两帧相邻图像采集的时间间隔以及所述世界坐标的X、Y偏移量求出所述飞行装置的速度。
  2. 如权利要求1所述的飞行控制系统,其特征在于,所述飞行控制系统还包括运行控制模块,用于至少基于所述飞行装置的速度对飞行装置进行定位及/或悬停控制。
  3. 如权利要求1所述的飞行控制系统,其特征在于,所述场景确定模块根据相机模组采集的图像中的至少一个参数确定飞行装置所处的场景,所述图像偏移量确定模块根据飞行装置所处场景对应的算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
  4. 如权利要求3所述的飞行控制系统，其特征在于，所述至少一个参数包括纹理，所述场景确定模块采用sobel梯度算法对图像进行处理，得到梯度矩阵，统计梯度矩阵中梯度大于第一预定阈值的像素的个数，并判断所统计像素的个数大于第二阈值时，当前场景为纹理丰富的场景，以及在判断所统计的像素的个数小于或等于第二阈值时认为纹理比较少，当前场景为纹理较少的场景；所述图像偏移量确定模块并在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量，以及在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
  5. 如权利要求3所述的飞行控制系统,其特征在于,所述至少一个参数还包括倒影,所述场景确定模块根据连续多帧图像进行阴影统计,判断是否有倒影,从而确定该场景为有倒影的场景或无倒影的场景,所述图像偏移量确定模块在飞行装置当前所处的场景为有倒影场景时,去除图像中的阴影,并去除阴影后,判断是否为纹理丰富的场景,并在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量,以及在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
  6. 如权利要求3所述的飞行控制系统,其特征在于,所述至少一个参数还包括灰度,所述场景确定模块根据图像的灰度值将图像转换为灰度直方图并进行统计得到统计值,并将统计值与相应的阈值进行比较,检测出场景是否属于偏暗、普通亮度、或者偏亮的场景,所述图像偏移量确定模块在飞行装置当前所处的场景为光照偏亮或者偏暗的场景时,对图像进行直方图均衡化,将图像亮度调为均匀,然后判断是否为纹理丰富的场景,并在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量,以及在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
  7. 如权利要求3所述的飞行控制系统，其特征在于，所述至少一个参数还包括线条，所述场景确定模块对图像做梯度检测，进行二值化处理，然后使用hough直线检测，判断是否有直线，如果判断有至少一条直线且直线长度占图像宽度至少1/2的，则判断所述场景为线条丰富场景；所述图像偏移量确定模块在飞行装置当前所处的场景为线条丰富的场景时，划分直线为水平方向的直线和垂直方向的直线，在上一帧图像中找一条和当前图像检测到的水平方向的直线角度差最小的直线，计算两条角度差最小的直线的距离得到第一距离，当第一距离不在[-R,R]时设置为0，其中R是设定的移动范围，在上一帧图像中找一条和当前图像检测到的垂直方向的直线的角度差最小的直线，计算所述两条角度差最小的直线的距离得到第二距离，当第二距离不在[-R,R]时设置为0，直线定位得到的第一距离、第二距离就是相邻两帧图像的图像X、Y偏移量。
  8. 如权利要求4-6中任一项所述的飞行控制系统,其特征在于,所述图像偏移量确定模块在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量包括:所述图像偏移量确定模块设当前图像宽高分别为W和H,设定一模板图T大小为Mx×My,Mx=W-8,My=H-8,模板图T从当前帧图像[4,4]的位置获得;设定匹配图S大小为Nx×Ny,其中Nx=W,Ny=H,匹配图S从上一帧图像获得,所述图像偏移量确定模块在匹配时将模板图叠放在匹配图上平移,模板图覆盖下的那块基准图中的搜索子图为S(i,j),i,j为这块子图的左上角像点在匹配图S中的位置,i和j在[-4,4]范围内取值,S(0,0)对应A的[4,4]位置,所述图像偏移量确定模块并在匹配时通过计算相关函数SAD来找到与模板图尽可能相似的搜索子图以及它的坐标位置i和j,T和S(i,j)的SAD最小的值为相邻两帧图像的图像X、Y偏移量,其中,SAD是指将两幅图像每个位置对应像素差值的绝对值进行累加求和的过程。
  9. 如权利要求4-6中任一项所述的飞行控制系统，其特征在于，所述图像偏移量确定模块在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量包括：所述图像偏移量确定模块通过Sobel算子进行边缘检测，并使用二维模板进行计算，二维模板包括用于水平方向差分运算的水平模板和用于垂直方向差分运算的垂直模板；所述图像偏移量确定模块使用水平模板和垂直模板进行平面卷积运算，分别计算水平方向卷积fx和垂直方向卷积fy，求其梯度值G为fx的平方与fy的平方和的平方根，然后，分别对相邻两幅图像做梯度运算得到梯度值矩阵A和B，其中A为上帧图像的Sobel梯度矩阵，B为当前帧图像的Sobel梯度矩阵；所述图像偏移量确定模块并设定模板图T大小为Mx×My，Mx=W-8，My=H-8，模板图T从B的[4,4]位置获得；所述图像偏移量确定模块并设定匹配图S大小为Nx×Ny，其中Nx=W，Ny=H，匹配图S从A获得；所述图像偏移量确定模块在匹配时将模板图叠放在匹配图上平移，模板图覆盖下的那块基准图中的搜索子图为S(i,j)，i,j为这块子图的左上角像点在匹配图S中的位置，i和j在[-4,4]范围内取值，S(0,0)对应A的[4,4]位置，对梯度值矩阵T和S(i,j)做差运算，得到差值矩阵C，将C矩阵中符合下面条件1和条件2的每个元素绝对值进行累加，得到和SS(i,j)，其中，条件1为A[r,c]>T、条件2为B[r,c]>T；其中，A[r,c]是梯度值矩阵A中r,c位置的梯度值，B[r,c]是梯度值矩阵B中r、c位置的梯度值，r>=0且r<My，c>=0且c<Mx，T是梯度阈值，其中SS(i,j)最小的值对应的(i,j)位置为相邻两帧图像的图像X、Y偏移量。
  10. 如权利要求1所述的飞行控制系统，其特征在于，所述速度计算模块根据公式1：x1/X1=f/H计算所述世界坐标的X偏移量，以及根据公式2：y1/Y1=f/H计算所述世界坐标的Y偏移量，其中，x1为校正后的图像X偏移量，y1为校正后的图像Y偏移量，f为镜头焦距，H为飞行装置的高度，X1为世界坐标的X偏移量，Y1为世界坐标的Y偏移量；所述速度计算模块并根据该相机模组采集该两帧相邻图像的时间间隔以及该世界坐标的X、Y偏移量计算得出飞行装置在X方向上的速率以及在Y方向上的速率。
  11. 一种飞行控制方法,用于控制飞行装置,其特征在于,所述飞行控制方法包括:
    获取飞行装置的相机模组采集的图像以及获取飞行装置的距离传感器感测到的飞行装置的高度;
    确定飞行装置当前所处的场景;
    根据相机模组采集的两帧相邻图像以及飞行装置当前所处的场景,计算所述两帧相邻图像中的第二帧图像相对于第一帧图像的图像X偏移量和图像Y偏移量;
    获取飞行装置的加速度传感器侦测到的飞行装置在三维方向上的加速度及角速度,并根据所述飞行装置的加速度及角速度对所述图像X偏移量和图像Y偏移量进行补偿而得到包括校正后的图像X偏移量和图像Y偏移量的图像校正偏移量;以及
    通过相机模组的镜头焦距、飞行装置的高度以及所述图像校正偏移量计算所述图像校正偏移量对应的世界坐标的X、Y偏移量,并根据所述两帧相邻图像采集的时间间隔以及所述世界坐标的X、Y偏移量求出所述飞行装置的速度。
  12. 如权利要求11所述的飞行控制方法,其特征在于,所述方法还包括步骤:
    至少基于所述飞行装置的速度对飞行装置进行定位及/或悬停控制。
  13. 如权利要求11所述的飞行控制方法，其特征在于，所述步骤“确定飞行装置当前所处的场景”包括：
    根据相机模组采集的图像中的至少一个参数确定飞行装置所处的场景。
  14. 如权利要求13所述的飞行控制方法,其特征在于,所述步骤“根据相机模组采集的两帧相邻图像以及飞行装置当前所处的场景,计算所述两帧相邻图像中的第二帧图像相对于第一帧图像的图像水平偏移量”包括:
    根据飞行装置当前所处的场景选择一对应的算法;以及
    根据所述算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
  15. 如权利要求14所述的飞行控制方法，其特征在于，所述至少一个参数包括纹理，所述步骤“根据相机模组采集的图像中的至少一个参数确定飞行装置所处的场景”包括：采用sobel梯度算法对图像进行处理，得到梯度矩阵，统计梯度矩阵中梯度大于第一预定阈值的像素的个数，并判断所统计像素的个数大于第二阈值时，当前场景为纹理丰富的场景，以及在判断所统计的像素的个数小于或等于第二阈值时认为纹理比较少，当前场景为纹理较少的场景；
    所述步骤“根据飞行装置当前所处的场景选择一对应的算法;以及根据所述算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量”包括:在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量,以及在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
  16. 如权利要求15所述的飞行控制方法,其特征在于,所述至少一个参数还包括倒影,所述步骤“根据相机模组采集的图像中的至少一个参数确定飞行装置所处的场景”包括:根据连续多帧图像进行阴影统计,判断是否有倒影,从而确定该场景为有倒影的场景或无倒影的场景;
    所述步骤“根据飞行装置当前所处的场景选择一对应的算法;以及根据所述算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量”包括:在飞行装置当前所处的场景为有倒影场景时,去除图像中的阴影,并去除阴影后,判断是否为纹理丰富的场景,并在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量,以及在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
  17. 如权利要求15所述的飞行控制方法,其特征在于,所述至少一个参数还包括灰度,所述步骤“根据相机模组采集的图像中的至少一个参数确定飞行装置所处的场景”包括:根据图像的灰度值将图像转换为灰度直方图并进行统计得到统计值,并将统计值与相应的阈值进行比较,检测出场景是否属于偏暗、普通亮度、或者偏亮的场景;
    所述步骤“根据飞行装置当前所处的场景选择一对应的算法；以及根据所述算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量”包括：在飞行装置当前所处的场景为光照偏亮或者偏暗的场景时，对图像进行直方图均衡化，将图像亮度调为均匀，然后判断是否为纹理丰富的场景，并在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量，以及在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量。
  18. 如权利要求15所述的飞行控制方法,其特征在于,所述至少一个参数还包括线条,所述步骤“根据相机模组采集的图像中的至少一个参数确定飞行装置所处的场景”包括:对图像做梯度检测,进行二值化处理,然后使用hough直线检测,判断是否有直线,如果判断有至少一条直线且直线长度占图像宽度至少1/2的,则判断所述场景为线条丰富场景;
    所述步骤“根据飞行装置当前所处的场景选择一对应的算法;以及根据所述算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量”包括:在飞行装置当前所处的场景为线条丰富的场景时,划分直线为水平方向的直线和垂直方向的直线,在上一帧图像中找一条和当前图像检测到的水平方向的直线角度差最小的直线,计算两条角度差最小的直线的距离得到第一距离,当第一距离不在[-R,R]时设置为0,其中R是设定的移动范围,在上一帧图像中找一条和当前图像检测到的垂直方向的直线的角度差最小的直线,计算所述两条角度差最小的直线的距离得到第二距离,当第二距离不在[-R,R]时设置为0,直线定位得到的第一距离、第二距离就是相邻两帧图像的图像X、Y偏移量。
  19. 如权利要求15-17中任一项所述的飞行控制方法,其特征在于,所述步骤“在当前场景为纹理丰富的场景时采用灰度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量”包括:
    设当前图像宽高分别为W和H,设定一模板图T大小为Mx×My,Mx=W-8,My=H-8,模板图T从当前帧图像[4,4]的位置获得;
    设定匹配图S大小为Nx×Ny,其中Nx=W,Ny=H,匹配图S从上一帧图像获得,在匹配时将模板图叠放在匹配图上平移,模板图覆盖下的那块基准图中的搜索子图为S(i,j),i,j为这块子图的左上角像点在匹配图S中的位置,i和j在[-4,4]范围内取值,S(0,0)对应A的[4,4]位置;以及
    在匹配时通过计算相关函数SAD来找到与模板图相似的搜索子图以及它的坐标位置i和j,T和S(i,j)的SAD最小的值为相邻两帧图像的图像X、Y偏移量,其中,SAD是指将两幅图像每个位置对应像素差值的绝对值进行累加求和的过程。
  20. 如权利要求15-17中任一项所述的飞行控制方法,其特征在于,所述步骤“在当前场景为纹理较少的场景时采用sobel梯度模板匹配算法分析该两帧图像中的参数的变化而计算该两帧相邻图像中的第二帧图像相对于第一帧图像的图像X、Y偏移量”包括:
    通过Sobel算子进行边缘检测，并使用二维模板进行计算，二维模板包括用于水平方向差分运算的水平模板和用于垂直方向差分运算的垂直模板；
    使用水平模板和垂直模板进行平面卷积运算，分别计算水平方向卷积fx和垂直方向卷积fy，求其梯度值G为fx的平方与fy的平方和的平方根，分别对相邻两幅图像做梯度运算得到梯度值矩阵A和B，其中A为上帧图像的Sobel梯度矩阵，B为当前帧图像的Sobel梯度矩阵；
    设定模板图T大小为Mx×My，Mx=W-8，My=H-8，模板图T从B的[4,4]位置获得；
    设定匹配图S大小为Nx×Ny，其中Nx=W，Ny=H，匹配图S从A获得；在匹配时将模板图叠放在匹配图上平移，模板图覆盖下的那块基准图中的搜索子图为S(i,j)，i,j为这块子图的左上角像点在匹配图S中的位置，i和j在[-4,4]范围内取值，S(0,0)对应A的[4,4]位置，对梯度值矩阵T和S(i,j)做差运算，得到差值矩阵C，将C矩阵中符合下面条件1和条件2的每个元素绝对值进行累加，得到和SS(i,j)，其中，条件1为A[r,c]>T、条件2为B[r,c]>T；其中，A[r,c]是梯度值矩阵A中r,c位置的梯度值，B[r,c]是梯度值矩阵B中r、c位置的梯度值，r>=0且r<My，c>=0且c<Mx，T是梯度阈值，其中SS(i,j)最小的值对应的(i,j)位置为相邻两帧图像的图像X、Y偏移量。
  21. 一种飞行装置，包括相机模组、距离传感器以及加速度传感器，所述相机模组用于采集图像，所述距离传感器用于获取飞行装置的高度，所述加速度传感器用于侦测飞行装置在三维方向上的加速度及角速度，其特征在于，所述飞行装置还包括如权利要求1-10中任一项所述的飞行控制系统。
PCT/CN2016/070162 2015-11-13 2016-01-05 飞行装置、飞行控制系统及方法 WO2017080102A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/630,613 US10515271B2 (en) 2015-11-13 2017-06-22 Flight device and flight control method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510776453.8A CN105447853B (zh) 2015-11-13 2015-11-13 飞行装置、飞行控制系统及方法
CN201510776453.8 2015-11-13

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/630,613 Continuation US10515271B2 (en) 2015-11-13 2017-06-22 Flight device and flight control method

Publications (1)

Publication Number Publication Date
WO2017080102A1 true WO2017080102A1 (zh) 2017-05-18

Family

ID=55557986

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/070162 WO2017080102A1 (zh) 2015-11-13 2016-01-05 飞行装置、飞行控制系统及方法

Country Status (3)

Country Link
US (1) US10515271B2 (zh)
CN (1) CN105447853B (zh)
WO (1) WO2017080102A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109520939A (zh) * 2019-01-03 2019-03-26 华域视觉科技(上海)有限公司 一种材料检测装置
CN111815524A (zh) * 2019-12-11 2020-10-23 长沙天仪空间科技研究院有限公司 一种辐射定标的校正系统和方法
CN117152144A (zh) * 2023-10-30 2023-12-01 潍坊华潍新材料科技有限公司 基于图像处理的导辊监测方法及装置

Families Citing this family (22)

Publication number Priority date Publication date Assignee Title
JP6506031B2 (ja) * 2015-01-28 2019-04-24 株式会社トプコン 測量データ処理装置、測量データ処理方法およびプログラム
CN106489062B (zh) 2015-06-26 2019-06-28 深圳市大疆创新科技有限公司 用于测量移动平台的位移的系统和方法
CN105346706B (zh) * 2015-11-13 2018-09-04 深圳市道通智能航空技术有限公司 飞行装置、飞行控制系统及方法
CN106094861B (zh) * 2016-06-02 2024-01-12 零度智控(北京)智能科技有限公司 无人机、无人机控制方法及装置
WO2018027451A1 (zh) * 2016-08-08 2018-02-15 深圳市道通智能航空技术有限公司 一种飞行定位的方法及装置
CN106469458B (zh) * 2016-09-30 2019-01-08 哈尔滨工业大学 一种飞行相机的校正方法
CN106780553A (zh) * 2016-11-18 2017-05-31 腾讯科技(深圳)有限公司 一种飞行器的移动位置确定方法、装置及飞行器
CN106771329B (zh) * 2016-12-27 2020-06-05 歌尔科技有限公司 一种无人机减速过程中运行速度的检测方法
US10602056B2 (en) * 2017-01-13 2020-03-24 Microsoft Technology Licensing, Llc Optimal scanning trajectories for 3D scenes
WO2019019172A1 (en) * 2017-07-28 2019-01-31 Qualcomm Incorporated ADAPTIVE IMAGE PROCESSING IN A ROBOTIC VEHICLE
US11087292B2 (en) * 2017-09-01 2021-08-10 Allstate Insurance Company Analyzing images and videos of damaged vehicles to determine damaged vehicle parts and vehicle asymmetries
CN108140245B (zh) * 2017-12-25 2022-08-23 深圳市道通智能航空技术股份有限公司 测距方法、装置以及无人机
CN112789571A (zh) * 2018-07-05 2021-05-11 深圳市道通智能航空技术股份有限公司 无人机降落方法、装置及无人机
CN110741625B (zh) * 2018-07-23 2022-06-21 深圳市大疆创新科技有限公司 运动估计方法及摄影器材
CN109344970B (zh) * 2018-11-27 2022-03-15 中国电子科技集团公司第二十研究所 一种无人机上基于视觉目标动向推理方法
CN110505463A (zh) * 2019-08-23 2019-11-26 上海亦我信息技术有限公司 基于拍照的实时自动3d建模方法
CN110930455B (zh) * 2019-11-29 2023-12-29 深圳市优必选科技股份有限公司 定位方法、装置、终端设备及存储介质
CN111540003A (zh) * 2020-04-27 2020-08-14 浙江光珀智能科技有限公司 一种深度图像的生成方法及装置
CN112330618B (zh) * 2020-10-29 2023-09-01 浙江大华技术股份有限公司 图像偏移检测方法、设备及存储介质
US11348470B1 (en) * 2021-01-07 2022-05-31 Rockwell Collins, Inc. Apparent video brightness control and metric
CN114730471A (zh) * 2021-08-13 2022-07-08 深圳市大疆创新科技有限公司 控制方法、控制装置、可移动平台及存储介质
CN117470199B (zh) * 2023-12-27 2024-03-15 天津云圣智能科技有限责任公司 一种摆动摄影控制的方法、装置、存储介质及电子设备

Citations (9)

Publication number Priority date Publication date Assignee Title
US20100250022A1 (en) * 2006-12-29 2010-09-30 Air Recon, Inc. Useful unmanned aerial vehicle
WO2010108301A1 (zh) * 2009-03-27 2010-09-30 Yu Qifeng 飞机着陆与无人机回收的地基摄像测量引导方法
CN102426019A (zh) * 2011-08-25 2012-04-25 航天恒星科技有限公司 一种无人机景象匹配辅助导航方法及系统
CN102788580A (zh) * 2012-06-20 2012-11-21 天津工业大学 无人机视觉导航中的飞行路线合成方法
US8581981B2 (en) * 2006-04-28 2013-11-12 Southwest Research Institute Optical imaging system for unmanned aerial vehicle
CN104062977A (zh) * 2014-06-17 2014-09-24 天津大学 基于视觉slam的四旋翼无人机全自主飞行控制方法
US8942964B2 (en) * 2010-06-08 2015-01-27 Southwest Research Institute Optical state estimation and simulation environment for unmanned aerial vehicles
US20150134143A1 (en) * 2013-10-04 2015-05-14 Jim Willenborg Novel tracking system using unmanned aerial vehicles
CN104729506A (zh) * 2015-03-27 2015-06-24 北京航空航天大学 一种视觉信息辅助的无人机自主导航定位方法

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
FR2961601B1 (fr) * 2010-06-22 2012-07-27 Parrot Procede d'evaluation de la vitesse horizontale d'un drone, notamment d'un drone apte au vol stationnaire autopilote
FR3020169A1 (fr) * 2014-04-16 2015-10-23 Parrot Drone a voilure tournante muni d'une camera video delivrant des sequences d'images stabilisees

Cited By (6)

Publication number Priority date Publication date Assignee Title
CN109520939A (zh) * 2019-01-03 2019-03-26 华域视觉科技(上海)有限公司 一种材料检测装置
CN109520939B (zh) * 2019-01-03 2024-05-24 华域视觉科技(上海)有限公司 一种材料检测装置
CN111815524A (zh) * 2019-12-11 2020-10-23 长沙天仪空间科技研究院有限公司 一种辐射定标的校正系统和方法
CN111815524B (zh) * 2019-12-11 2024-04-23 长沙天仪空间科技研究院有限公司 一种辐射定标的校正系统和方法
CN117152144A (zh) * 2023-10-30 2023-12-01 潍坊华潍新材料科技有限公司 基于图像处理的导辊监测方法及装置
CN117152144B (zh) * 2023-10-30 2024-01-30 潍坊华潍新材料科技有限公司 基于图像处理的导辊监测方法及装置

Also Published As

Publication number Publication date
CN105447853B (zh) 2018-07-13
US20170293796A1 (en) 2017-10-12
CN105447853A (zh) 2016-03-30
US10515271B2 (en) 2019-12-24

Similar Documents

Publication Publication Date Title
WO2017080102A1 (zh) 飞行装置、飞行控制系统及方法
WO2017080108A1 (zh) 飞行装置、飞行控制系统及方法
US9578310B2 (en) Automatic scene calibration
CN107357286A (zh) 视觉定位导航装置及其方法
KR101618814B1 (ko) 단일객체에 대한 기울기를 추정하는 영상을 감시하는 장치 및 방법
JP6353289B2 (ja) 測距補正装置
KR101071352B1 (ko) 좌표맵을 이용한 팬틸트줌 카메라 기반의 객체 추적 장치 및 방법
CN105405126B (zh) 一种基于单目视觉系统的多尺度空‑地参数自动标定方法
KR20150116777A (ko) 이동 경로 메타데이터를 이용한 항공 사진/비디오에서의 상황 인식 객체 검출
JP2016224953A (ja) ナビゲーション調整のためのクロススペクトル特徴相関
CN110514212A (zh) 一种融合单目视觉和差分gnss的智能车地图地标定位方法
JP2017142613A (ja) 情報処理装置、情報処理システム、情報処理方法及び情報処理プログラム
KR101469099B1 (ko) 사람 객체 추적을 통한 자동 카메라 보정 방법
JP2019056629A (ja) 距離推定装置及び方法
KR20130034528A (ko) 도로시설물 자동 위치측정 방법
CN115717867A (zh) 一种基于机载双相机和目标追踪的桥梁变形测量方法
US20130108109A1 (en) Method and device for the detection of moving objects in a video image sequence
JP6860445B2 (ja) 物体距離検出装置
Abdullah et al. Camera Calibration Performance on Different Non-metric Cameras.
CN115950435A (zh) 无人机巡检影像的实时定位方法
CN113706424B (zh) 一种基于人工智能的果冻效应图像校正方法及系统
CN112985388B (zh) 基于大位移光流法的组合导航方法及系统
CN112017229B (zh) 一种相机相对位姿求解方法
JP2019164837A (ja) 情報処理システム、情報処理方法及び情報処理プログラム
KR20160026271A (ko) 달리기 감지 방법 및 시스템

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16863293

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16863293

Country of ref document: EP

Kind code of ref document: A1