WO2019144291A1 - Flight control method, apparatus and machine-readable storage medium - Google Patents

Flight control method, apparatus and machine-readable storage medium

Info

Publication number
WO2019144291A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
aircraft
orientation
imaging device
determining
Prior art date
Application number
PCT/CN2018/073870
Other languages
English (en)
French (fr)
Inventor
钱杰
邬奇峰
王宏达
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN201880011997.7A priority Critical patent/CN110312978B/zh
Priority to PCT/CN2018/073870 priority patent/WO2019144291A1/zh
Publication of WO2019144291A1 publication Critical patent/WO2019144291A1/zh
Priority to US16/934,948 priority patent/US20210011490A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/08Arrangements of cameras
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/04Control of altitude or depth
    • G05D1/042Control of altitude or depth specially adapted for aircraft
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0017Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G5/0021Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/003Flight plan management
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047Navigation or guidance aids for a single aircraft
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047Navigation or guidance aids for a single aircraft
    • G08G5/0052Navigation or guidance aids for a single aircraft for cruising
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047Navigation or guidance aids for a single aircraft
    • G08G5/0069Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073Surveillance aids
    • G08G5/0078Surveillance aids for monitoring traffic from the aircraft
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/10UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B64U2201/104UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] using satellite radio beacon positioning systems, e.g. GPS
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Definitions

  • Embodiments of the present invention relate to image processing techniques, and more particularly to flight control methods, apparatus, and machine-readable storage media.
  • The mainstream way of controlling an aircraft has long been through a remote controller: the sticks of the remote controller control the aircraft's movement forward and backward, left and right, up and down, and its rotation. Controlling flight this way has many limitations, so freeing the aircraft from the remote controller and letting it respond to the actions of a designated target, such as movements or gestures, has become a popular research direction in flight control.
  • Embodiments of the present invention disclose flight control methods, apparatus, and machine-readable storage media to improve the efficiency and accuracy of the flight control of an aircraft.
  • One aspect of the embodiments of the present invention provides a flight control method applied to an aircraft provided with a first imaging device. The method comprises: determining a first distance of a target relative to the aircraft based on a depth map acquired by the first imaging device; determining a first orientation of the target relative to the aircraft; and controlling the flight of the aircraft based on the first distance and the first orientation.
  • One aspect of the embodiments of the present invention provides a flight control device applied to an aircraft provided with a first imaging device. The flight control device comprises: a processor configured to determine a first distance of a target relative to the aircraft based on a depth map acquired by the first imaging device, and further configured to determine a first orientation of the target relative to the aircraft; and a memory configured to store the first distance and the first orientation. The processor is also configured to control the flight of the aircraft based on the first distance and the first orientation.
  • One aspect of the embodiments of the present invention provides a machine-readable storage medium storing computer instructions which, when executed, perform the following processing: determining a first distance of a target relative to the aircraft based on a depth map acquired by a first imaging device; determining a first orientation of the target relative to the aircraft; and controlling the flight of the aircraft based on the first distance and the first orientation.
  • In summary, in the embodiments of the present invention, the distance of the target relative to the aircraft is determined from the depth map acquired by the first imaging device, the orientation of the target relative to the aircraft is determined, and the flight of the aircraft is then controlled according to that distance and orientation. Flight control of the aircraft is thus achieved without a remote controller, which improves the efficiency of flight control; and determining the distance of the target relative to the aircraft from a depth map improves the accuracy of the determined distance, which in turn improves the accuracy of the flight control of the aircraft.
  • FIG. 1 is a flowchart of a flight control method according to Embodiment 1 of the present invention.
  • FIG. 2 is a flowchart of a flight control method according to Embodiment 2 of the present invention.
  • FIG. 3 is a flowchart of a flight control method according to Embodiment 3 of the present invention.
  • FIG. 4 is a structural diagram of a flight control device according to Embodiment 4 of the present invention.
  • FIG. 5 is a structural diagram of an aircraft equipped with the flight control device according to Embodiment 5 of the present invention.
  • When a component is referred to as being "fixed to" another component, it can be directly on the other component or an intervening component may be present. When a component is considered to be "connected to" another component, it can be directly connected to the other component or an intervening component may be present at the same time.
  • Embodiment 1:
  • FIG. 1 is a schematic flowchart of a flight control method according to an embodiment of the present invention.
  • The process is applied to an aircraft, such as a drone, provided with a first imaging device.
  • As one embodiment, the first imaging device includes, but is not limited to, an imaging device capable of acquiring a depth map, such as a binocular camera or a TOF (Time of Flight) camera, and may be fixed on the aircraft.
  • As shown in FIG. 1, the method of this embodiment may include the following steps:
  • Step 101: Determine a first distance of the target relative to the aircraft based on the depth map acquired by the first imaging device.
  • In application, to determine the distance of the target from the aircraft (referred to herein as the first distance) based on the depth map acquired by the first imaging device (hereinafter simply "the depth map"), the target may first be determined in the depth map, and the first distance of the target relative to the aircraft may then be determined based on the depth map.
  • As one embodiment, after the depth map is acquired by the first imaging device, the depth map may be clustered so that different pixels of the depth map are grouped into different point clouds, and the target is then identified based on the shape and/or size of the clustered point clouds, as sketched below.
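The patent leaves the clustering algorithm unspecified, so the following is only a minimal sketch of one plausible realization: pixels within a valid depth range are grouped into connected components, and a component whose pixel area matches an expected target size is selected. The depth range, expected-area bounds, and the median-depth distance estimate are all illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def find_target_in_depth_map(depth, min_depth=0.5, max_depth=10.0,
                             expected_area=(500, 5000)):
    """Cluster depth-map pixels into point clouds and pick a target-sized one."""
    valid = (depth > min_depth) & (depth < max_depth)
    labels, n = ndimage.label(valid)          # connected-component clustering
    best = None
    for i in range(1, n + 1):
        mask = labels == i
        area = int(mask.sum())                # "size" of this point cloud
        if expected_area[0] <= area <= expected_area[1]:
            if best is None or area > best[1]:
                best = (mask, area)
    if best is None:
        return None
    mask = best[0]
    # First distance: here, the median depth over the target's pixels.
    return float(np.median(depth[mask])), mask

# usage with a synthetic depth map: one person-sized blob 3 m away
depth = np.full((120, 160), 20.0)
depth[40:80, 60:100] = 3.0
print(find_target_in_depth_map(depth)[0])     # 3.0
```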
  • As another embodiment, a second imaging device, including but not limited to a digital still camera or a digital video camera, may also be disposed on the aircraft.
  • The second imaging device can be fixedly connected to a gimbal provided on the aircraft, can move as the gimbal moves, and can transmit its captured images in real time to a designated terminal device, such as the mobile terminal of the aircraft's user.
  • In this embodiment, to determine the target in the depth map, a visual frame framing the target may first be determined in the captured image of the second imaging device.
  • In one example, in the mode in which the aircraft follows a target, the user may designate the target in the captured image displayed on the designated terminal device, whereupon a visual frame corresponding to the target is generated.
  • In another example, in the mode in which the aircraft follows a target, all candidate targets and their types can be identified in the captured image of the second imaging device by image recognition. When only one candidate target exists in the captured image, it may be directly determined as the target to follow and a corresponding visual frame generated; when multiple candidate targets exist, the target to follow can be determined according to a preset strategy and a corresponding visual frame generated, e.g., the front-most candidate target, the middle-most candidate target, or the rear-most candidate target is determined as the target to follow.
  • In this embodiment, after the visual frame framing the target is determined in the captured image of the second imaging device, the visual frame may be rotation-mapped onto the depth map, and the target determined in the depth map based on the visual frame mapped onto it.
  • For example, among the point clouds obtained by clustering the pixels of the depth map, the point cloud with the largest overlapping area with the visual frame mapped onto the depth map may be determined as the target; a sketch follows.
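As an illustration of that mapping-and-overlap step, the sketch below warps the visual frame from the second camera into the first camera's image and then picks the clustered point cloud whose pixels overlap the warped frame the most. Interpreting "rotation-mapped" as a pure-rotation infinite homography H = K1·R·K2⁻¹ is an assumption on my part, and K1, K2, R, and the label image are hypothetical inputs.

```python
import numpy as np

def warp_box(box, K2, K1, R):
    """Rotation-map a bbox (x0, y0, x1, y1) from camera 2 into camera 1.

    Assumes a pure rotation R between the cameras (infinite homography
    H = K1 @ R @ inv(K2)); returns the axis-aligned bound of the warped corners.
    """
    H = K1 @ R @ np.linalg.inv(K2)
    x0, y0, x1, y1 = box
    corners = np.array([[x0, y0, 1], [x1, y0, 1], [x1, y1, 1], [x0, y1, 1]]).T
    warped = H @ corners
    warped = warped[:2] / warped[2]           # perspective divide
    return warped[0].min(), warped[1].min(), warped[0].max(), warped[1].max()

def pick_target_cloud(labels, box):
    """Return the point-cloud label whose pixels overlap the mapped box most."""
    x0, y0, x1, y1 = (int(round(v)) for v in box)
    window = labels[max(y0, 0):y1, max(x0, 0):x1]
    ids, counts = np.unique(window[window > 0], return_counts=True)
    return int(ids[counts.argmax()]) if ids.size else None
```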
  • Step 102: Determine a first orientation of the target relative to the aircraft.
  • In application, to determine the positional relationship between the aircraft and the target, the orientation of the target relative to the aircraft needs to be determined in addition to its distance.
  • As one embodiment, the pixels of the depth map may be clustered, the target identified based on the shape and/or size of the clustered point clouds, and the position of the target in the depth map determined; the orientation of the target relative to the aircraft (referred to herein as the first orientation) is then determined from the target's position in the depth map.
  • As another embodiment, the visual frame framing the target in the captured image of the second imaging device may be determined in the manner described in step 101, and the first orientation of the target relative to the aircraft then determined based on the position of the visual frame in the captured image.
  • For example, the angle corresponding to two adjacent pixels may be determined from the field of view (FOV) of the second imaging device and the resolution of its captured image; the pixel offset between the center of the visual frame and the center of the captured image is then determined from the pixel coordinates of the visual-frame center in the captured image, which yields the deviation angle of the target from the optical axis of the second imaging device. Since the second imaging device is fixedly connected to the gimbal, the attitude angle of the gimbal is the attitude angle of the optical axis of the second imaging device, and the finally determined first orientation of the target relative to the aircraft may be the sum of the gimbal's attitude angle and the deviation angle of the target from the optical axis of the second imaging device; a sketch follows.
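A minimal sketch of that angle computation, restricted to yaw: the angle per pixel is approximated as FOV divided by resolution (a small-angle assumption), and the camera and gimbal values in the usage line are illustrative.

```python
def target_yaw(bbox, image_w, hfov_deg, gimbal_yaw_deg):
    """Orientation of the target relative to the aircraft (yaw only).

    bbox: (x0, y0, x1, y1) visual frame in the captured image.
    hfov_deg / image_w approximates the angle subtended per pixel.
    """
    deg_per_px = hfov_deg / image_w
    box_cx = (bbox[0] + bbox[2]) / 2.0
    offset_px = box_cx - image_w / 2.0       # + means target right of center
    deviation = offset_px * deg_per_px       # angle off the optical axis
    return gimbal_yaw_deg + deviation        # gimbal attitude + deviation

# e.g. a 1920-px-wide image with an assumed 84-degree horizontal FOV
print(target_yaw((900, 400, 1100, 800), 1920, 84.0, 15.0))   # 16.75
```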
  • In yet another embodiment, the target may be determined in a grayscale image acquired by the first imaging device, and the first orientation of the target relative to the aircraft determined based on the position of the target in the grayscale image.
  • In one example, to determine the target in the grayscale image acquired by the first imaging device, the visual frame framing the target in the captured image of the second imaging device may be determined in the manner described in step 101 and rotation-mapped onto the grayscale image; the target is then determined in the grayscale image based on the mapped visual frame.
  • In another example, to determine the target in the grayscale image acquired by the first imaging device, the target may be identified in the grayscale image directly by means of image recognition.
  • Step 103: Control the flight of the aircraft based on the first distance and the first orientation.
  • In application, after the first distance and the first orientation of the target relative to the aircraft have been determined, the flight of the aircraft may be controlled based on them.
  • As one embodiment, in the mode in which the aircraft follows the target, the aircraft's following of the target may be controlled based on the first distance and the first orientation.
  • As another embodiment, in the mode in which the aircraft is controlled by the target's gestures, the aircraft's response to the target's gesture control commands may be controlled based on the first distance and the first orientation.
  • As can be seen from steps 101 to 103 above, the distance of the target relative to the aircraft is determined based on the depth map acquired by the first imaging device, the orientation of the target relative to the aircraft is determined, and the flight of the aircraft is then controlled according to that distance and orientation. This achieves flight control of the aircraft without a remote controller and improves the efficiency of flight control; determining the distance from the depth map improves the accuracy of the determined distance of the target relative to the aircraft and, in turn, the accuracy of the flight control of the aircraft.
  • Embodiment 1, shown in FIG. 1, has been described above.
  • Embodiment 2: FIG. 2 is a flowchart of a flight control method according to Embodiment 2 of the present invention. As shown in FIG. 2, on the basis of Embodiment 1 shown in FIG. 1, the method of Embodiment 2 may include the following steps:
  • Step 201: Determine a first distance of the target relative to the aircraft based on the depth map acquired by the first imaging device.
  • Step 201 is similar to step 101 and is not described again.
  • Step 202: Determine a first orientation of the target relative to the aircraft.
  • Step 202 is similar to step 102 and is not described again.
  • Step 203: In the near-field state, and when the target is located within the field of view of the first imaging device, control the flight of the aircraft based on the first distance and the first orientation.
  • Step 203 is one specific implementation of step 103 above.
  • In some embodiments, the near-field state is determined when the proportion of the captured image occupied by the target's visual frame is greater than or equal to a preset first percentage threshold, and/or when the distance between the target and the aircraft is less than or equal to a preset first distance.
  • Specifically, step 203 takes into account that in the near-field state the accuracy of using the visual frame to determine the first distance is poor, whereas the depth map works well in the near-field state, so the distance of the target relative to the aircraft derived from the depth map is more accurate.
  • Accordingly, in this embodiment, in the near-field state and while the target is within the field of view of the first imaging device, the flight of the aircraft can be controlled based on the first distance and the first orientation.
  • As one embodiment, in the near-field state, when the target disappears from the field of view of the first imaging device, the current orientation of the target relative to the aircraft may be determined based on the visual frame; the first coordinate of the target in the navigation coordinate system can then be updated from the most recently determined first coordinate and the current orientation.
  • Here, the first coordinate of the target in the navigation coordinate system is the coordinate of the target determined in the navigation coordinate system based on the first distance and the first orientation; the specific manner of determination is described below.
  • Specifically, when the target disappears from the field of view of the first imaging device but is present in the field of view of the second imaging device, the orientation of the target relative to the aircraft may be determined with the visual frame in the manner described in step 102, i.e., the visual frame framing the target is determined in the captured image of the second imaging device, and the orientation of the target relative to the aircraft is determined from the position of the visual frame in the captured image.
  • After the current orientation of the target relative to the aircraft has been determined, the first coordinate of the target in the navigation coordinate system may be updated from the most recently determined first coordinate and the current orientation.
  • For example, suppose the last first coordinate determined before the target disappeared from the field of view of the first imaging device is (Xe1, Ye1), and the current orientation of the target relative to the aircraft determined with the visual frame is Yaw_target2drone2. The first coordinate (Xe2, Ye2) after the first update is then:
    Xe2 = Xd1 + cos(Yaw_target2drone2) * d_pre1
    Ye2 = Yd1 + sin(Yaw_target2drone2) * d_pre1
  • where (Xd1, Yd1) is the coordinate of the aircraft in the navigation coordinate system at the time the first coordinate (Xe1, Ye1) was determined, which can be obtained by fusing GPS (Global Positioning System) and VO (Visual Odometry), and d_pre1 is the distance of the target relative to the aircraft determined last before the target disappeared from the field of view of the first imaging device, i.e., the distance between (Xe1, Ye1) and (Xd1, Yd1) in the navigation coordinate system.
  • In this embodiment, after the first coordinate has been updated in this way, the distance of the target relative to the aircraft may be updated from the updated first coordinate and the latest coordinate of the aircraft in the navigation coordinate system, and the first coordinate updated once more from this updated distance and the latest current orientation of the target relative to the aircraft determined with the visual frame.
  • For example, if the updated first coordinate is (Xe2, Ye2) and the latest coordinate of the aircraft in the navigation coordinate system is (Xd2, Yd2), the updated distance d_pre2 of the target relative to the aircraft is the distance between (Xe2, Ye2) and (Xd2, Yd2) in the navigation coordinate system. If the latest current orientation of the target relative to the aircraft determined with the visual frame is now Yaw_target2drone3, the first coordinate (Xe3, Ye3) after this further update is:
    Xe3 = Xd2 + cos(Yaw_target2drone3) * d_pre2
    Ye3 = Yd2 + sin(Yaw_target2drone3) * d_pre2
  • In this way, in the near-field state, the first coordinate of the target in the navigation coordinate system can be updated continuously until the target returns to the field of view of the first imaging device.
  • Embodiment 3: On the basis of Embodiment 1 shown in FIG. 1 or Embodiment 2 shown in FIG. 2, Embodiment 3 of the present invention provides another flight control method.
  • FIG. 3 is a flowchart of the flight control method according to Embodiment 3 of the present invention. As shown in FIG. 3, the method of Embodiment 3 may include the following steps:
  • Step 301: Determine a first distance of the target relative to the aircraft based on the depth map acquired by the first imaging device.
  • Step 301 is similar to step 101 and is not described again.
  • Step 302: Determine a first orientation of the target relative to the aircraft.
  • Step 302 is similar to step 102 and is not described again.
  • Step 303: Determine a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation.
  • In application, after the first distance and the first orientation have been determined, the coordinate of the target in the navigation coordinate system (referred to herein as the first coordinate) (Xt1, Yt1) may be determined according to the following formulas, checked numerically below:
    Xt1 = Xd + cos(Yaw_target2drone1) * d1
    Yt1 = Yd + sin(Yaw_target2drone1) * d1
  • where (Xd, Yd) is the coordinate of the aircraft in the navigation coordinate system, which can be obtained by fusing GPS (Global Positioning System) and VO (Visual Odometry), Yaw_target2drone1 is the first orientation, and d1 is the first distance.
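A quick numeric check of those formulas, with illustrative values: the aircraft at (Xd, Yd) = (10, 5), a first orientation of 30 degrees, and a first distance of 8 m.

```python
import math

x_d, y_d = 10.0, 5.0          # aircraft position in the navigation frame
yaw = math.radians(30.0)      # first orientation (Yaw_target2drone1)
d1 = 8.0                      # first distance from the depth map

xt1 = x_d + math.cos(yaw) * d1
yt1 = y_d + math.sin(yaw) * d1
print(round(xt1, 2), round(yt1, 2))   # 16.93 9.0
```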
  • Step 304: Determine the visual frame framing the target in the captured image of the second imaging device.
  • Step 305: Determine a second distance and a second orientation between the target and the aircraft based on the visual frame.
  • In application, for the specific implementation of determining the distance between the target and the aircraft based on the visual frame (referred to herein as the second distance), reference may be made to the related descriptions in existing related solutions, which are not repeated here.
  • For the specific implementation of determining the orientation of the target relative to the aircraft based on the visual frame (referred to herein as the second orientation), reference may be made to the related description in step 102, which is not repeated here.
  • Step 306: Determine a second coordinate of the target in the navigation coordinate system based on the second distance and the second orientation.
  • In application, the specific implementation of determining the coordinate of the target in the navigation coordinate system based on the second distance and the second orientation (referred to herein as the second coordinate) is similar to that of determining the first coordinate based on the first distance and the first orientation, and is not repeated here.
  • It should be noted that there is no necessary ordering between steps 301-303 and steps 304-306: steps 301-303 may be performed first and steps 304-306 afterwards; steps 304-306 may be performed first and steps 301-303 afterwards; or the two groups may be performed concurrently.
  • Step 307: After switching from the near-field state to the far-field state, and/or in the near-field state and the far-field state, control the flight of the aircraft based on the first coordinate and/or the second coordinate together with the coordinate of the aircraft in the navigation coordinate system.
  • In some embodiments, the far-field state is determined when the proportion of the captured image occupied by the target's visual frame is less than the preset first percentage threshold, and/or the distance between the target and the aircraft is greater than the preset first distance; together with the near-field definition above, this yields the simple classification sketched below.
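The two thresholds partition the states roughly as in this sketch; the threshold values are placeholders, since the patent leaves the "first percentage threshold" and "first distance" as presets.

```python
def flight_state(frame_ratio, distance_m, ratio_thr=0.05, dist_thr=10.0):
    """Classify near-field vs far-field from the visual frame's share of the
    captured image and the target's distance; thresholds are assumed presets."""
    if frame_ratio >= ratio_thr or distance_m <= dist_thr:
        return "near-field"
    return "far-field"
```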
  • In application, after switching from the near-field state to the far-field state, the flight of the aircraft may be controlled based on the first coordinate and/or the second coordinate together with the coordinate of the aircraft in the navigation coordinate system.
  • In the near-field state and the far-field state, the flight of the aircraft may likewise be controlled based on the first coordinate and/or the second coordinate together with the coordinate of the aircraft in the navigation coordinate system.
  • For the specific implementation of controlling the flight based on the first coordinate and the aircraft's coordinate in the navigation coordinate system, or based on the second coordinate and the aircraft's coordinate in the navigation coordinate system, reference may be made to the related descriptions in the above method embodiments, which are not repeated here.
  • As one embodiment, controlling the flight of the aircraft based on the first coordinate and the second coordinate together with the aircraft's coordinate in the navigation coordinate system may include: fusing the first coordinate and the second coordinate with a filter; and controlling the flight of the aircraft based on the fused coordinate and the coordinate of the aircraft in the navigation coordinate system.
  • Specifically, considering that the coordinate of the target determined via the depth map or via the visual frame always deviates somewhat from the target's true coordinate, i.e., contains noise, the first coordinate and the second coordinate may, once both have been determined, be fused by a filter to improve coordinate accuracy, and the flight of the aircraft controlled based on the fused coordinate and the aircraft's coordinate in the navigation coordinate system.
  • In one example, the above filter can be a Kalman filter.
  • Correspondingly, fusing the first coordinate and the second coordinate with the filter may include: in the mode in which the aircraft follows the target, acquiring the type of the target and determining the state equation of the Kalman filter based on that type; and fusing the first coordinate and the second coordinate with the Kalman filter whose state equation has been determined.
  • Specifically, different target types correspond to different Kalman-filter state equations. Therefore, when the Kalman filter is to be used for noise filtering, the type of the target must first be determined, along with the state equation of the Kalman filter corresponding to that type.
  • For example, if the type of the target is a car, a bicycle (kinematic) model can be used; if the type of the target is a pedestrian, a uniform-acceleration motion model can be used.
  • Accordingly, in the mode in which the aircraft follows the target, before coordinate fusion with the Kalman filter, the type of the target can first be acquired and the state equation of the Kalman filter determined based on it; the first coordinate and the second coordinate are then fused with the Kalman filter whose state equation has been determined.
  • For example, assuming the type of the target is a pedestrian, the uniform-acceleration motion model can be used:
    x(n) = A * x(n-1) + B * u(n) + w(n)    (1)
    z(n) = H(n) * x(n) + v(n)    (2)
  • where x(n) is the system state vector, u(n) is the drive input vector, w(n) is the estimation noise, and A and B are constant coefficient matrices, i.e., the state equation in state space; z(n) is the observation (i.e., the measurement), H(n) is the observation vector, and v(n) is the observation noise.
  • The state-estimate equations are:
    x(n|n-1) = A * x(n-1|n-1) + B * u(n)    (3)
    x(n|n) = x(n|n-1) + K(n) * (z(n) - H(n) * x(n|n-1))
  • where x(n-1|n-1) is the mean of the optimal estimate at time n-1, x(n|n-1) is the mean of the predicted estimate at time n, and x(n|n) is the mean of the optimal estimate at time n.
  • The minimum mean-square-error matrix is:
    P(n|n-1) = A * P(n-1|n-1) * A^T + Q    (4)
  • where P(n-1|n-1) is the optimal estimate of the variance matrix at time n-1, P(n|n-1) is the predicted variance matrix at time n, and P(n|n) is the optimal estimate of the variance matrix at time n.
  • The Kalman-gain equation is:
    K(n) = P(n|n-1) * H^T(n) * (H(n) * P(n|n-1) * H^T(n) + R(n))^(-1)    (5)
  • where P(n|n-1) * H^T(n) is the estimated minimum mean-square error at time n, R(n) is the measurement error at time n, and R(n) + H(n) * P(n|n-1) * H^T(n) is the total error at time n.
  • It should be noted that, in the embodiments of the present invention, even when the flight of the aircraft is controlled using only the first coordinate and the aircraft's coordinate in the navigation coordinate system, or only the second coordinate and the aircraft's coordinate, a filter (such as a Kalman filter) can also be used to filter the first and second coordinates, so as to improve the accuracy of the target's coordinate in the navigation coordinate system and the accuracy of the flight control of the aircraft.
  • It should be appreciated that the above filter is not limited to a Kalman filter; for example, the filter may also be a Butterworth filter, whose specific implementation is not described here.
  • In addition, when the target is provided with a GPS device or a UWB (Ultra-Wideband) positioning device, the coordinate of the target in the navigation coordinate system can be determined directly by that GPS or UWB device.
  • Alternatively, when a lidar is provided on the aircraft, the coordinate of the target in the navigation coordinate system may also be acquired by the lidar device; its specific implementation is not described here.
  • Embodiment 4 provides the structure of a flight control device.
  • FIG. 4 is a structural diagram of the flight control device according to Embodiment 4 of the present invention; the device corresponds to the method flow shown in FIG. 1.
  • As shown in FIG. 4, the apparatus can include a processor 401 and a memory 402.
  • The processor 401 is configured to determine a first distance of the target relative to the aircraft based on the depth map acquired by the first imaging device.
  • The processor 401 is further configured to determine a first orientation of the target relative to the aircraft.
  • The memory 402 is configured to store the first distance and the first orientation.
  • The processor 401 is further configured to control the flight of the aircraft based on the first distance and the first orientation.
  • As one embodiment, the processor 401 is specifically configured to determine the target in the depth map, and to determine the first distance of the target relative to the aircraft based on the depth map.
  • As one embodiment, the processor 401 is specifically configured to cluster the pixels in the depth map, identify the target based on the shape and/or size of the clustered point clouds, determine the position of the target in the depth map, and determine the first orientation of the target relative to the aircraft based on the position of the target in the depth map.
  • In one example, the aircraft is further provided with a second imaging device; correspondingly, the processor 401 is specifically configured to determine the visual frame framing the target in the captured image of the second imaging device, rotation-map the visual frame from the captured image onto the depth map, and determine the position of the target in the depth map based on the mapped visual frame.
  • As one embodiment, the aircraft is further provided with a second imaging device; correspondingly, the processor 401 is specifically configured to determine the visual frame framing the target in the captured image of the second imaging device, and to determine the first orientation of the target relative to the aircraft based on the position of the visual frame in the captured image.
  • As one embodiment, the processor 401 is specifically configured to determine the target in a grayscale image acquired by the first imaging device, the depth map being determined based on the grayscale image, and to determine the first orientation of the target relative to the aircraft based on the position of the target in the grayscale image.
  • In one example, the aircraft is further provided with a second imaging device; correspondingly, the processor 401 is specifically configured to determine the visual frame framing the target in the captured image of the second imaging device, rotation-map the visual frame from the captured image onto the grayscale image, and determine the target in the grayscale image based on the mapped visual frame.
  • In another example, the processor 401 is specifically configured to identify the target in the grayscale image using image recognition.
  • As one embodiment, the processor 401 is specifically configured to determine a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation, and to control the flight of the aircraft based on the coordinate of the aircraft in the navigation coordinate system and the first coordinate; the memory 402 is further configured to store the first coordinate.
  • It can be seen that, in the present invention, the processor 401 determines the distance of the target relative to the aircraft from the depth map acquired by the first imaging device, determines the orientation of the target relative to the aircraft, and then controls the flight of the aircraft according to that distance and orientation. This achieves flight control of the aircraft without a remote controller and improves the efficiency of flight control; determining the distance of the target relative to the aircraft from the depth map improves the accuracy of the determined distance and, in turn, the accuracy of the flight control of the aircraft.
  • Embodiment 4 has been described above.
  • On the basis of Embodiment 4 above, and corresponding to Embodiment 2, the processor 401 is specifically configured to control the flight of the aircraft based on the first distance and the first orientation in the near-field state, when the target is located within the field of view of the first imaging device.
  • In one example, the aircraft is further provided with a second imaging device; the processor 401 is further configured to, in the near-field state, when the target disappears from the field of view of the first imaging device and the target is present within the field of view of the second imaging device, determine the visual frame framing the target in the captured image of the second imaging device, determine the current orientation of the target relative to the aircraft based on the visual frame, and update the first coordinate of the target in the navigation coordinate system according to the most recently determined first coordinate of the target in the navigation coordinate system and the current orientation.
  • On the basis of Embodiment 4 above, and corresponding to Embodiment 3, the aircraft is further provided with a second imaging device; correspondingly, the processor 401 is further configured to determine the visual frame framing the target in the captured image of the second imaging device, and to determine a second distance and a second orientation between the target and the aircraft based on the visual frame; the processor 401 is further configured to determine a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation, and a second coordinate of the target in the navigation coordinate system based on the second distance and the second orientation; the memory 402 is further configured to store the second coordinate; and the processor 401 is further configured to, after switching from the near-field state to the far-field state, and/or in the near-field state and the far-field state, control the flight of the aircraft based on the first coordinate and/or the second coordinate together with the coordinate of the aircraft in the navigation coordinate system.
  • As one embodiment, the processor 401 is specifically configured to fuse the first coordinate and the second coordinate with a filter, and to control the flight of the aircraft based on the fused coordinate and the coordinate of the aircraft in the navigation coordinate system.
  • The memory 402 is also used to store the fused coordinate.
  • In one example, the filter is a Kalman filter.
  • Correspondingly, the processor 401 is further configured to, in the mode in which the aircraft follows the target, acquire the type of the target, determine the state equation of the Kalman filter based on the type of the target, and fuse the first coordinate and the second coordinate with the Kalman filter whose state equation has been determined.
  • It should be noted that the flight control device shown in FIG. 4 can be mounted on an aircraft (such as a drone).
  • FIG. 5 shows an aircraft equipped with the flight control device. As shown in FIG. 5, the aircraft includes a fuselage 501, a power system 502, a first imaging device 503, and the flight control device described above (labeled 504).
  • The power system 502 is mounted on the fuselage to provide flight power.
  • The power system 502 includes at least one of the following: a motor 505, a propeller 506, and an electronic speed controller 507.
  • The aircraft further includes a second imaging device 508 and a supporting device 509.
  • The supporting device 509 may specifically be a gimbal, and the second imaging device 508 is fixedly connected to the aircraft through the supporting device 509.
  • Embodiment 5 provides a machine-readable storage medium storing computer instructions which, when executed, perform the following processing: determining a first distance of the target relative to the aircraft based on the depth map acquired by a first imaging device; determining a first orientation of the target relative to the aircraft; and controlling the flight of the aircraft based on the first distance and the first orientation.
  • In one embodiment, the computer instructions, when executed, perform the following processing: determining the target in the depth map; and determining the first distance of the target relative to the aircraft based on the depth map.
  • In one embodiment, the computer instructions, when executed, perform the following processing: clustering the pixels in the depth map and identifying the target based on the shape and/or size of the clustered point clouds; determining the position of the target in the depth map; and determining the first orientation of the target relative to the aircraft based on the position of the target in the depth map.
  • In one embodiment, the computer instructions, when executed, perform the following processing: determining the visual frame framing the target in the captured image of the second imaging device; rotation-mapping the visual frame from the captured image onto the depth map; and determining the position of the target in the depth map based on the visual frame mapped onto the depth map.
  • In one embodiment, the computer instructions, when executed, perform the following processing: determining the visual frame framing the target in the captured image of the second imaging device; and determining the first orientation of the target relative to the aircraft based on the position of the visual frame in the captured image.
  • In one embodiment, the computer instructions, when executed, perform the following processing: determining the target in a grayscale image acquired by the first imaging device, the depth map being determined based on the grayscale image; and determining the first orientation of the target relative to the aircraft based on the position of the target in the grayscale image.
  • In one embodiment, the computer instructions, when executed, perform the following processing: determining the visual frame framing the target in the captured image of the second imaging device; rotation-mapping the visual frame from the captured image onto the grayscale image; and determining the target in the grayscale image based on the visual frame mapped onto the grayscale image.
  • In one embodiment, the computer instructions, when executed, perform the following processing: identifying the target in the grayscale image using image recognition.
  • In one embodiment, the computer instructions, when executed, perform the following processing: determining a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation; and controlling the flight of the aircraft based on the coordinate of the aircraft in the navigation coordinate system and the first coordinate.
  • In one embodiment, the computer instructions, when executed, perform the following processing: in the mode in which the aircraft follows the target, controlling the aircraft's following of the target based on the first distance and the first orientation; and/or, in the mode in which the aircraft is controlled by the target's gestures, controlling the aircraft's response to the target's gesture control commands based on the first distance and the first orientation.
  • In one embodiment, the computer instructions, when executed, perform the following processing: in the near-field state, and when the target is located within the field of view of the first imaging device, controlling the flight of the aircraft based on the first distance and the first orientation.
  • In one embodiment, the computer instructions, when executed, further perform the following processing: in the near-field state, when the target disappears from the field of view of the first imaging device, determining the current orientation of the target relative to the aircraft based on the visual frame; and updating the first coordinate of the target in the navigation coordinate system according to the most recently determined first coordinate of the target in the navigation coordinate system and the current orientation.
  • In one embodiment, the computer instructions, when executed, further perform the following processing: determining the visual frame framing the target in the captured image of the second imaging device; determining a second distance and a second orientation between the target and the aircraft based on the visual frame; determining a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation; determining a second coordinate of the target in the navigation coordinate system based on the second distance and the second orientation; and, after switching from the near-field state to the far-field state, and/or in the near-field state and the far-field state, controlling the flight of the aircraft based on the first coordinate and/or the second coordinate together with the coordinate of the aircraft in the navigation coordinate system.
  • In one embodiment, the computer instructions, when executed, perform the following processing: fusing the first coordinate and the second coordinate with a filter; and controlling the flight of the aircraft based on the fused coordinate and the coordinate of the aircraft in the navigation coordinate system.
  • In one embodiment, the computer instructions, when executed, perform the following processing: in the mode in which the aircraft follows the target, acquiring the type of the target and determining the state equation of the Kalman filter based on the type of the target; and fusing the first coordinate and the second coordinate with the Kalman filter whose state equation has been determined.
  • the device embodiment since it basically corresponds to the method embodiment, reference may be made to the partial description of the method embodiment.
  • the device embodiments described above are merely illustrative, wherein the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, ie may be located A place, or it can be distributed to multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment. Those of ordinary skill in the art can understand and implement without any creative effort.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Geometry (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A flight control method, apparatus, and machine-readable storage medium: a distance of a target relative to an aircraft is determined from a depth map acquired by a first imaging device (101); an orientation of the target relative to the aircraft is determined (102); and the flight of the aircraft is controlled according to the distance and orientation of the target relative to the aircraft (103). Flight control of the aircraft is achieved without a remote controller, improving the efficiency of flight control; determining the distance of the target relative to the aircraft from a depth map improves the accuracy of the determined distance and thus the precision of the flight control of the aircraft.

Description

Flight control method, apparatus and machine-readable storage medium
Technical Field
Embodiments of the present invention relate to image processing technology, and in particular to flight control methods, apparatus, and machine-readable storage media.
Background
The mainstream way of controlling an aircraft has long been through a remote controller, whose sticks control the aircraft's movement forward and backward, left and right, up and down, and its rotation. Controlling the flight of an aircraft with a remote controller has many limitations: for example, the remote controller must be carried along, and a problem with the remote controller renders the aircraft unusable as well.
Therefore, how to free the aircraft from its dependence on the remote controller, so that the aircraft responds to the actions of a designated target, such as movements and gestures, and performs the corresponding flight maneuvers, has become a popular research direction in the field of aircraft flight control.
Summary
Embodiments of the present invention disclose flight control methods, apparatus, and machine-readable storage media to improve the efficiency and accuracy of the flight control of an aircraft.
One aspect of the embodiments of the present invention provides a flight control method applied to an aircraft provided with a first imaging device, the method comprising: determining a first distance of a target relative to the aircraft based on a depth map acquired by the first imaging device; determining a first orientation of the target relative to the aircraft; and controlling the flight of the aircraft based on the first distance and the first orientation.
One aspect of the embodiments of the present invention provides a flight control device applied to an aircraft provided with a first imaging device, the flight control device comprising: a processor configured to determine a first distance of a target relative to the aircraft based on a depth map acquired by the first imaging device, the processor being further configured to determine a first orientation of the target relative to the aircraft; and a memory configured to store the first distance and the first orientation; the processor being further configured to control the flight of the aircraft based on the first distance and the first orientation.
One aspect of the embodiments of the present invention provides a machine-readable storage medium having stored thereon computer instructions which, when executed, perform the following processing: determining a first distance of a target relative to the aircraft based on a depth map acquired by a first imaging device; determining a first orientation of the target relative to the aircraft; and controlling the flight of the aircraft based on the first distance and the first orientation.
In summary, in the embodiments of the present invention, the distance of the target relative to the aircraft is determined based on the depth map acquired by the first imaging device, the orientation of the target relative to the aircraft is determined, and the flight of the aircraft is then controlled according to the distance and orientation of the target relative to the aircraft. This achieves flight control of the aircraft without a remote controller and improves the efficiency of flight control; determining the distance of the target relative to the aircraft from the depth map improves the accuracy of the determined distance, which in turn improves the precision of the flight control of the aircraft.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a flowchart of a flight control method according to Embodiment 1 of the present invention;
FIG. 2 is a flowchart of a flight control method according to Embodiment 2 of the present invention;
FIG. 3 is a flowchart of a flight control method according to Embodiment 3 of the present invention;
FIG. 4 is a structural diagram of a flight control device according to Embodiment 4 of the present invention;
FIG. 5 is a structural diagram of an aircraft equipped with the flight control device according to Embodiment 5 of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
It should be noted that when a component is referred to as being "fixed to" another component, it can be directly on the other component or an intervening component may be present. When a component is considered to be "connected to" another component, it can be directly connected to the other component or an intervening component may be present at the same time.
Unless otherwise defined, all technical and scientific terms used herein have the same meanings as commonly understood by those skilled in the technical field of the present invention. The terms used in the specification of the present invention are for the purpose of describing specific embodiments only and are not intended to limit the present invention. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Some embodiments of the present invention are described in detail below with reference to the drawings. In the absence of conflict, the following embodiments and the features in them may be combined with one another.
Embodiment 1:
An embodiment of the present invention provides a flight control method. Referring to FIG. 1, FIG. 1 is a schematic flowchart of the flight control method according to an embodiment of the present invention. The process is applied to an aircraft, such as a drone, provided with a first imaging device. As one embodiment, the first imaging device includes, but is not limited to, an imaging device capable of acquiring a depth map, such as a binocular camera or a TOF (Time of Flight) camera, and may be fixed on the aircraft.
As shown in FIG. 1, the method of this embodiment may include the following steps:
Step 101: Determine a first distance of the target relative to the aircraft based on the depth map acquired by the first imaging device.
In application, to determine the distance of the target from the aircraft (referred to herein as the first distance) based on the depth map acquired by the first imaging device (hereinafter simply "the depth map"), the target may first be determined in the depth map, and the first distance of the target relative to the aircraft may then be determined based on the depth map.
As one embodiment, after the depth map is acquired by the first imaging device, the depth map may be clustered so that different pixels of the depth map are grouped into different point clouds, and the target is then identified based on the shape and/or size of the clustered point clouds.
As another embodiment, a second imaging device, including but not limited to a digital still camera or a digital video camera, may also be provided on the aircraft. The second imaging device may be fixedly connected to a gimbal provided on the aircraft, may move as the gimbal moves, and may transmit its captured images in real time to a designated terminal device, such as the mobile terminal of the aircraft's user.
In this embodiment, to determine the target in the depth map, a visual frame framing the target may first be determined in the captured image of the second imaging device.
In one example, in the mode in which the aircraft follows a target, the user may designate the target in the captured image displayed on the designated terminal device, whereupon a visual frame corresponding to the target is generated.
In another example, in the mode in which the aircraft follows a target, all candidate targets and their types may be identified in the captured image of the second imaging device by image recognition. When only one candidate target exists in the captured image, it may be directly determined as the target to follow and a corresponding visual frame generated; when multiple candidate targets exist, the target to follow may be determined according to a preset strategy and a corresponding visual frame generated, e.g., the front-most candidate target, the middle-most candidate target, or the rear-most candidate target is determined as the target to follow.
In this embodiment, after the visual frame framing the target is determined in the captured image of the second imaging device, the visual frame may be rotation-mapped onto the depth map, and the target determined in the depth map based on the visual frame mapped onto it.
For example, among the point clouds obtained by clustering the pixels of the depth map, the point cloud having the largest overlapping area with the visual frame mapped onto the depth map may be determined as the target.
Step 102: Determine a first orientation of the target relative to the aircraft.
In application, to determine the positional relationship between the aircraft and the target, the orientation of the target relative to the aircraft needs to be determined in addition to the distance of the target relative to the aircraft.
As one embodiment, the pixels of the depth map may be clustered, the target identified based on the shape and/or size of the clustered point clouds, and the position of the target in the depth map determined; the orientation of the target relative to the aircraft (referred to herein as the first orientation) is then determined based on the target's position in the depth map.
As another embodiment, the visual frame framing the target in the captured image of the second imaging device may be determined in the manner described in step 101, and the first orientation of the target relative to the aircraft then determined based on the position of the visual frame in the captured image.
For example, the angle corresponding to two adjacent pixels may be determined from the field of view (FOV) of the second imaging device and the resolution of its captured image; the pixel offset between the center of the visual frame and the center of the captured image is then determined from the pixel coordinates of the visual-frame center in the captured image, yielding the deviation angle of the target from the optical axis of the second imaging device. Since the second imaging device is fixedly connected to the gimbal, the attitude angle of the gimbal is the attitude angle of the optical axis of the second imaging device, and the finally determined first orientation of the target relative to the aircraft may be the sum of the gimbal's attitude angle and the deviation angle of the target from the optical axis of the second imaging device.
In yet another embodiment, the target may be determined in a grayscale image acquired by the first imaging device, and the first orientation of the target relative to the aircraft determined based on the position of the target in the grayscale image.
In one example, to determine the target in the grayscale image acquired by the first imaging device, the visual frame framing the target in the captured image of the second imaging device may first be determined in the manner described in step 101 and rotation-mapped onto the grayscale image; the target is then determined in the grayscale image based on the mapped visual frame.
In another example, to determine the target in the grayscale image acquired by the first imaging device, the target may be identified in the grayscale image directly by image recognition.
Step 103: Control the flight of the aircraft based on the first distance and the first orientation.
In application, after the first distance and the first orientation of the target relative to the aircraft have been determined, the flight of the aircraft can be controlled based on them.
As one embodiment, in the mode in which the aircraft follows the target, the aircraft's following of the target may be controlled based on the first distance and the first orientation.
As another embodiment, in the mode in which the aircraft is controlled by the target's gestures, the aircraft's response to the target's gesture control commands may be controlled based on the first distance and the first orientation.
As can be seen from steps 101 to 103 above, in the present invention, the distance of the target relative to the aircraft is determined based on the depth map acquired by the first imaging device, the orientation of the target relative to the aircraft is determined, and the flight of the aircraft is then controlled according to that distance and orientation, achieving flight control of the aircraft without a remote controller and improving the efficiency of flight control; determining the distance of the target relative to the aircraft from the depth map improves the accuracy of the determined distance and thus the precision of the flight control of the aircraft.
Embodiment 1 shown in FIG. 1 has been described above.
Embodiment 2
On the basis of the embodiment shown in FIG. 1, Embodiment 2 of the present invention provides another flight control method. FIG. 2 is a flowchart of the flight control method according to Embodiment 2 of the present invention. As shown in FIG. 2, on the basis of Embodiment 1 shown in FIG. 1, the method of Embodiment 2 may include the following steps:
Step 201: Determine a first distance of the target relative to the aircraft based on the depth map acquired by the first imaging device.
Step 201 is similar to step 101 and is not described again.
Step 202: Determine a first orientation of the target relative to the aircraft.
Step 202 is similar to step 102 and is not described again.
Step 203: In the near-field state, and when the target is located within the field of view of the first imaging device, control the flight of the aircraft based on the first distance and the first orientation.
Step 203 is one specific implementation of step 103 above.
In some embodiments, the near-field state is determined when the proportion of the captured image occupied by the target's visual frame is greater than or equal to a preset first percentage threshold, and/or when the distance between the target and the aircraft is less than or equal to a preset first distance.
Specifically, step 203 takes into account that in the near-field state the accuracy of determining the first distance with the visual frame is poor, whereas the depth map performs well in the near-field state, so the distance of the target relative to the aircraft derived from the depth map is more accurate.
Accordingly, in this embodiment, in the near-field state, and when the target is within the field of view of the first imaging device, the flight of the aircraft can be controlled based on the first distance and the first orientation.
As one embodiment, in the near-field state, when the target disappears from the field of view of the first imaging device, the current orientation of the target relative to the aircraft may be determined based on the visual frame; the first coordinate of the target in the navigation coordinate system may then be updated from the most recently determined first coordinate of the target in the navigation coordinate system and the current orientation.
Here, the first coordinate of the target in the navigation coordinate system is the coordinate of the target in the navigation coordinate system determined based on the first distance and the first orientation; the specific manner of determination is described below.
In this embodiment, in the near-field state, when the target disappears from the field of view of the first imaging device, the orientation of the target relative to the aircraft determined with the visual frame must be used to maintain the coordinate of the target in the navigation coordinate system.
Specifically, when the target disappears from the field of view of the first imaging device but is present in the field of view of the second imaging device, the orientation of the target relative to the aircraft may be determined with the visual frame in the manner described in step 102, i.e., the visual frame framing the target is determined in the captured image of the second imaging device, and the orientation of the target relative to the aircraft is determined based on the position of the visual frame in the captured image.
After the current orientation of the target relative to the aircraft has been determined, the first coordinate of the target in the navigation coordinate system may be updated from the most recently determined first coordinate and the current orientation.
For example, suppose the last first coordinate determined before the target disappeared from the field of view of the first imaging device is (Xe1, Ye1), and the current orientation of the target relative to the aircraft determined with the visual frame is Yaw_target2drone2. The first coordinate (Xe2, Ye2) after the first update is then:
Xe2 = Xd1 + cos(Yaw_target2drone2) * d_pre1
Ye2 = Yd1 + sin(Yaw_target2drone2) * d_pre1
where (Xd1, Yd1) is the coordinate of the aircraft in the navigation coordinate system at the time the first coordinate (Xe1, Ye1) was determined, which can be obtained by fusing GPS (Global Positioning System) and VO (Visual Odometry), and d_pre1 is the distance of the target relative to the aircraft determined last before the target disappeared from the field of view of the first imaging device, i.e., the distance between (Xe1, Ye1) and (Xd1, Yd1) in the navigation coordinate system.
In this embodiment, after the first coordinate has been updated in this way, the distance of the target relative to the aircraft may be updated from the updated first coordinate and the latest coordinate of the aircraft in the navigation coordinate system, and the first coordinate updated once more from this updated distance and the latest current orientation of the target relative to the aircraft determined with the visual frame.
For example, if the updated first coordinate is (Xe2, Ye2) and the latest coordinate of the aircraft in the navigation coordinate system is (Xd2, Yd2), the updated distance d_pre2 of the target relative to the aircraft is the distance between (Xe2, Ye2) and (Xd2, Yd2) in the navigation coordinate system. If the latest current orientation of the target relative to the aircraft determined with the visual frame is now Yaw_target2drone3, the first coordinate (Xe3, Ye3) after this further update is:
Xe3 = Xd2 + cos(Yaw_target2drone3) * d_pre2
Ye3 = Yd2 + sin(Yaw_target2drone3) * d_pre2
In this way, in the near-field state, the first coordinate of the target in the navigation coordinate system can be updated continuously until the target returns to the field of view of the first imaging device. A sketch of this dead-reckoning loop follows.
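The following is a minimal sketch of that maintenance loop, assuming per-step inputs of the aircraft's fused GPS/VO position and the visual-frame yaw; variable names mirror the formulas above, and the sample values are illustrative.

```python
import math

def update_first_coordinate(target_xy, drone_xy, yaw):
    """One dead-reckoning step while the target is outside the depth camera's FOV.

    target_xy: previous first coordinate (Xe, Ye) in the navigation frame
    drone_xy:  aircraft coordinate (Xd, Yd) used for this update
    yaw:       latest target orientation from the visual frame, in radians
    """
    # d_pre: distance between the previous target coordinate and the aircraft.
    d_pre = math.hypot(target_xy[0] - drone_xy[0], target_xy[1] - drone_xy[1])
    # Xe' = Xd + cos(yaw) * d_pre ; Ye' = Yd + sin(yaw) * d_pre
    return (drone_xy[0] + math.cos(yaw) * d_pre,
            drone_xy[1] + math.sin(yaw) * d_pre)

# maintain the estimate while the target is out of the depth camera's FOV
target = (12.0, 7.0)                                  # last (Xe1, Ye1)
for drone_xy, yaw in [((10.0, 5.0), 0.6), ((10.5, 5.2), 0.7)]:
    target = update_first_coordinate(target, drone_xy, yaw)
print(target)
```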
Embodiment 2 shown in FIG. 2 has been described above.
Embodiment 3
On the basis of the embodiment shown in FIG. 1 or FIG. 2, Embodiment 3 of the present invention provides another flight control method. FIG. 3 is a flowchart of the flight control method according to Embodiment 3 of the present invention. As shown in FIG. 3, on the basis of Embodiment 1 shown in FIG. 1 or Embodiment 2 shown in FIG. 2, the method of Embodiment 3 may include the following steps:
Step 301: Determine a first distance of the target relative to the aircraft based on the depth map acquired by the first imaging device.
Step 301 is similar to step 101 and is not described again.
Step 302: Determine a first orientation of the target relative to the aircraft.
Step 302 is similar to step 102 and is not described again.
Step 303: Determine a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation.
In application, after the first distance and the first orientation have been determined, the coordinate of the target in the navigation coordinate system (referred to herein as the first coordinate) (Xt1, Yt1) may be determined according to the following formulas:
Xt1 = Xd + cos(Yaw_target2drone1) * d1
Yt1 = Yd + sin(Yaw_target2drone1) * d1
where (Xd, Yd) is the coordinate of the aircraft in the navigation coordinate system, which can be obtained by fusing GPS and VO, Yaw_target2drone1 is the first orientation, and d1 is the first distance.
Step 304: Determine the visual frame framing the target in the captured image of the second imaging device.
In application, for the specific implementation of determining the visual frame framing the target in the captured image of the second imaging device, reference may be made to the related description in step 101, which is not repeated here.
Step 305: Determine a second distance and a second orientation between the target and the aircraft based on the visual frame.
In application, for the specific implementation of determining the distance between the target and the aircraft based on the visual frame (referred to herein as the second distance), reference may be made to the related descriptions in existing related solutions, which are not elaborated here.
For the specific implementation of determining the orientation of the target relative to the aircraft based on the visual frame (referred to herein as the second orientation), reference may be made to the related description in step 102, which is not repeated here.
Step 306: Determine a second coordinate of the target in the navigation coordinate system based on the second distance and the second orientation.
In application, the specific implementation of determining the coordinate of the target in the navigation coordinate system based on the second distance and the second orientation (referred to herein as the second coordinate) is similar to that of determining the first coordinate based on the first distance and the first orientation, and is not repeated here.
It should be noted that in this embodiment there is no necessary ordering between steps 301-303 and steps 304-306: steps 301-303 may be performed first and steps 304-306 afterwards; steps 304-306 may be performed first and steps 301-303 afterwards; or the two groups may be performed concurrently.
Step 307: After switching from the near-field state to the far-field state, and/or in the near-field state and the far-field state, control the flight of the aircraft based on the first coordinate and/or the second coordinate together with the coordinate of the aircraft in the navigation coordinate system.
In some embodiments, the far-field state is determined when the proportion of the captured image occupied by the target's visual frame is less than the preset first percentage threshold, and/or the distance between the target and the aircraft is greater than the preset first distance.
In application, after switching from the near-field state to the far-field state, the flight of the aircraft may be controlled based on the first coordinate and/or the second coordinate together with the coordinate of the aircraft in the navigation coordinate system.
In the near-field state and the far-field state, the flight of the aircraft may likewise be controlled based on the first coordinate and/or the second coordinate together with the coordinate of the aircraft in the navigation coordinate system.
For the specific implementation of controlling the flight of the aircraft based on the first coordinate and the coordinate of the aircraft in the navigation coordinate system, or based on the second coordinate and the coordinate of the aircraft in the navigation coordinate system, reference may be made to the related descriptions in the above method embodiments, which are not repeated here.
As one embodiment, controlling the flight of the aircraft based on the first coordinate and the second coordinate together with the coordinate of the aircraft in the navigation coordinate system may include:
fusing the first coordinate and the second coordinate with a filter; and
controlling the flight of the aircraft based on the fused coordinate and the coordinate of the aircraft in the navigation coordinate system.
Specifically, considering that the coordinate of the target determined via the depth map or via the visual frame always deviates somewhat from the target's true coordinate, i.e., contains noise, in order to improve the accuracy of the target's coordinate, the first coordinate and the second coordinate may, once both have been determined, be fused by a filter, and the flight of the aircraft controlled based on the fused coordinate and the coordinate of the aircraft in the navigation coordinate system.
In one example, the above filter may be a Kalman filter.
Correspondingly, fusing the first coordinate and the second coordinate with the filter may include:
in the mode in which the aircraft follows the target, acquiring the type of the target and determining the state equation of the Kalman filter based on the type of the target; and
fusing the first coordinate and the second coordinate with the Kalman filter whose state equation has been determined.
Specifically, since different target types correspond to different Kalman-filter state equations when the Kalman filter is used for noise filtering, the type of the target must first be determined when the Kalman filter is to be used, along with the state equation of the Kalman filter corresponding to that type.
For example, if the type of the target is a car, a bicycle (kinematic) model can be used; if the type of the target is a pedestrian, a uniform-acceleration motion model can be used.
Accordingly, in the mode in which the aircraft follows the target, before coordinate fusion with the Kalman filter, the type of the target can first be acquired and the state equation of the Kalman filter determined based on it; the first coordinate and the second coordinate are then fused with the Kalman filter whose state equation has been determined.
For example, assuming the type of the target is a pedestrian, the uniform-acceleration motion model can be used:
x(n) = A * x(n-1) + B * u(n) + w(n)    (1)
z(n) = H(n) * x(n) + v(n)    (2)
where x(n) is the system state vector, u(n) is the drive input vector, w(n) is the estimation noise, and A and B are constant coefficient matrices, i.e., the state equation in state space; z(n) is the observation (i.e., the measurement), H(n) is the observation vector, and v(n) is the observation noise.
The state-estimate equations are:
x(n|n-1) = A * x(n-1|n-1) + B * u(n)    (3)
x(n|n) = x(n|n-1) + K(n) * (z(n) - H(n) * x(n|n-1))
where x(n-1|n-1) is the mean of the optimal estimate at time n-1, x(n|n-1) is the mean of the predicted estimate at time n, and x(n|n) is the mean of the optimal estimate at time n.
The minimum mean-square-error matrix is:
P(n|n-1) = A * P(n-1|n-1) * A^T + Q    (4)
where P(n-1|n-1) is the optimal estimate of the variance matrix at time n-1, P(n|n-1) is the predicted variance matrix at time n, and P(n|n) is the optimal estimate of the variance matrix at time n.
The Kalman-gain equation is:
K(n) = P(n|n-1) * H^T(n) * (H(n) * P(n|n-1) * H^T(n) + R(n))^(-1)    (5)
where P(n|n-1) * H^T(n) is the estimated minimum mean-square error at time n, R(n) is the measurement error at time n, and R(n) + H(n) * P(n|n-1) * H^T(n) is the total error at time n.
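To make the fusion step concrete, here is a compact sketch: a constant-acceleration (pedestrian) Kalman filter over one position axis, run with the depth-map coordinate and the visual-frame coordinate as two successive measurements. The matrices follow the constant-acceleration model of equations (1)-(5); the time step and the noise values Q and R are illustrative assumptions, and a real implementation would track both x and y (and choose a bicycle model for cars).

```python
import numpy as np

dt = 0.1                                    # frame interval, assumed
A = np.array([[1, dt, 0.5 * dt * dt],       # constant-acceleration model:
              [0,  1, dt],                  # state = [position, velocity, accel]
              [0,  0, 1]])
H = np.array([[1.0, 0.0, 0.0]])             # only position is observed
Q = np.eye(3) * 1e-3                        # process noise (assumed)
R = np.array([[0.25]])                      # measurement noise (assumed)

x = np.zeros((3, 1))                        # state estimate
P = np.eye(3)                               # estimate covariance

def kalman_step(z):
    """One predict/update cycle for a scalar position measurement z."""
    global x, P
    x_pred = A @ x                          # eq. (3), no drive input
    P_pred = A @ P @ A.T + Q                # eq. (4)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # eq. (5)
    x = x_pred + K @ (z - H @ x_pred)       # measurement update
    P = (np.eye(3) - K @ H) @ P_pred
    return float(x[0, 0])

# fuse the first (depth map) and second (visual frame) coordinates per frame
for z_depth, z_frame in [(16.9, 17.3), (17.0, 17.6)]:
    kalman_step(np.array([[z_depth]]))
    fused = kalman_step(np.array([[z_frame]]))
print(fused)
```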
It should be noted that, in the embodiments of the present invention, even when the flight of the aircraft is controlled using only the first coordinate and the coordinate of the aircraft in the navigation coordinate system, or only the second coordinate and the coordinate of the aircraft in the navigation coordinate system, a filter (such as a Kalman filter) may also be used to filter the first coordinate and the second coordinate, so as to improve the accuracy of the target's coordinate in the navigation coordinate system and the precision of the flight control of the aircraft.
It should be appreciated that the above filter is not limited to a Kalman filter; for example, the filter may also be a Butterworth filter, whose specific implementation is not elaborated here.
In addition, in the embodiments of the present invention, when the target is provided with a GPS device or a UWB (Ultra-Wideband) positioning device, the coordinate of the target in the navigation coordinate system can be determined directly by that GPS or UWB device. Alternatively, when a lidar is provided on the aircraft, the coordinate of the target in the navigation coordinate system may also be acquired by the lidar device; its specific implementation is not elaborated here.
Embodiment 3 shown in FIG. 3 has been described above.
Embodiment 4
This embodiment provides the structure of a flight control device. FIG. 4 is a structural diagram of the flight control device according to Embodiment 4 of the present invention. The device corresponds to the method flow shown in FIG. 1. As shown in FIG. 4, the device may include a processor 401 and a memory 402.
The processor 401 is configured to determine a first distance of the target relative to the aircraft based on the depth map acquired by the first imaging device. The processor 401 is further configured to determine a first orientation of the target relative to the aircraft. The memory 402 is configured to store the first distance and the first orientation. The processor 401 is further configured to control the flight of the aircraft based on the first distance and the first orientation.
As one embodiment, the processor 401 is specifically configured to determine the target in the depth map, and to determine the first distance of the target relative to the aircraft based on the depth map.
As one embodiment, the processor 401 is specifically configured to cluster the pixels in the depth map, identify the target based on the shape and/or size of the clustered point clouds, determine the position of the target in the depth map, and determine the first orientation of the target relative to the aircraft based on the position of the target in the depth map.
In one example, the aircraft is further provided with a second imaging device; correspondingly, the processor 401 is specifically configured to determine the visual frame framing the target in the captured image of the second imaging device, rotation-map the visual frame from the captured image onto the depth map, and determine the position of the target in the depth map based on the visual frame mapped onto the depth map.
As one embodiment, the aircraft is further provided with a second imaging device; correspondingly, the processor 401 is specifically configured to determine the visual frame framing the target in the captured image of the second imaging device, and to determine the first orientation of the target relative to the aircraft based on the position of the visual frame in the captured image.
As one embodiment, the processor 401 is specifically configured to determine the target in a grayscale image acquired by the first imaging device, the depth map being determined based on the grayscale image, and to determine the first orientation of the target relative to the aircraft based on the position of the target in the grayscale image.
In one example, the aircraft is further provided with a second imaging device; correspondingly, the processor 401 is specifically configured to determine the visual frame framing the target in the captured image of the second imaging device, rotation-map the visual frame from the captured image onto the grayscale image, and determine the target in the grayscale image based on the visual frame mapped onto the grayscale image.
In another example, the processor 401 is specifically configured to identify the target in the grayscale image using image recognition.
As one embodiment, the processor 401 is specifically configured to determine a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation, and to control the flight of the aircraft based on the coordinate of the aircraft in the navigation coordinate system and the first coordinate; the memory 402 is further configured to store the first coordinate.
It can be seen that, in the present invention, the processor 401 determines the distance of the target relative to the aircraft from the depth map acquired by the first imaging device and determines the orientation of the target relative to the aircraft, and then controls the flight of the aircraft according to the distance and orientation of the target relative to the aircraft, achieving flight control of the aircraft without a remote controller and improving the efficiency of flight control; determining the distance of the target relative to the aircraft from the depth map improves the accuracy of the determined distance and thus the precision of the flight control of the aircraft.
Embodiment 4 has been described above.
On the basis of Embodiment 4 above, and corresponding to Embodiment 2, the processor 401 is specifically configured to control the flight of the aircraft based on the first distance and the first orientation in the near-field state, when the target is located within the field of view of the first imaging device.
In one example, the aircraft is further provided with a second imaging device; the processor 401 is further configured to, in the near-field state, when the target disappears from the field of view of the first imaging device and the target is present within the field of view of the second imaging device, determine the visual frame framing the target in the captured image of the second imaging device, determine the current orientation of the target relative to the aircraft based on the visual frame, and update the first coordinate of the target in the navigation coordinate system according to the most recently determined first coordinate of the target in the navigation coordinate system and the current orientation.
On the basis of Embodiment 4 above, and corresponding to Embodiment 3, the aircraft is further provided with a second imaging device; correspondingly, the processor 401 is further configured to determine the visual frame framing the target in the captured image of the second imaging device, and to determine a second distance and a second orientation between the target and the aircraft based on the visual frame; the processor 401 is further configured to determine a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation, and a second coordinate of the target in the navigation coordinate system based on the second distance and the second orientation; the memory 402 is further configured to store the second coordinate; the processor 401 is further configured to, after switching from the near-field state to the far-field state, and/or in the near-field state and the far-field state, control the flight of the aircraft based on the first coordinate and/or the second coordinate together with the coordinate of the aircraft in the navigation coordinate system.
As one embodiment, the processor 401 is specifically configured to fuse the first coordinate and the second coordinate with a filter, and to control the flight of the aircraft based on the fused coordinate and the coordinate of the aircraft in the navigation coordinate system; the memory 402 is further configured to store the fused coordinate.
In one example, the filter is a Kalman filter; correspondingly, the processor 401 is further configured to, in the mode in which the aircraft follows the target, acquire the type of the target, determine the state equation of the Kalman filter based on the type of the target, and fuse the first coordinate and the second coordinate with the Kalman filter whose state equation has been determined.
It should be noted that, in the embodiments of the present invention, the flight control device shown in FIG. 4 can be mounted on an aircraft (such as a drone). FIG. 5 shows an aircraft equipped with the flight control device. As shown in FIG. 5, the aircraft includes: a fuselage 501, a power system 502, a first imaging device 503, and the flight control device described above (labeled 504).
The power system 502 is mounted on the fuselage to provide flight power. The power system 502 includes at least one of the following: a motor 505, a propeller 506, and an electronic speed controller 507.
The specific principles and implementation of the flight control device are similar to those of the above embodiments and are not repeated here.
In addition, as shown in FIG. 5, the aircraft further includes a second imaging device 508 and a supporting device 509. The supporting device 509 may specifically be a gimbal, and the second imaging device 508 is fixedly connected to the aircraft through the supporting device 509.
Embodiment 5
This embodiment provides a machine-readable storage medium having stored thereon computer instructions which, when executed, perform the following processing:
determining a first distance of the target relative to the aircraft based on the depth map acquired by a first imaging device;
determining a first orientation of the target relative to the aircraft;
controlling the flight of the aircraft based on the first distance and the first orientation.
In one embodiment, the computer instructions, when executed, perform the following processing:
determining the target in the depth map;
determining the first distance of the target relative to the aircraft based on the depth map.
In one embodiment, the computer instructions, when executed, perform the following processing:
clustering the pixels in the depth map, and identifying the target based on the shape and/or size of the clustered point clouds;
determining the position of the target in the depth map;
determining the first orientation of the target relative to the aircraft based on the position of the target in the depth map.
In one embodiment, the computer instructions, when executed, perform the following processing:
determining the visual frame framing the target in the captured image of the second imaging device;
rotation-mapping the visual frame from the captured image onto the depth map;
determining the position of the target in the depth map based on the visual frame mapped onto the depth map.
In one embodiment, the computer instructions, when executed, perform the following processing:
determining the visual frame framing the target in the captured image of the second imaging device;
determining the first orientation of the target relative to the aircraft based on the position of the visual frame in the captured image.
In one embodiment, the computer instructions, when executed, perform the following processing:
determining the target in a grayscale image acquired by the first imaging device, the depth map being determined based on the grayscale image;
determining the first orientation of the target relative to the aircraft based on the position of the target in the grayscale image.
In one embodiment, the computer instructions, when executed, perform the following processing:
determining the visual frame framing the target in the captured image of the second imaging device;
rotation-mapping the visual frame from the captured image onto the grayscale image;
determining the target in the grayscale image based on the visual frame mapped onto the grayscale image.
In one embodiment, the computer instructions, when executed, perform the following processing:
identifying the target in the grayscale image using image recognition.
In one embodiment, the computer instructions, when executed, perform the following processing:
determining a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation;
controlling the flight of the aircraft based on the coordinate of the aircraft in the navigation coordinate system and the first coordinate.
In one embodiment, the computer instructions, when executed, perform the following processing:
in the mode in which the aircraft follows the target, controlling the aircraft's following of the target based on the first distance and the first orientation; and/or,
in the mode in which the aircraft is controlled by the target's gestures, controlling the aircraft's response to the target's gesture control commands based on the first distance and the first orientation.
In one embodiment, the computer instructions, when executed, perform the following processing:
in the near-field state, and when the target is located within the field of view of the first imaging device, controlling the flight of the aircraft based on the first distance and the first orientation.
In one embodiment, the computer instructions, when executed, further perform the following processing:
in the near-field state, when the target disappears from the field of view of the first imaging device, determining the current orientation of the target relative to the aircraft based on the visual frame;
updating the first coordinate of the target in the navigation coordinate system according to the most recently determined first coordinate of the target in the navigation coordinate system and the current orientation.
In one embodiment, the computer instructions, when executed, further perform the following processing:
determining the visual frame framing the target in the captured image of the second imaging device;
determining a second distance and a second orientation between the target and the aircraft based on the visual frame;
determining a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation;
determining a second coordinate of the target in the navigation coordinate system based on the second distance and the second orientation;
after switching from the near-field state to the far-field state, and/or in the near-field state and the far-field state, controlling the flight of the aircraft based on the first coordinate and/or the second coordinate together with the coordinate of the aircraft in the navigation coordinate system.
In one embodiment, the computer instructions, when executed, perform the following processing:
fusing the first coordinate and the second coordinate with a filter;
controlling the flight of the aircraft based on the fused coordinate and the coordinate of the aircraft in the navigation coordinate system.
In one embodiment, the computer instructions, when executed, perform the following processing:
in the mode in which the aircraft follows the target, acquiring the type of the target, and determining the state equation of the Kalman filter based on the type of the target;
fusing the first coordinate and the second coordinate with the Kalman filter whose state equation has been determined.
For the device embodiments, since they basically correspond to the method embodiments, reference may be made to the partial descriptions of the method embodiments for relevant points. The device embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e., they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, which those of ordinary skill in the art can understand and implement without creative effort.
It should be noted that, herein, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. The terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
The methods and apparatus provided by the embodiments of the present invention have been described above in detail. Specific examples have been used herein to explain the principles and implementations of the present invention, and the descriptions of the above embodiments are intended only to help understand the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, there will be changes in the specific implementations and the scope of application in accordance with the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (45)

  1. A flight control method, characterized in that it is applied to an aircraft provided with a first imaging device, the method comprising:
    determining a first distance of a target relative to the aircraft based on a depth map acquired by the first imaging device;
    determining a first orientation of the target relative to the aircraft;
    controlling the flight of the aircraft based on the first distance and the first orientation.
  2. The method according to claim 1, characterized in that determining the first distance of the target relative to the aircraft based on the depth map acquired by the first imaging device comprises:
    determining the target in the depth map;
    determining the first distance of the target relative to the aircraft based on the depth map.
  3. The method according to claim 1, characterized in that determining the first orientation of the target relative to the aircraft comprises:
    clustering the pixels in the depth map, and identifying the target based on the shape and/or size of the clustered point clouds;
    determining the position of the target in the depth map;
    determining the first orientation of the target relative to the aircraft based on the position of the target in the depth map.
  4. The method according to claim 3, wherein the aircraft is further provided with a second imaging device;
    the determining the position of the target in the depth map comprises:
    determining the visual box framing the target in the image captured by the second imaging device;
    rotationally mapping the visual box on the captured image onto the depth map;
    determining the position of the target in the depth map based on the visual box mapped onto the depth map.
  5. The method according to claim 1, wherein the aircraft is further provided with a second imaging device;
    the determining the first orientation of the target relative to the aircraft comprises:
    determining the visual box framing the target in the image captured by the second imaging device;
    determining the first orientation of the target relative to the aircraft based on the position of the visual box in the captured image.
  6. The method according to claim 1, wherein the determining the first orientation of the target relative to the aircraft comprises:
    determining the target in a grayscale image acquired by the first imaging device, the depth map being determined based on the grayscale image;
    determining the first orientation of the target relative to the aircraft based on the position of the target in the grayscale image.
  7. The method according to claim 6, wherein the aircraft is further provided with a second imaging device;
    the determining the target in the grayscale image acquired by the first imaging device comprises:
    determining the visual box framing the target in the image captured by the second imaging device;
    rotationally mapping the visual box on the captured image onto the grayscale image;
    determining the target in the grayscale image based on the visual box mapped onto the grayscale image.
  8. The method according to claim 6, wherein the determining the target in the grayscale image acquired by the first imaging device comprises:
    identifying the target in the grayscale image using image recognition.
  9. The method according to claim 1, wherein the controlling the flight of the aircraft based on the first distance and the first orientation comprises:
    determining a first coordinate of the target in a navigation coordinate system based on the first distance and the first orientation;
    controlling the flight of the aircraft based on the coordinate of the aircraft in the navigation coordinate system and the first coordinate.
  10. The method according to claim 1, wherein the controlling the flight of the aircraft based on the first distance and the first orientation comprises:
    in a mode in which the aircraft follows the target, controlling the aircraft to follow the target based on the first distance and the first orientation; and/or,
    in a mode in which the aircraft is controlled by gestures of the target, controlling the aircraft to respond to gesture control instructions of the target based on the first distance and the first orientation.
  11. The method according to claim 1, wherein the controlling the flight of the aircraft based on the first distance and the first orientation comprises:
    in a near-field state and when the target is within the field of view of the first imaging device, controlling the flight of the aircraft based on the first distance and the first orientation.
  12. The method according to claim 11, wherein the aircraft is further provided with a second imaging device;
    the method further comprises:
    in the near-field state, when the target disappears from the field of view of the first imaging device and the target is present within the field of view of the second imaging device, determining the visual box framing the target in the image captured by the second imaging device;
    determining a current orientation of the target relative to the aircraft based on the visual box;
    updating the first coordinate of the target in the navigation coordinate system according to the most recently determined first coordinate of the target in the navigation coordinate system and the current orientation.
  13. The method according to claim 11, wherein the aircraft is further provided with a second imaging device;
    the method further comprises:
    determining the visual box framing the target in the image captured by the second imaging device;
    determining a second distance and a second orientation of the target relative to the aircraft based on the visual box;
    determining a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation;
    determining a second coordinate of the target in the navigation coordinate system based on the second distance and the second orientation;
    after switching from the near-field state to a far-field state, and/or in the near-field state and the far-field state, controlling the flight of the aircraft based on the first coordinate and/or the second coordinate and the coordinate of the aircraft in the navigation coordinate system.
  14. The method according to claim 13, wherein the controlling the flight of the aircraft based on the first coordinate and/or the second coordinate and the coordinate of the aircraft in the navigation coordinate system comprises:
    fusing the first coordinate and the second coordinate through a filter;
    controlling the flight of the aircraft based on the fused coordinate and the coordinate of the aircraft in the navigation coordinate system.
  15. The method according to claim 14, wherein the filter is a Kalman filter, and the fusing the first coordinate and the second coordinate through the filter comprises:
    in a mode in which the aircraft follows the target, obtaining the type of the target, and determining a state equation of the Kalman filter based on the type of the target;
    fusing the first coordinate and the second coordinate with the Kalman filter whose state equation has been determined.
  16. A flight control device, applied to an aircraft provided with a first imaging device, the flight control device comprising:
    a processor, configured to determine a first distance of a target relative to the aircraft based on a depth map acquired by the first imaging device;
    the processor being further configured to determine a first orientation of the target relative to the aircraft;
    a memory, configured to store the first distance and the first orientation;
    the processor being further configured to control the flight of the aircraft based on the first distance and the first orientation.
  17. The flight control device according to claim 16, wherein the processor is specifically configured to determine the target in the depth map, and to determine the first distance of the target relative to the aircraft based on the depth map.
  18. The flight control device according to claim 16, wherein the processor is specifically configured to cluster the pixels in the depth map, identify the target based on the shape and/or size of the point cloud obtained by the clustering, determine the position of the target in the depth map, and determine the first orientation of the target relative to the aircraft based on the position of the target in the depth map.
  19. The flight control device according to claim 18, wherein the aircraft is further provided with a second imaging device;
    the processor is specifically configured to determine the visual box framing the target in the image captured by the second imaging device, rotationally map the visual box on the captured image onto the depth map, and determine the position of the target in the depth map based on the visual box mapped onto the depth map.
  20. The flight control device according to claim 16, wherein the aircraft is further provided with a second imaging device;
    the processor is specifically configured to determine the visual box framing the target in the image captured by the second imaging device, and to determine the first orientation of the target relative to the aircraft based on the position of the visual box in the captured image.
  21. The flight control device according to claim 16, wherein the processor is specifically configured to determine the target in a grayscale image acquired by the first imaging device, the depth map being determined based on the grayscale image, and to determine the first orientation of the target relative to the aircraft based on the position of the target in the grayscale image.
  22. The flight control device according to claim 21, wherein the aircraft is further provided with a second imaging device;
    the processor is specifically configured to determine the visual box framing the target in the image captured by the second imaging device, rotationally map the visual box on the captured image onto the grayscale image, and determine the target in the grayscale image based on the visual box mapped onto the grayscale image.
  23. The flight control device according to claim 21, wherein
    the processor is specifically configured to identify the target in the grayscale image using image recognition.
  24. The flight control device according to claim 16, wherein
    the processor is specifically configured to determine a first coordinate of the target in a navigation coordinate system based on the first distance and the first orientation, and to control the flight of the aircraft based on the coordinate of the aircraft in the navigation coordinate system and the first coordinate;
    the memory is further configured to store the first coordinate.
  25. The flight control device according to claim 16, wherein
    the processor is specifically configured to, in a mode in which the aircraft follows the target, control the aircraft to follow the target based on the first distance and the first orientation; and/or, in a mode in which the aircraft is controlled by gestures of the target, control the aircraft to respond to gesture control instructions of the target based on the first distance and the first orientation.
  26. The flight control device according to claim 16, wherein
    the processor is specifically configured to, in a near-field state and when the target is within the field of view of the first imaging device, control the flight of the aircraft based on the first distance and the first orientation.
  27. The flight control device according to claim 26, wherein the aircraft is further provided with a second imaging device;
    the processor is further configured to, in the near-field state, when the target disappears from the field of view of the first imaging device and the target is present within the field of view of the second imaging device, determine the visual box framing the target in the image captured by the second imaging device; determine a current orientation of the target relative to the aircraft based on the visual box; and update the first coordinate of the target in the navigation coordinate system according to the most recently determined first coordinate of the target in the navigation coordinate system and the current orientation.
  28. The flight control device according to claim 26, wherein the aircraft is further provided with a second imaging device;
    the processor is further configured to determine the visual box framing the target in the image captured by the second imaging device, and to determine a second distance and a second orientation of the target relative to the aircraft based on the visual box;
    the processor is further configured to determine a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation, and to determine a second coordinate of the target in the navigation coordinate system based on the second distance and the second orientation;
    the memory is further configured to store the second coordinate;
    the processor is further configured to control the flight of the aircraft based on the first coordinate and/or the second coordinate and the coordinate of the aircraft in the navigation coordinate system, after switching from the near-field state to a far-field state and/or in the near-field state and the far-field state.
  29. The flight control device according to claim 28, wherein
    the processor is specifically configured to fuse the first coordinate and the second coordinate through a filter, and to control the flight of the aircraft based on the fused coordinate and the coordinate of the aircraft in the navigation coordinate system;
    the memory is further configured to store the fused coordinate.
  30. The flight control device according to claim 29, wherein the filter is a Kalman filter, and the processor is further configured to, in a mode in which the aircraft follows the target, obtain the type of the target, determine a state equation of the Kalman filter based on the type of the target, and fuse the first coordinate and the second coordinate with the Kalman filter whose state equation has been determined.
  31. A machine-readable storage medium having stored thereon a number of computer instructions which, when executed, perform the following processing:
    determining a first distance of a target relative to an aircraft based on a depth map acquired by a first imaging device;
    determining a first orientation of the target relative to the aircraft;
    controlling the flight of the aircraft based on the first distance and the first orientation.
  32. The machine-readable storage medium according to claim 31, wherein the computer instructions, when executed, perform the following processing:
    determining the target in the depth map;
    determining the first distance of the target relative to the aircraft based on the depth map.
  33. The machine-readable storage medium according to claim 31, wherein the computer instructions, when executed, perform the following processing:
    clustering the pixels in the depth map, and identifying the target based on the shape and/or size of the point cloud obtained by the clustering;
    determining the position of the target in the depth map;
    determining the first orientation of the target relative to the aircraft based on the position of the target in the depth map.
  34. The machine-readable storage medium according to claim 33, wherein the computer instructions, when executed, perform the following processing:
    determining the visual box framing the target in the image captured by a second imaging device;
    rotationally mapping the visual box on the captured image onto the depth map;
    determining the position of the target in the depth map based on the visual box mapped onto the depth map.
  35. The machine-readable storage medium according to claim 31, wherein the computer instructions, when executed, perform the following processing:
    determining the visual box framing the target in the image captured by a second imaging device;
    determining the first orientation of the target relative to the aircraft based on the position of the visual box in the captured image.
  36. The machine-readable storage medium according to claim 31, wherein the computer instructions, when executed, perform the following processing:
    determining the target in a grayscale image acquired by the first imaging device, the depth map being determined based on the grayscale image;
    determining the first orientation of the target relative to the aircraft based on the position of the target in the grayscale image.
  37. The machine-readable storage medium according to claim 36, wherein the computer instructions, when executed, perform the following processing:
    determining the visual box framing the target in the image captured by a second imaging device;
    rotationally mapping the visual box on the captured image onto the grayscale image;
    determining the target in the grayscale image based on the visual box mapped onto the grayscale image.
  38. The machine-readable storage medium according to claim 36, wherein the computer instructions, when executed, perform the following processing:
    identifying the target in the grayscale image using image recognition.
  39. The machine-readable storage medium according to claim 31, wherein the computer instructions, when executed, perform the following processing:
    determining a first coordinate of the target in a navigation coordinate system based on the first distance and the first orientation;
    controlling the flight of the aircraft based on the coordinate of the aircraft in the navigation coordinate system and the first coordinate.
  40. The machine-readable storage medium according to claim 31, wherein the computer instructions, when executed, perform the following processing:
    in a mode in which the aircraft follows the target, controlling the aircraft to follow the target based on the first distance and the first orientation; and/or,
    in a mode in which the aircraft is controlled by gestures of the target, controlling the aircraft to respond to gesture control instructions of the target based on the first distance and the first orientation.
  41. The machine-readable storage medium according to claim 31, wherein the computer instructions, when executed, perform the following processing:
    in a near-field state and when the target is within the field of view of the first imaging device, controlling the flight of the aircraft based on the first distance and the first orientation.
  42. The machine-readable storage medium according to claim 41, wherein the computer instructions, when executed, further perform the following processing:
    in the near-field state, when the target disappears from the field of view of the first imaging device and the target is present within the field of view of the second imaging device, determining the visual box framing the target in the image captured by the second imaging device;
    determining a current orientation of the target relative to the aircraft based on the visual box;
    updating the first coordinate of the target in the navigation coordinate system according to the most recently determined first coordinate of the target in the navigation coordinate system and the current orientation.
  43. The machine-readable storage medium according to claim 41, wherein the computer instructions, when executed, further perform the following processing:
    determining the visual box framing the target in the image captured by the second imaging device;
    determining a second distance and a second orientation of the target relative to the aircraft based on the visual box;
    determining a first coordinate of the target in the navigation coordinate system based on the first distance and the first orientation;
    determining a second coordinate of the target in the navigation coordinate system based on the second distance and the second orientation;
    after switching from the near-field state to a far-field state, and/or in the near-field state and the far-field state, controlling the flight of the aircraft based on the first coordinate and/or the second coordinate and the coordinate of the aircraft in the navigation coordinate system.
  44. The machine-readable storage medium according to claim 43, wherein the computer instructions, when executed, perform the following processing:
    fusing the first coordinate and the second coordinate through a filter;
    controlling the flight of the aircraft based on the fused coordinate and the coordinate of the aircraft in the navigation coordinate system.
  45. The machine-readable storage medium according to claim 44, wherein the computer instructions, when executed, perform the following processing:
    in a mode in which the aircraft follows the target, obtaining the type of the target, and determining a state equation of the Kalman filter based on the type of the target;
    fusing the first coordinate and the second coordinate with the Kalman filter whose state equation has been determined.
PCT/CN2018/073870 2018-01-23 2018-01-23 Flight control method, device, and machine-readable storage medium WO2019144291A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201880011997.7A CN110312978B (zh) 2018-01-23 2018-01-23 Flight control method, device, and machine-readable storage medium
PCT/CN2018/073870 WO2019144291A1 (zh) 2018-01-23 2018-01-23 Flight control method, device, and machine-readable storage medium
US16/934,948 US20210011490A1 (en) 2018-01-23 2020-07-21 Flight control method, device, and machine-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/073870 WO2019144291A1 (zh) 2018-01-23 2018-01-23 Flight control method, device, and machine-readable storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/934,948 Continuation US20210011490A1 (en) 2018-01-23 2020-07-21 Flight control method, device, and machine-readable storage medium

Publications (1)

Publication Number Publication Date
WO2019144291A1 true WO2019144291A1 (zh) 2019-08-01

Family

ID=67394527

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/073870 WO2019144291A1 (zh) 2018-01-23 2018-01-23 飞行控制方法、装置和机器可读存储介质

Country Status (3)

Country Link
US (1) US20210011490A1 (zh)
CN (1) CN110312978B (zh)
WO (1) WO2019144291A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113469139B (zh) * 2021-07-30 2022-04-05 广州中科智云科技有限公司 Data security transmission method and system for edge-side embedded AI chip of unmanned aerial vehicle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105761265A (zh) * 2016-02-23 2016-07-13 英华达(上海)科技有限公司 Method for providing obstacle avoidance using image depth information, and unmanned aerial vehicle
CN105847684A (zh) * 2016-03-31 2016-08-10 深圳奥比中光科技有限公司 Unmanned aerial vehicle
CN106054929A (zh) * 2016-06-27 2016-10-26 西北工业大学 Optical-flow-based automatic landing guidance method for unmanned aerial vehicle
CN106774947A (zh) * 2017-02-08 2017-05-31 亿航智能设备(广州)有限公司 Aircraft and control method thereof
US20180004232A1 (en) * 2015-07-08 2018-01-04 SZ DJI Technology Co., Ltd Camera configuration on movable objects

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8773427B2 (en) * 2010-12-22 2014-07-08 Sony Corporation Method and apparatus for multiview image generation using depth map information
CN103796001B (zh) * 2014-01-10 2015-07-29 深圳奥比中光科技有限公司 Method and device for synchronously acquiring depth and color information
CN104918035A (zh) * 2015-05-29 2015-09-16 深圳奥比中光科技有限公司 Method and system for acquiring a three-dimensional image of a target
CN105468014B (zh) * 2016-01-18 2018-07-31 中国人民解放军国防科学技术大学 Single-autopilot-integrated aircraft system and two-dimensional gimbal control method thereof
CN106354156A (zh) * 2016-09-29 2017-01-25 腾讯科技(深圳)有限公司 Method and device for tracking a target object, and aircraft
CN107194962B (zh) * 2017-04-01 2020-06-05 深圳市速腾聚创科技有限公司 Point cloud and planar image fusion method and device
CN107329490B (zh) * 2017-07-21 2020-10-09 歌尔科技有限公司 Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle

Also Published As

Publication number Publication date
CN110312978B (zh) 2022-06-24
US20210011490A1 (en) 2021-01-14
CN110312978A (zh) 2019-10-08

Similar Documents

Publication Publication Date Title
CN110312912B (zh) Automatic vehicle parking system and method
CN110582798B (zh) System and method for virtually-augmented visual simultaneous localization and mapping
CN111274343B (zh) Vehicle positioning method and device, electronic device, and storage medium
WO2020014909A1 (zh) Photographing method and device, and unmanned aerial vehicle
US9025825B2 (en) System and method for visual motion based object segmentation and tracking
JP2020030204A (ja) Distance measurement method, program, distance measurement system, and movable object
CN106873619B (zh) Processing method for flight path of unmanned aerial vehicle
WO2018120350A1 (zh) Method and device for positioning an unmanned aerial vehicle
WO2020113423A1 (zh) Three-dimensional reconstruction method and system for target scene, and unmanned aerial vehicle
WO2020133172A1 (zh) Image processing method, device, and computer-readable storage medium
CN108235815B (zh) Imaging control device, imaging device, imaging system, movable body, imaging control method, and medium
De Croon et al. Optic-flow based slope estimation for autonomous landing
WO2020014987A1 (zh) Control method, apparatus, and device for mobile robot, and storage medium
WO2021043214A1 (zh) Calibration method and device, and aircraft
US10606360B2 (en) Three-dimensional tilt and pan navigation using a single gesture
WO2022077296A1 (zh) Three-dimensional reconstruction method, gimbal payload, movable platform, and computer-readable storage medium
WO2021081774A1 (zh) Parameter optimization method and device, control device, and aircraft
US20210097696A1 (en) Motion estimation methods and mobile devices
WO2020024134A1 (zh) Trajectory switching method and device
WO2019205087A1 (zh) Image stabilization method and device
WO2019144286A1 (zh) Obstacle detection method, movable platform, and computer-readable storage medium
CN109863745A (zh) Movable platform, flying object, supporting device, portable terminal, imaging assistance method, program, and recording medium
WO2019144291A1 (zh) Flight control method, device, and machine-readable storage medium
WO2020019175A1 (zh) Image processing method and device, photographing apparatus, and unmanned aerial vehicle
WO2021056411A1 (zh) Route adjustment method, ground-end device, unmanned aerial vehicle, system, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18902847

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18902847

Country of ref document: EP

Kind code of ref document: A1