WO2020135447A1 - Target distance estimation method and device, and unmanned aerial vehicle - Google Patents

Target distance estimation method and device, and unmanned aerial vehicle

Info

Publication number
WO2020135447A1
WO2020135447A1 (PCT/CN2019/128058)
Authority
WO
WIPO (PCT)
Prior art keywords
shooting device
target
drone
angle
distance
Prior art date
Application number
PCT/CN2019/128058
Other languages
English (en)
French (fr)
Inventor
黄金鑫
Original Assignee
深圳市道通智能航空技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术有限公司 filed Critical 深圳市道通智能航空技术有限公司
Priority to EP19905518.7A priority Critical patent/EP3905197A4/en
Publication of WO2020135447A1 publication Critical patent/WO2020135447A1/zh
Priority to US17/356,656 priority patent/US11747833B2/en

Links

Images

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 Target-seeking control
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B64C39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 Equipment not otherwise provided for
    • B64D47/08 Arrangements of cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 Constructional aspects of UAVs
    • B64U20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 Mounting of imaging devices, e.g. mounting of gimbals
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/10 UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming

Definitions

  • The present invention relates to the field of drones, and more specifically, to a target distance estimation method and device, and a drone.
  • The intelligent following function of a drone is roughly divided into two parts: target detection and following control.
  • Generally, target detection on the drone is realized by carrying a gimbal camera. This part finds the target to be followed in the pictures obtained by the camera and marks it with a rectangular box (or another shape).
  • The following control part makes the drone follow the target's movement under given requirements (such as keeping the distance between the target and the drone unchanged).
  • The simplest way to realize this is to follow according to the target's three-dimensional position.
  • However, because the gimbal camera is a monocular camera, the three-dimensional position of the followed target cannot be obtained directly from the camera.
  • At present, one method is to calculate the distance of the target relative to the drone from the height above ground obtained by the drone.
  • This method depends on the accuracy of the height, and since height data are unstable, the calculated distance tends to drift further and further. In addition, the method rests on a strong assumption that the ground stays level; real ground undulates, and such undulation directly causes the method to fail.
  • Another method estimates the target position by triangulation. This method requires the drone to undergo displacement; if the drone keeps hovering, the distance between the target and the drone cannot be estimated.
  • The technical problem to be solved by the present invention is to provide, in view of the above defects of the prior art, a target distance estimation method and device, and a drone.
  • The technical solution adopted by the present invention is to construct a target distance estimation method for a drone, the drone including a shooting device, the method including: acquiring a current frame image of the target captured by the shooting device; acquiring position information of the target according to the current frame image, wherein the position information includes the height of the target and the two-dimensional pixel coordinates of the target in the image; acquiring attitude information of the shooting device, wherein the attitude information includes a pitch angle of the shooting device; and acquiring the distance d of the target relative to the drone according to the position information of the target and the attitude information of the shooting device.
  • Acquiring the distance d of the target relative to the drone according to the position information of the target and the attitude information of the shooting device includes: acquiring a minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are the two-dimensional pixel coordinates of the minimum bounding box in the image, which include the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point; acquiring the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device; and acquiring the distance d of the target relative to the drone from the included angle φ and the pitch angle θ of the shooting device.
  • Acquiring the included angle φ includes: acquiring the included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and the optical axis of the shooting device; acquiring the included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device; and calculating φ = β − α.
  • The included angle α is calculated as α = tan⁻¹((v_min − c_y)/f_y), where v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
  • The included angle β is calculated as β = tan⁻¹((v_max − c_y)/f_y), where v_max is the ordinate of the lowest point of the minimum bounding box.
  • Acquiring the distance d from the included angle φ and the pitch angle θ of the shooting device includes: acquiring, according to the pitch angle θ, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction [formula reproduced as an image in the original publication]; and then acquiring the distance d from the included angle φ and the included angle δ as d = L_2 · sin δ, where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device.
  • The length L_2 can be obtained from formulas [reproduced as images in the original publication], where L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target.
  • The shooting device includes a monocular camera.
  • The invention also provides a target distance estimation device for a drone, including:
  • an image acquisition unit, configured to acquire the current frame image of the target;
  • a position information acquisition unit, configured to acquire position information of the target according to the current frame image, wherein the position information includes the height of the target and the two-dimensional pixel coordinates of the target in the image;
  • an attitude information acquisition unit, configured to acquire attitude information of the shooting device of the drone, wherein the attitude information includes a pitch angle of the shooting device; and
  • a distance determination unit, configured to acquire the distance d of the target relative to the drone according to the position information of the target and the attitude information of the shooting device.
  • The distance determination unit is specifically configured to: acquire the minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are the two-dimensional pixel coordinates of the minimum bounding box in the image, including the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point; acquire the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device; and acquire the distance d of the target relative to the drone from the included angle φ and the pitch angle θ of the shooting device.
  • The first sub-acquisition module calculates the included angle α as α = tan⁻¹((v_min − c_y)/f_y), where v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
  • The distance determination unit is further configured to calculate the included angle β as β = tan⁻¹((v_max − c_y)/f_y), where v_max is the ordinate of the lowest point of the minimum bounding box, and to obtain φ = β − α.
  • The distance determination unit is further configured to acquire, according to the pitch angle θ, the included angle δ between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction [formula reproduced as an image in the original publication], and to calculate d = L_2 · sin δ, where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device; L_2 can be obtained from formulas [reproduced as images in the original publication], in which L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device and L is the height of the target.
  • The shooting device includes a monocular camera.
  • The invention also provides a drone, including:
  • a fuselage;
  • an arm connected to the fuselage;
  • a power device provided on the arm and used to provide the power for the drone to fly;
  • a shooting device provided on the fuselage and used to acquire the current frame image of the target;
  • an inertial measurement unit provided on the shooting device for acquiring attitude information of the shooting device, wherein the attitude information includes a pitch angle of the shooting device; and
  • a vision chip provided on the fuselage and electrically connected to the inertial measurement unit.
  • The vision chip is used to: acquire position information of the target according to the current frame image; and acquire the distance of the target relative to the drone according to the position information of the target and the attitude information of the shooting device.
  • The vision chip is specifically used to: acquire the minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are the two-dimensional pixel coordinates of the minimum bounding box in the image, including the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point; acquire the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device; and acquire the distance d of the target relative to the drone from the included angle φ and the pitch angle θ of the shooting device.
  • The vision chip is further used to calculate α = tan⁻¹((v_min − c_y)/f_y) and β = tan⁻¹((v_max − c_y)/f_y), where c_y is the ordinate of the optical center of the shooting device and f_y is the focal length of the shooting device in the y-axis direction, and to obtain φ = β − α.
  • The vision chip is further used to acquire, according to the pitch angle θ, the included angle δ between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction [formula reproduced as an image in the original publication], and to calculate d = L_2 · sin δ, where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device; L_2 can be obtained from formulas [reproduced as images in the original publication], in which L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device and L is the height of the target.
  • The shooting device includes a monocular camera.
  • The vision-based drone target tracking method of the present invention obtains every frame image of the target through an image acquisition device, processes each frame with a preset method to obtain the target's movement information at the moment corresponding to that frame, performs computation in combination with the device information of the image acquisition device at that moment, and finally obtains the target's real-time spatial position in every frame, so that the drone is controlled to track the target according to this real-time spatial position information.
  • The method can accurately compute the target's spatial position without strong assumptions and with a small computational load; it not only estimates the target's spatial position in real time but also achieves high accuracy.
  • FIG. 1 is a schematic flowchart of a target distance estimation method according to an embodiment of the present invention;
  • FIG. 2 is a detailed schematic flowchart of the target distance estimation method in an embodiment of the present invention;
  • FIG. 3 is a schematic diagram of the spatial position of the target in an embodiment of the present invention;
  • FIG. 4 is a schematic block diagram of a target distance estimation device according to an embodiment of the present invention;
  • FIG. 5 is a schematic structural diagram of a drone according to an embodiment of the present invention.
  • An embodiment of the invention provides a target distance estimation method that can accurately estimate the distance of the target relative to the drone. It does not depend on the ground-plane assumption, involves no strong assumptions, and has a small computational load; it can estimate the target's three-dimensional position information in real time with high accuracy, thereby ensuring that the drone can achieve precise tracking, or autonomous obstacle-avoidance control, based on the estimated three-dimensional position information.
  • Referring to FIG. 1, an embodiment of the present invention provides a target distance estimation method.
  • This method can be used by the drone to obtain the distance of the target relative to the drone in real time and to track the target in real time according to the obtained distance. The target may be moving or stationary, and during real-time tracking the target's motion state (moving or remaining stationary) does not affect the accuracy of the result.
  • Obstacles can also be avoided based on the acquired distance of the target relative to the drone.
  • FIG. 1 shows a schematic flowchart of a target distance estimation method provided by an embodiment of the present invention.
  • The target distance estimation method can be applied to drones, and the drone to which it is applied includes a shooting device.
  • The method may specifically include the following steps:
  • Step S10: Acquire the current frame image of the target captured by the shooting device.
  • The shooting device may be a gimbal mounted on the drone together with a camera carried on the gimbal.
  • The camera includes, but is not limited to, a monocular camera.
  • Before the drone starts target tracking control or obstacle avoidance control, the shooting device first captures images in real time to obtain an image sequence including the target, and the obtained image sequence is sent to the drone in real time.
  • The image sequence includes all frames captured by the shooting device.
  • Step S20: Acquire position information of the target according to the current frame image.
  • The position information of the target includes the height of the target and the two-dimensional pixel coordinates of the target in the current frame image.
  • Step S30: Acquire attitude information of the shooting device.
  • The attitude information of the shooting device includes the pitch angle of the shooting device.
  • Step S40: Acquire the distance of the target relative to the drone according to the position information of the target and the attitude information of the shooting device.
  • The distance d of the target relative to the drone can be obtained through the following steps.
  • That is, step S40 includes:
  • Step S401: Acquire the minimum bounding box of the target.
  • The minimum bounding box of the target may be the smallest circumscribed rectangular region that contains the target.
  • The two-dimensional pixel coordinates of the target in the current frame image are the two-dimensional pixel coordinates of the minimum bounding box in the image,
  • which include the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point, as shown in FIG. 3.
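  • For concreteness, the following minimal sketch (an illustration, not part of the patent text) shows how v_min and v_max fall out of a detector box given in the common (x, y, width, height) pixel convention with the y-axis pointing down; the function name and the box convention are assumptions:

```python
def box_vmin_vmax(x: float, y: float, w: float, h: float) -> tuple[float, float]:
    """Ordinates of the highest and lowest points of an axis-aligned minimum
    bounding box (x, y, w, h), with pixel y growing downward in the image."""
    v_min = y      # top edge of the box: highest point of the target in the image
    v_max = y + h  # bottom edge of the box: lowest point of the target in the image
    return v_min, v_max
```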
  • Step S402: Acquire the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device.
  • In this step, the included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and the optical axis of the shooting device is obtained first; next, the included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device is obtained. The included angle φ is then calculated from the obtained angles α and β.
  • The included angle φ can be calculated as φ = β − α.
  • The included angle α can be calculated as α = tan⁻¹((v_min − c_y)/f_y),
  • where v_min is the ordinate of the highest point of the minimum bounding box,
  • c_y is the ordinate of the optical center of the shooting device,
  • and f_y is the focal length of the shooting device in the y-axis direction.
  • The included angle β can be calculated as β = tan⁻¹((v_max − c_y)/f_y),
  • where v_max is the ordinate of the lowest point of the minimum bounding box.
  • Step S403: Obtain the distance d of the target relative to the drone according to the included angle φ and the pitch angle θ of the shooting device.
  • Step S403 may include the following steps:
  • Step S4031: Obtain, according to the pitch angle θ of the shooting device, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction.
  • The included angle δ is calculated with a formula that is [reproduced as an image in the original publication].
  • Step S4032: Obtain the distance d of the target relative to the drone according to the included angle φ and the included angle δ.
  • The distance d between the target and the drone can be calculated as d = L_2 · sin δ,
  • where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device, as shown in FIG. 3.
  • The length L_2 can be obtained from formulas [reproduced as images in the original publication],
  • where L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target, as shown in FIG. 3.
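  • To make the chain of computations concrete, here is a minimal Python sketch of steps S401 to S4032. The formulas for α, β, φ, and d = L_2 · sin δ come from the text above; the expressions for δ and L_2 appear only as images in the published document, so the two marked lines are a law-of-sines reconstruction from the described geometry and assume the pitch angle θ is measured positive when the camera points downward. They are assumptions, not the patent's verbatim formulas:

```python
import math

def estimate_target_distance(v_min: float, v_max: float, c_y: float,
                             f_y: float, pitch_theta: float,
                             target_height: float) -> float:
    """Horizontal distance d from the drone's shooting device to the target.

    v_min, v_max: ordinates of the highest/lowest points of the bounding box (px)
    c_y: ordinate of the optical center (px); f_y: focal length along y (px)
    pitch_theta: camera pitch in radians (assumed positive when pitched down)
    target_height: real-world height L of the target (e.g. meters)
    """
    alpha = math.atan((v_min - c_y) / f_y)  # ray to highest point vs. optical axis
    beta = math.atan((v_max - c_y) / f_y)   # ray to lowest point vs. optical axis
    phi = beta - alpha                      # angle subtended by the target
    # ASSUMPTION: angle between the ray to the lowest point and the vertical.
    delta = math.pi / 2 - pitch_theta - beta
    # ASSUMPTION: law of sines in the triangle (camera, highest point, lowest
    # point): L2 / sin(pi - phi - delta) = L / sin(phi),
    # and sin(pi - phi - delta) = sin(phi + delta).
    l2 = target_height * math.sin(phi + delta) / math.sin(phi)
    return l2 * math.sin(delta)             # d = L2 * sin(delta)

# Hypothetical numbers: c_y = 540 px, f_y = 1000 px, box from v_min = 400 px
# to v_max = 700 px, camera pitched 20 degrees down, person of height 1.7 m.
d = estimate_target_distance(400, 700, 540, 1000, math.radians(20), 1.7)
print(f"estimated distance: {d:.2f} m")  # ~4.95 m under these assumptions
```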
  • After the distance d of the target relative to the drone is obtained in step S40, the drone can be controlled to track the target according to this distance.
  • Autonomous obstacle-avoidance control may also be performed according to this distance.
  • For each received frame image, the embodiment of the present invention applies the above steps S10 to S40 to obtain the target's distance relative to the drone in that frame, and performs tracking control or obstacle-avoidance control according to the per-frame distance, enabling the drone to acquire the target's real-time position, track it, or perform autonomous obstacle avoidance.
  • Implementing the target distance estimation method of the present invention effectively solves the problem that a monocular camera can hardly obtain the target's three-dimensional position information; the invention does not depend on the ground-plane assumption, involves no other strong assumptions, has a small computational load, and can estimate the target's three-dimensional position information in real time.
  • Neither the motion state of the target nor the displacement of the drone affects the implementation of the present invention: whether the target is moving or stationary, the invention can still accurately estimate the target's three-dimensional position information; and when the drone undergoes no displacement, i.e., stays in the hovering state, the invention can still accurately estimate the distance of the target relative to the drone and accurately obtain the target's three-dimensional position information.
  • FIG. 4 is a schematic block diagram of a target distance estimation device according to an embodiment of the present invention.
  • The target distance estimation device can be used to implement the above target distance estimation method.
  • The target distance estimation device of the embodiment of the present invention can be applied to a drone.
  • The target distance estimation device may include: an image acquisition unit 401, a position information acquisition unit 402, an attitude information acquisition unit 403, and a distance determination unit 404.
  • The image acquisition unit 401 is used to acquire the current frame image of the target.
  • The position information acquisition unit 402 is configured to acquire the position information of the target according to the current frame image, where the position information includes the height of the target and the two-dimensional pixel coordinates of the target in the image.
  • The attitude information acquisition unit 403 is used to acquire attitude information of the shooting device, wherein the attitude information includes a pitch angle of the shooting device.
  • The distance determination unit 404 is configured to acquire the distance d of the target relative to the drone based on the position information of the target and the attitude information of the shooting device.
  • The distance determination unit 404 is specifically configured to:
  • acquire the minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are the two-dimensional pixel coordinates of the minimum bounding box in the image,
  • including the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point; and
  • acquire the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device.
  • The included angle α can be calculated as α = tan⁻¹((v_min − c_y)/f_y),
  • where v_min is the ordinate of the highest point of the minimum bounding box,
  • c_y is the ordinate of the optical center of the shooting device,
  • and f_y is the focal length of the shooting device in the y-axis direction.
  • The distance determination unit 404 can also be used to obtain the included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device.
  • The included angle φ can be calculated as φ = β − α,
  • with β calculated as β = tan⁻¹((v_max − c_y)/f_y),
  • where v_max is the ordinate of the lowest point of the minimum bounding box.
  • The distance determination unit 404 can also be used to obtain the distance d of the target relative to the drone according to the included angle φ and the pitch angle θ of the shooting device.
  • The distance determination unit 404 is specifically configured to:
  • obtain, according to the pitch angle θ of the shooting device, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction.
  • The included angle δ is calculated with a formula [reproduced as an image in the original publication].
  • The distance determination unit 404 can also be used to obtain the distance d of the target relative to the drone according to the included angle φ and the included angle δ,
  • calculated as d = L_2 · sin δ,
  • where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device.
  • The length L_2 can be obtained from formulas [reproduced as images in the original publication],
  • where L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target.
  • the image acquisition unit 401 may be a shooting device mounted on the drone, where the shooting device may include a pan-tilt head and a camera mounted on the pan-tilt head.
  • the cameras include but are not limited to monocular cameras.
  • the position information acquisition unit 402 and the distance determination unit 404 may be a vision chip of the drone.
  • the attitude information acquiring unit 403 may be an IMU (Inertial Measurement Unit) installed on the UAV platform.
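  • As a structural illustration only (names and interfaces are assumptions, and it reuses the estimate_target_distance sketch shown earlier), the units map naturally onto a small pipeline: the shooting device supplies frames, the vision chip extracts the bounding box and computes d, and the IMU supplies the pitch angle:

```python
from dataclasses import dataclass

@dataclass
class BoundingBox:
    v_min: float  # ordinate of the highest point (px)
    v_max: float  # ordinate of the lowest point (px)

class TargetDistanceEstimator:
    """Sketch of the device: camera intrinsics and target height are fixed;
    per-frame inputs are the detected box and the IMU pitch angle."""

    def __init__(self, c_y: float, f_y: float, target_height: float):
        self.c_y, self.f_y, self.target_height = c_y, f_y, target_height

    def update(self, box: BoundingBox, pitch_theta: float) -> float:
        # position information (box) + attitude information (pitch) -> distance d
        return estimate_target_distance(box.v_min, box.v_max, self.c_y,
                                        self.f_y, pitch_theta,
                                        self.target_height)
```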
  • The present invention also provides a drone that can be used to implement the aforementioned target distance estimation method; it can estimate the distance of the target relative to the drone in real time, and can also track the target or avoid obstacles according to the distance estimated in real time.
  • The drone provided by the embodiment of the present invention includes a fuselage 100, an arm 200 connected to the fuselage 100, a power device 201 provided on the arm 200, a shooting device 101 provided on the fuselage 100, a vision chip 102, and an inertial measurement unit 103 provided on the shooting device 101.
  • the vision chip 102 is electrically connected to the inertial measurement unit 103.
  • The power device 201 is used to provide the power for the drone to fly.
  • the power device 201 may include a motor provided on the arm 200 and a propeller connected to the motor.
  • the motor drives the propeller to rotate at a high speed to provide the power required for the drone to fly.
  • the shooting device 101 is used to acquire the current frame image of the target.
  • the shooting device 101 may be a gimbal camera mounted on the drone.
  • the shooting device 101 may include a gimbal connected to the body 100 of the drone and a camera connected to the gimbal.
  • the camera includes but is not limited to a monocular camera.
  • The inertial measurement unit 103 is used to acquire attitude information of the shooting device 101, wherein the attitude information includes the pitch angle of the shooting device 101.
  • the inertial measurement unit 103 is installed on the gimbal.
  • The vision chip 102 is used to perform the following actions:
  • acquire position information of the target according to the current frame image, where the position information includes the height of the target and the two-dimensional pixel coordinates of the target in the image; and
  • acquire the distance of the target relative to the drone according to the position information of the target and the attitude information of the shooting device 101.
  • The vision chip 102 may be specifically used to perform the following steps:
  • acquire the minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are the two-dimensional pixel coordinates of the minimum bounding box in the image,
  • including the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point;
  • acquire the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device 101 and the line connecting the lowest point of the minimum bounding box with the shooting device 101; and obtain the distance d of the target relative to the drone from the included angle φ and the pitch angle θ of the shooting device 101.
  • The vision chip 102 can also be specifically used to calculate the included angle φ as φ = β − α.
  • The included angle α is calculated as α = tan⁻¹((v_min − c_y)/f_y),
  • where v_min is the ordinate of the highest point of the minimum bounding box,
  • c_y is the ordinate of the optical center of the shooting device 101,
  • and f_y is the focal length of the shooting device 101 in the y-axis direction.
  • The included angle β is calculated as β = tan⁻¹((v_max − c_y)/f_y),
  • where v_max is the ordinate of the lowest point of the minimum bounding box.
  • The vision chip 102 can also be specifically used to obtain, according to the pitch angle θ of the shooting device 101, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device 101 and the vertical direction [formula reproduced as an image in the original publication],
  • and then obtain the distance d of the target relative to the drone according to the included angle φ and the included angle δ,
  • calculated as d = L_2 · sin δ,
  • where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device 101.
  • The length L_2 can be obtained from formulas [reproduced as images in the original publication],
  • where L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device 101, and L is the height of the target.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a target distance estimation method and device, and an unmanned aerial vehicle (UAV). The method is used for a UAV that includes a shooting device, and includes: acquiring a current frame image of a target captured by the shooting device; acquiring position information of the target according to the current frame image, wherein the position information includes the height of the target and the two-dimensional pixel coordinates of the target in the image; acquiring attitude information of the shooting device, wherein the attitude information includes the pitch angle of the shooting device; and acquiring the distance of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device. The present invention can accurately calculate the distance of the target relative to the UAV without strong assumptions and with a small computational load, and can estimate the target's spatial position information in real time with high accuracy.

Description

Target distance estimation method and device, and unmanned aerial vehicle
TECHNICAL FIELD
The present invention relates to the field of unmanned aerial vehicles (UAVs), and more specifically, to a target distance estimation method and device, and a UAV.
BACKGROUND
The intelligent following function of a UAV is roughly divided into two parts: target detection and following control.
Generally, target detection on a UAV is realized by carrying a gimbal camera. This part finds the target to be followed in the pictures obtained by the camera and marks it with a rectangular box (or another shape). The following control part makes the UAV follow the target's movement under given requirements (such as keeping the distance between the target and the UAV unchanged). The simplest way to realize following control is to follow according to the target's three-dimensional position. However, because the gimbal camera is a monocular camera, the three-dimensional position of the followed target cannot be obtained directly from the camera.
At present, one method to solve this problem is to calculate the distance of the target relative to the UAV from the UAV's height above ground. This method depends on the accuracy of the height, and since height data are unstable, the calculated distance tends to drift further and further. In addition, the method rests on a strong assumption that the ground stays level; real ground undulates, and such undulation directly causes the method to fail.
Another method estimates the target position by triangulation. This method requires the UAV to undergo displacement; if the UAV keeps hovering, the distance of the target relative to the UAV cannot be estimated.
SUMMARY
The technical problem to be solved by the present invention is to provide, in view of the above defects of the prior art, a target distance estimation method and device, and a UAV.
The technical solution adopted by the present invention to solve the technical problem is to construct a target distance estimation method for a UAV, the UAV including a shooting device, the method including:
acquiring a current frame image of the target captured by the shooting device;
acquiring position information of the target according to the current frame image, wherein the position information includes the height of the target and the two-dimensional pixel coordinates of the target in the image;
acquiring attitude information of the shooting device, wherein the attitude information includes the pitch angle of the shooting device; and
acquiring the distance of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device.
In one embodiment, acquiring the distance d of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device includes:
acquiring a minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are the two-dimensional pixel coordinates of the minimum bounding box in the image, and these include the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point;
acquiring the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device; and
acquiring the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device.
In one embodiment, acquiring the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device includes:
acquiring the included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and the optical axis of the shooting device;
acquiring the included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device; and
calculating the included angle φ using the following formula:
φ = β − α.
In one embodiment, acquiring the included angle α includes calculating it using the following formula:
α = tan⁻¹((v_min − c_y)/f_y),
where v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
In one embodiment, acquiring the included angle β includes calculating it using the following formula:
β = tan⁻¹((v_max − c_y)/f_y),
where v_max is the ordinate of the lowest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
In one embodiment, acquiring the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device includes:
acquiring, according to the pitch angle θ of the shooting device, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction; and
acquiring the distance d of the target relative to the UAV according to the included angle φ and the included angle δ.
In one embodiment, the included angle δ is calculated using a formula [reproduced as an image in the original publication].
In one embodiment, acquiring the distance d of the target relative to the UAV according to the included angle φ and the included angle δ includes calculating:
d = L_2 · sin δ,
where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device.
In one embodiment, the length L_2 of the line connecting the lowest point of the minimum bounding box with the shooting device can be obtained from formulas [reproduced as images in the original publication],
where L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target.
In one embodiment, the shooting device includes a monocular camera.
The present invention further provides a target distance estimation device for a UAV, including:
an image acquisition unit, configured to acquire a current frame image of the target;
a position information acquisition unit, configured to acquire position information of the target according to the current frame image, wherein the position information includes the height of the target and the two-dimensional pixel coordinates of the target in the image;
an attitude information acquisition unit, configured to acquire attitude information of the shooting device of the UAV, wherein the attitude information includes the pitch angle of the shooting device; and
a distance determination unit, configured to acquire the distance d of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device.
In one embodiment, the distance determination unit is specifically configured to:
acquire a minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are the two-dimensional pixel coordinates of the minimum bounding box in the image, including the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point;
acquire the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device; and
acquire the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device.
In one embodiment, the distance determination unit is specifically configured to:
acquire the included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and the optical axis of the shooting device;
acquire the included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device; and
calculate the included angle φ using the following formula:
φ = β − α.
In one embodiment, the first sub-acquisition module calculates the included angle α using the following formula:
α = tan⁻¹((v_min − c_y)/f_y),
where v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
In one embodiment, the distance determination unit is specifically configured to calculate the included angle β using the following formula:
β = tan⁻¹((v_max − c_y)/f_y),
where v_max is the ordinate of the lowest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
In one embodiment, the distance determination unit is specifically configured to:
acquire, according to the pitch angle θ of the shooting device, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction; and
acquire the distance d of the target relative to the UAV according to the included angle φ and the included angle δ.
In one embodiment, the distance determination unit is specifically configured to calculate the included angle δ using a formula [reproduced as an image in the original publication].
In one embodiment, the distance determination unit is specifically configured to calculate the distance d of the target relative to the UAV using the following formula:
d = L_2 · sin δ,
where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device.
In one embodiment, the length L_2 of the line connecting the lowest point of the minimum bounding box with the shooting device can be obtained from formulas [reproduced as images in the original publication],
where L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target.
In one embodiment, the shooting device includes a monocular camera.
The present invention further provides a UAV, including:
a fuselage;
an arm connected to the fuselage;
a power device provided on the arm and configured to provide the power for the UAV to fly;
a shooting device provided on the fuselage and configured to acquire a current frame image of a target;
an inertial measurement unit provided on the shooting device and configured to acquire attitude information of the shooting device, wherein the attitude information includes the pitch angle of the shooting device; and
a vision chip provided on the fuselage and electrically connected to the inertial measurement unit;
wherein the vision chip is configured to:
acquire position information of the target according to the current frame image, wherein the position information includes the height of the target and the two-dimensional pixel coordinates of the target in the image; and
acquire the distance of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device.
In one embodiment, the vision chip is specifically configured to:
acquire a minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are the two-dimensional pixel coordinates of the minimum bounding box in the image, including the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point;
acquire the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device; and
acquire the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device.
In one embodiment, the vision chip is specifically configured to:
acquire the included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and the optical axis of the shooting device;
acquire the included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device; and
calculate the included angle φ using the following formula:
φ = β − α.
In one embodiment, the vision chip is specifically configured to calculate the included angle α using the following formula:
α = tan⁻¹((v_min − c_y)/f_y),
where v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
In one embodiment, the vision chip is specifically configured to calculate the included angle β using the following formula:
β = tan⁻¹((v_max − c_y)/f_y),
where v_max is the ordinate of the lowest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
In one embodiment, the vision chip is specifically configured to:
acquire, according to the pitch angle θ of the shooting device, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction; and
acquire the distance d of the target relative to the UAV according to the included angle φ and the included angle δ.
In one embodiment, the vision chip is specifically configured to calculate the included angle δ using a formula [reproduced as an image in the original publication].
In one embodiment, the vision chip is specifically configured to calculate the distance d of the target relative to the UAV using the following formula:
d = L_2 · sin δ,
where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device.
In one embodiment, the length L_2 of the line connecting the lowest point of the minimum bounding box with the shooting device can be obtained from formulas [reproduced as images in the original publication],
where L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target.
In one embodiment, the shooting device includes a monocular camera.
Implementing the vision-based UAV target tracking method of the present invention has the following beneficial effects: the method obtains every frame image of the target through an image acquisition device, processes each frame with a preset method to obtain the target's movement information at the moment corresponding to that frame, performs computation in combination with the device information of the image acquisition device at that moment, and finally obtains the target's real-time spatial position in every frame, so that the UAV is controlled to track the target according to this real-time spatial position information. The method can accurately compute the target's spatial position without strong assumptions and with a small computational load; it not only estimates the target's spatial position in real time but also achieves high accuracy.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be further described below with reference to the accompanying drawings and embodiments, in which:
FIG. 1 is a schematic flowchart of a target distance estimation method according to an embodiment of the present invention;
FIG. 2 is a detailed schematic flowchart of the target distance estimation method in an embodiment of the present invention;
FIG. 3 is a schematic diagram of the spatial position of the target in an embodiment of the present invention;
FIG. 4 is a schematic block diagram of a target distance estimation device according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a UAV according to an embodiment of the present invention.
DETAILED DESCRIPTION
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
To solve the problem that existing UAV estimation of a target's three-dimensional position information fails, is impossible, or is inaccurate because of the conditions required by the methods employed, an embodiment of the present invention provides a target distance estimation method. The method can accurately estimate the distance of the target relative to the UAV; it does not depend on the ground-plane assumption, involves no strong assumptions, and has a small computational load. It can estimate the target's three-dimensional position information in real time with high accuracy, thereby ensuring that the UAV can achieve precise tracking, or autonomous obstacle-avoidance control, based on the estimated three-dimensional position information.
Referring to FIG. 1, an embodiment of the present invention provides a target distance estimation method. The method can be used by a UAV to obtain the distance of a target relative to the UAV in real time and to track the target in real time according to the obtained distance. The target may be moving or stationary, and during real-time tracking the target's motion state (moving or remaining stationary) does not affect the accuracy of the result. In addition, obstacle avoidance can be performed according to the obtained distance of the target relative to the UAV.
FIG. 1 shows a schematic flowchart of a target distance estimation method provided by an embodiment of the present invention. The method can be applied to a UAV that includes a shooting device, and may specifically include the following steps:
Step S10: acquire a current frame image of the target captured by the shooting device.
In this embodiment of the present invention, the shooting device may be a gimbal mounted on the UAV together with a camera carried on the gimbal; the camera includes, but is not limited to, a monocular camera.
Before the UAV starts target tracking control or obstacle avoidance control, the shooting device first captures images in real time to obtain an image sequence including the target, and the obtained image sequence is sent to the UAV in real time; the sequence includes all frames captured by the shooting device.
Step S20: acquire position information of the target according to the current frame image.
The position information of the target includes the height of the target and the two-dimensional pixel coordinates of the target in the current frame image.
Step S30: acquire attitude information of the shooting device.
The attitude information of the shooting device includes the pitch angle of the shooting device.
Step S40: acquire the distance of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device.
Specifically, as shown in FIG. 2, the distance d of the target relative to the UAV can be obtained through the following steps; that is, in an embodiment of the present invention, step S40 includes:
Step S401: acquire a minimum bounding box of the target.
In this embodiment of the present invention, the minimum bounding box of the target may be the smallest circumscribed rectangular region containing the target.
The two-dimensional pixel coordinates of the target in the current frame image are the two-dimensional pixel coordinates of the minimum bounding box in the image, which include the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point, as shown in FIG. 3.
Step S402: acquire the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device.
Specifically, in this step, the included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and the optical axis of the shooting device is obtained first; next, the included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device is obtained. The included angle φ is then calculated from the obtained angles α and β.
The included angle φ can be calculated using the following formula:
φ = β − α.
In this embodiment of the present invention, the included angle α can be calculated using the following formula:
α = tan⁻¹((v_min − c_y)/f_y),
where v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
In this embodiment of the present invention, the included angle β can be calculated using the following formula:
β = tan⁻¹((v_max − c_y)/f_y),
where v_max is the ordinate of the lowest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
Step S403: acquire the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device.
In this embodiment of the present invention, step S403 may include the following steps:
Step S4031: acquire, according to the pitch angle θ of the shooting device, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction.
In this embodiment of the present invention, the included angle δ can be calculated with a formula [reproduced as an image in the original publication].
Step S4032: acquire the distance d of the target relative to the UAV according to the included angle φ and the included angle δ.
In this embodiment of the present invention, the distance d between the target and the UAV can be calculated using the following formula:
d = L_2 · sin δ,
where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device, as shown in FIG. 3.
Further, the length L_2 of the line connecting the lowest point of the minimum bounding box with the shooting device can be obtained from formulas [reproduced as images in the original publication],
where L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target, as shown in FIG. 3.
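One consistent set of expressions can be reconstructed from the geometry just described; this is an inference from the text and FIG. 3, not the publication's verbatim image formulas, and it assumes the pitch angle θ is measured positive when the camera points downward. In the triangle formed by the shooting device C and the highest point T and lowest point B of the target, TB is vertical with length L, the angle at C is φ, and the angle at B between BC and the vertical is δ; the law of sines then gives:

$$\delta = \frac{\pi}{2} - \theta - \beta, \qquad L_1 = \frac{L\sin\delta}{\sin\varphi}, \qquad L_2 = \frac{L\sin(\delta+\varphi)}{\sin\varphi}, \qquad d = L_2\sin\delta.$$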
Specifically, after the distance d of the target relative to the UAV is obtained in step S40, the UAV can be controlled to track the target according to this distance. It can be understood that, in some other embodiments, autonomous obstacle-avoidance control may also be performed according to this distance.
It should be noted that, for every received frame image, the embodiment of the present invention applies the above steps S10 to S40 to obtain the target's distance relative to the UAV in that frame, and performs tracking control or obstacle-avoidance control according to the per-frame distance, so that the UAV can acquire the target's real-time position, track it, or perform autonomous obstacle avoidance.
Implementing the target distance estimation method of the present invention effectively solves the problem that a monocular camera can hardly obtain the target's three-dimensional position information. The present invention does not depend on the ground-plane assumption and involves no other strong assumptions; its computational load is small, and it can estimate the target's three-dimensional position information in real time. Neither the motion state of the target nor the displacement of the UAV affects the implementation of the present invention: whether the target is moving or stationary, the present invention can still accurately estimate the target's three-dimensional position information; and even when the UAV undergoes no displacement, i.e., keeps hovering, the present invention can still accurately estimate the distance of the target relative to the UAV and accurately obtain the target's three-dimensional position information.
Referring to FIG. 4, which is a schematic block diagram of a target distance estimation device according to an embodiment of the present invention, the device can be used to implement the above target distance estimation method and can be applied to a UAV.
As shown in FIG. 4, the target distance estimation device may include an image acquisition unit 401, a position information acquisition unit 402, an attitude information acquisition unit 403, and a distance determination unit 404.
Specifically, the image acquisition unit 401 is configured to acquire a current frame image of the target.
The position information acquisition unit 402 is configured to acquire position information of the target according to the current frame image, wherein the position information includes the height of the target and the two-dimensional pixel coordinates of the target in the image.
The attitude information acquisition unit 403 is configured to acquire attitude information of the shooting device, wherein the attitude information includes the pitch angle of the shooting device.
The distance determination unit 404 is configured to acquire the distance d of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device.
Optionally, in this embodiment, the distance determination unit 404 is specifically configured to:
acquire a minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are the two-dimensional pixel coordinates of the minimum bounding box in the image, including the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point; and
acquire the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device.
Optionally, the distance determination unit 404 is specifically configured to acquire the included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and the optical axis of the shooting device,
calculated as α = tan⁻¹((v_min − c_y)/f_y),
where v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
The distance determination unit 404 may further be configured to acquire the included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device,
with φ calculated as φ = β − α,
and β calculated as β = tan⁻¹((v_max − c_y)/f_y),
where v_max is the ordinate of the lowest point of the minimum bounding box.
The distance determination unit 404 may further be configured to acquire the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device.
Optionally, in this embodiment, the distance determination unit 404 is specifically configured to acquire, according to the pitch angle θ of the shooting device, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction; in this embodiment, the included angle δ is calculated with a formula [reproduced as an image in the original publication].
The distance determination unit 404 may further be configured to acquire the distance d of the target relative to the UAV according to the included angle φ and the included angle δ; in this embodiment, the distance d is calculated using the following formula:
d = L_2 · sin δ,
where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device.
Further, the length L_2 can be obtained from formulas [reproduced as images in the original publication],
where L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target.
In an embodiment of the present invention, the image acquisition unit 401 may be a shooting device mounted on the UAV, where the shooting device may include a gimbal and a camera carried on the gimbal; the camera includes, but is not limited to, a monocular camera. The position information acquisition unit 402 and the distance determination unit 404 may be the vision chip of the UAV. The attitude information acquisition unit 403 may be an IMU (inertial measurement unit) installed on the UAV's gimbal.
The present invention further provides a UAV that can be used to implement the aforementioned target distance estimation method; it can estimate the distance of a target relative to the UAV in real time and can also track the target or avoid obstacles according to the distance estimated in real time.
As shown in FIG. 5, the UAV provided by the embodiment of the present invention includes a fuselage 100, an arm 200 connected to the fuselage 100, a power device 201 provided on the arm 200, a shooting device 101 provided on the fuselage 100, a vision chip 102, and an inertial measurement unit 103 provided on the shooting device 101; the vision chip 102 is electrically connected to the inertial measurement unit 103.
In this embodiment, the power device 201 is configured to provide the power for the UAV to fly. Optionally, the power device 201 may include a motor provided on the arm 200 and a propeller connected to the motor; the motor drives the propeller to rotate at high speed to provide the power required for flight.
In this embodiment, the shooting device 101 is configured to acquire a current frame image of the target. Optionally, the shooting device 101 may be a gimbal camera mounted on the UAV. Specifically, the shooting device 101 may include a gimbal connected to the fuselage 100 of the UAV and a camera connected to the gimbal; the camera includes, but is not limited to, a monocular camera.
In this embodiment, the inertial measurement unit 103 is configured to acquire attitude information of the shooting device 101, wherein the attitude information includes the pitch angle of the shooting device 101. Specifically, the inertial measurement unit 103 is provided on the gimbal.
In this embodiment, the vision chip 102 is configured to perform the following actions:
acquire position information of the target according to the current frame image, wherein the position information includes the height of the target and the two-dimensional pixel coordinates of the target in the image; and
acquire the distance of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device 101.
Further, the vision chip 102 may specifically be configured to perform the following steps:
acquire a minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are the two-dimensional pixel coordinates of the minimum bounding box in the image, including the ordinate v_min of the highest point of the minimum bounding box and the ordinate v_max of its lowest point;
acquire the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device 101 and the line connecting the lowest point of the minimum bounding box with the shooting device 101; and
acquire the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device 101.
Further, the vision chip 102 may specifically be configured to:
acquire the included angle α between the line connecting the highest point of the minimum bounding box with the shooting device 101 and the optical axis of the shooting device 101; and
acquire the included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device 101 and the optical axis of the shooting device 101,
with the included angle φ calculated using the following formula:
φ = β − α.
Further, the vision chip 102 may specifically be configured to calculate the included angle α using the following formula:
α = tan⁻¹((v_min − c_y)/f_y),
where v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device 101, and f_y is the focal length of the shooting device 101 in the y-axis direction.
Further, the vision chip 102 may specifically be configured to calculate the included angle β using the following formula:
β = tan⁻¹((v_max − c_y)/f_y),
where v_max is the ordinate of the lowest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device 101, and f_y is the focal length of the shooting device 101 in the y-axis direction.
Further, the vision chip 102 may specifically be configured to:
acquire, according to the pitch angle θ of the shooting device 101, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device 101 and the vertical direction [formula reproduced as an image in the original publication]; and
acquire the distance d of the target relative to the UAV according to the included angle φ and the included angle δ,
calculated using the following formula:
d = L_2 · sin δ,
where L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device 101.
Further, the length L_2 can be obtained from formulas [reproduced as images in the original publication],
where L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device 101, and L is the height of the target.
The above embodiments are intended only to illustrate the technical concept and features of the present invention; their purpose is to enable those familiar with the art to understand and implement the present invention, not to limit its protection scope. All equivalent changes and modifications made within the scope of the claims of the present invention shall fall within the coverage of the claims of the present invention.
It should be understood that those of ordinary skill in the art can make improvements or variations based on the above description, and all such improvements and variations shall fall within the protection scope of the appended claims of the present invention.

Claims (30)

  1. A target distance estimation method for an unmanned aerial vehicle (UAV), the UAV comprising a shooting device, wherein the method comprises:
    acquiring a current frame image of a target captured by the shooting device;
    acquiring position information of the target according to the current frame image, wherein the position information comprises a height of the target and two-dimensional pixel coordinates of the target in the image;
    acquiring attitude information of the shooting device, wherein the attitude information comprises a pitch angle of the shooting device; and
    acquiring a distance of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device.
  2. The method according to claim 1, wherein the acquiring the distance d of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device comprises:
    acquiring a minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are two-dimensional pixel coordinates of the minimum bounding box in the image, and the two-dimensional pixel coordinates of the minimum bounding box in the image comprise an ordinate v_min of a highest point of the minimum bounding box and an ordinate v_max of a lowest point of the minimum bounding box;
    acquiring an included angle φ formed by a line connecting the highest point of the minimum bounding box with the shooting device and a line connecting the lowest point of the minimum bounding box with the shooting device; and
    acquiring the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device.
  3. The method according to claim 2, wherein the acquiring the included angle φ formed by the line connecting the highest point of the minimum bounding box with the shooting device and the line connecting the lowest point of the minimum bounding box with the shooting device comprises:
    acquiring an included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and an optical axis of the shooting device;
    acquiring an included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device; and
    calculating the included angle φ using the following formula:
    φ = β − α.
  4. The method according to claim 3, wherein the acquiring the included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and the optical axis of the shooting device comprises:
    calculating the included angle α using the following formula:
    α = tan⁻¹((v_min − c_y)/f_y),
    wherein v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
  5. The method according to claim 3 or 4, wherein the acquiring the included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device comprises:
    calculating the included angle β using the following formula:
    β = tan⁻¹((v_max − c_y)/f_y),
    wherein v_max is the ordinate of the lowest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
  6. The method according to any one of claims 3-5, wherein the acquiring the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device comprises:
    acquiring, according to the pitch angle θ of the shooting device, an included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction; and
    acquiring the distance d of the target relative to the UAV according to the included angle φ and the included angle δ.
  7. The method according to claim 6, wherein the acquiring, according to the pitch angle θ of the shooting device, the included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction comprises:
    calculating the included angle δ using a formula [reproduced as an image in the original publication].
  8. The method according to claim 6 or 7, wherein the acquiring the distance d of the target relative to the UAV according to the included angle φ and the included angle δ comprises:
    calculating the distance d of the target relative to the UAV using the following formula:
    d = L_2 · sin δ,
    wherein L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device.
  9. The method according to claim 8, wherein the length L_2 of the line connecting the lowest point of the minimum bounding box with the shooting device can be obtained from formulas [reproduced as images in the original publication],
    wherein L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target.
  10. The method according to any one of claims 1-9, wherein the shooting device comprises a monocular camera.
  11. A target distance estimation device for an unmanned aerial vehicle (UAV), comprising:
    an image acquisition unit, configured to acquire a current frame image of the target;
    a position information acquisition unit, configured to acquire position information of the target according to the current frame image, wherein the position information comprises a height of the target and two-dimensional pixel coordinates of the target in the image;
    an attitude information acquisition unit, configured to acquire attitude information of a shooting device of the UAV, wherein the attitude information comprises a pitch angle of the shooting device; and
    a distance determination unit, configured to acquire a distance of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device.
  12. The target distance estimation device according to claim 11, wherein the distance determination unit is specifically configured to:
    acquire a minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are two-dimensional pixel coordinates of the minimum bounding box in the image, and the two-dimensional pixel coordinates of the minimum bounding box in the image comprise an ordinate v_min of a highest point of the minimum bounding box and an ordinate v_max of a lowest point of the minimum bounding box;
    acquire an included angle φ formed by a line connecting the highest point of the minimum bounding box with the shooting device and a line connecting the lowest point of the minimum bounding box with the shooting device; and
    acquire the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device.
  13. The target distance estimation device according to claim 12, wherein the distance determination unit is specifically configured to:
    acquire an included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and an optical axis of the shooting device;
    acquire an included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device; and
    calculate the included angle φ using the following formula:
    φ = β − α.
  14. The target distance estimation device according to claim 13, wherein the distance determination unit is specifically configured to:
    calculate the included angle α using the following formula:
    α = tan⁻¹((v_min − c_y)/f_y),
    wherein v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
  15. The target distance estimation device according to claim 13 or 14, wherein the distance determination unit is specifically configured to:
    calculate the included angle β using the following formula:
    β = tan⁻¹((v_max − c_y)/f_y),
    wherein v_max is the ordinate of the lowest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
  16. The target distance estimation device according to any one of claims 13-15, wherein the distance determination unit is specifically configured to:
    acquire, according to the pitch angle θ of the shooting device, an included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction; and
    acquire the distance d of the target relative to the UAV according to the included angle φ and the included angle δ.
  17. The target distance estimation device according to claim 16, wherein the distance determination unit is specifically configured to:
    calculate the included angle δ using a formula [reproduced as an image in the original publication].
  18. The target distance estimation device according to claim 16 or 17, wherein the distance determination unit is specifically configured to:
    calculate the distance d of the target relative to the UAV using the following formula:
    d = L_2 · sin δ,
    wherein L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device.
  19. The target distance estimation device according to claim 18, wherein the length L_2 of the line connecting the lowest point of the minimum bounding box with the shooting device can be obtained from formulas [reproduced as images in the original publication],
    wherein L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target.
  20. The target distance estimation device according to any one of claims 11-19, wherein the shooting device comprises a monocular camera.
  21. An unmanned aerial vehicle (UAV), comprising:
    a fuselage;
    an arm connected to the fuselage;
    a power device provided on the arm and configured to provide power for the UAV to fly;
    a shooting device provided on the fuselage and configured to acquire a current frame image of a target;
    an inertial measurement unit provided on the shooting device and configured to acquire attitude information of the shooting device, wherein the attitude information comprises a pitch angle of the shooting device; and
    a vision chip provided on the fuselage and electrically connected to the inertial measurement unit;
    wherein the vision chip is configured to:
    acquire position information of the target according to the current frame image, wherein the position information comprises a height of the target and two-dimensional pixel coordinates of the target in the image; and
    acquire a distance of the target relative to the UAV according to the position information of the target and the attitude information of the shooting device.
  22. The UAV according to claim 21, wherein the vision chip is specifically configured to:
    acquire a minimum bounding box of the target, wherein the two-dimensional pixel coordinates of the target in the image are two-dimensional pixel coordinates of the minimum bounding box in the image, and the two-dimensional pixel coordinates of the minimum bounding box in the image comprise an ordinate v_min of a highest point of the minimum bounding box and an ordinate v_max of a lowest point of the minimum bounding box;
    acquire an included angle φ formed by a line connecting the highest point of the minimum bounding box with the shooting device and a line connecting the lowest point of the minimum bounding box with the shooting device; and
    acquire the distance d of the target relative to the UAV according to the included angle φ and the pitch angle θ of the shooting device.
  23. The UAV according to claim 22, wherein the vision chip is specifically configured to:
    acquire an included angle α between the line connecting the highest point of the minimum bounding box with the shooting device and an optical axis of the shooting device;
    acquire an included angle β between the line connecting the lowest point of the minimum bounding box with the shooting device and the optical axis of the shooting device; and
    calculate the included angle φ using the following formula:
    φ = β − α.
  24. The UAV according to claim 23, wherein the vision chip is specifically configured to:
    calculate the included angle α using the following formula:
    α = tan⁻¹((v_min − c_y)/f_y),
    wherein v_min is the ordinate of the highest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
  25. The UAV according to claim 23 or 24, wherein the vision chip is specifically configured to:
    calculate the included angle β using the following formula:
    β = tan⁻¹((v_max − c_y)/f_y),
    wherein v_max is the ordinate of the lowest point of the minimum bounding box, c_y is the ordinate of the optical center of the shooting device, and f_y is the focal length of the shooting device in the y-axis direction.
  26. The UAV according to any one of claims 23-25, wherein the vision chip is specifically configured to:
    acquire, according to the pitch angle θ of the shooting device, an included angle δ formed between the line connecting the lowest point of the minimum bounding box with the shooting device and the vertical direction; and
    acquire the distance d of the target relative to the UAV according to the included angle φ and the included angle δ.
  27. The UAV according to claim 26, wherein the vision chip is specifically configured to:
    calculate the included angle δ using a formula [reproduced as an image in the original publication].
  28. The UAV according to claim 26 or 27, wherein the vision chip is specifically configured to:
    calculate the distance d of the target relative to the UAV using the following formula:
    d = L_2 · sin δ,
    wherein L_2 is the length of the line connecting the lowest point of the minimum bounding box with the shooting device.
  29. The UAV according to claim 28, wherein the length L_2 of the line connecting the lowest point of the minimum bounding box with the shooting device can be obtained from formulas [reproduced as images in the original publication],
    wherein L_1 is the length of the line connecting the highest point of the minimum bounding box with the shooting device, and L is the height of the target.
  30. The UAV according to any one of claims 21-29, wherein the shooting device comprises a monocular camera.
PCT/CN2019/128058 2018-12-24 2019-12-24 Target distance estimation method and device, and unmanned aerial vehicle WO2020135447A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19905518.7A EP3905197A4 (en) 2018-12-24 2019-12-24 METHOD AND DEVICE FOR TARGET RANGE ESTIMATION AND UNMANNED AIRCRAFT
US17/356,656 US11747833B2 (en) 2018-12-24 2021-06-24 Method and device for estimating distance to target, and unmanned aerial vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811580979.9 2018-12-24
CN201811580979.9A CN109754420B (zh) 2018-12-24 2021-11-12 Target distance estimation method and device, and unmanned aerial vehicle

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/356,656 Continuation US11747833B2 (en) 2018-12-24 2021-06-24 Method and device for estimating distance to target, and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
WO2020135447A1 true WO2020135447A1 (zh) 2020-07-02

Family

ID=66402910

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/128058 WO2020135447A1 (zh) 2018-12-24 2019-12-24 Target distance estimation method and device, and unmanned aerial vehicle

Country Status (4)

Country Link
US (1) US11747833B2 (zh)
EP (1) EP3905197A4 (zh)
CN (1) CN109754420B (zh)
WO (1) WO2020135447A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12014508B2 (en) 2021-10-18 2024-06-18 Ford Global Technologies, Llc Distance determination from image data

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109754420B (zh) 2018-12-24 2021-11-12 深圳市道通智能航空技术股份有限公司 Target distance estimation method and device, and unmanned aerial vehicle
CN110570463B (zh) * 2019-09-11 2023-04-07 深圳市道通智能航空技术股份有限公司 Target state estimation method and device, and unmanned aerial vehicle
WO2021258251A1 (zh) * 2020-06-22 2021-12-30 深圳市大疆创新科技有限公司 Surveying and mapping method for a movable platform, movable platform, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103075998A (zh) * 2012-12-31 2013-05-01 华中科技大学 Monocular space target ranging and angle measurement method
CN106054929A (zh) * 2016-06-27 2016-10-26 西北工业大学 Optical-flow-based automatic landing guidance method for an unmanned aerial vehicle
CN107452037A (zh) * 2017-08-02 2017-12-08 北京航空航天大学青岛研究院 GPS-assisted accelerated structure-from-motion method
CN108476288A (zh) * 2017-05-24 2018-08-31 深圳市大疆创新科技有限公司 Photographing control method and device
CN109754420A (zh) * 2018-12-24 2019-05-14 深圳市道通智能航空技术有限公司 Target distance estimation method and device, and unmanned aerial vehicle

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9342746B1 (en) * 2011-03-17 2016-05-17 UtopiaCompression Corporation Maneuverless passive range estimation using monocular image sequences
CN103604427B (zh) * 2013-12-10 2016-10-12 中国航天空气动力技术研究院 UAV system and method for dynamic positioning of ground moving targets
CN103822615B (zh) * 2014-02-25 2016-01-20 北京航空航天大学 Real-time positioning method for UAV ground targets with automatic extraction and aggregation of multiple control points
KR101614654B1 (ko) * 2015-06-02 2016-04-21 동의대학교 산학협력단 Method for determining the position of a tracked object using a drone equipped with a single camera
EP3353706A4 (en) * 2015-09-15 2019-05-08 SZ DJI Technology Co., Ltd. SYSTEM AND METHOD FOR MONITORING UNIFORM TARGET TRACKING
CN105698762B (zh) * 2016-01-15 2018-02-23 中国人民解放军国防科学技术大学 Rapid target positioning method based on observation points at different times along a single aircraft track
CN106326892B (zh) * 2016-08-01 2020-06-09 西南科技大学 Visual landing pose estimation method for a rotary-wing unmanned aerial vehicle
US10033980B2 (en) * 2016-08-22 2018-07-24 Amazon Technologies, Inc. Determining stereo distance information using imaging devices integrated into propeller blades
CN106373159A (zh) * 2016-08-30 2017-02-01 中国科学院长春光学精密机械与物理研究所 Simplified multi-target positioning method for unmanned aerial vehicles
CN106570820B (zh) * 2016-10-18 2019-12-03 浙江工业大学 Monocular-vision three-dimensional feature extraction method based on a quadrotor UAV
CN107247458A (zh) * 2017-05-24 2017-10-13 中国电子科技集团公司第二十八研究所 UAV video image target positioning system, positioning method, and gimbal control method
CN107729808B (zh) * 2017-09-08 2020-05-01 国网山东省电力公司电力科学研究院 Intelligent image acquisition system and method for UAV inspection of power transmission lines
CN107798685B (zh) * 2017-11-03 2019-12-03 北京旷视科技有限公司 Pedestrian height determination method, device, and system
CN108107920A (zh) * 2017-12-19 2018-06-01 天津工业大学 Miniature dual-axis vision-stabilized gimbal target detection and tracking system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103075998A (zh) * 2012-12-31 2013-05-01 华中科技大学 Monocular space target ranging and angle measurement method
CN106054929A (zh) * 2016-06-27 2016-10-26 西北工业大学 Optical-flow-based automatic landing guidance method for an unmanned aerial vehicle
CN108476288A (zh) * 2017-05-24 2018-08-31 深圳市大疆创新科技有限公司 Photographing control method and device
CN107452037A (zh) * 2017-08-02 2017-12-08 北京航空航天大学青岛研究院 GPS-assisted accelerated structure-from-motion method
CN109754420A (zh) * 2018-12-24 2019-05-14 深圳市道通智能航空技术有限公司 Target distance estimation method and device, and unmanned aerial vehicle

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12014508B2 (en) 2021-10-18 2024-06-18 Ford Global Technologies, Llc Distance determination from image data

Also Published As

Publication number Publication date
CN109754420A (zh) 2019-05-14
EP3905197A4 (en) 2022-02-23
EP3905197A1 (en) 2021-11-03
US20210325912A1 (en) 2021-10-21
CN109754420B (zh) 2021-11-12
US11747833B2 (en) 2023-09-05

Similar Documents

Publication Publication Date Title
WO2020135447A1 (zh) Target distance estimation method and device, and unmanned aerial vehicle
CN108399642B (zh) General target-following method and system fusing IMU data of a rotor unmanned aerial vehicle
WO2020135446A1 (zh) Target positioning method and device, and unmanned aerial vehicle
JP5618840B2 (ja) Flight control system for a flying object
US20200346753A1 (en) Uav control method, device and uav
JP5775632B2 (ja) Flight control system for a flying object
WO2020014909A1 (zh) Photographing method and device, and unmanned aerial vehicle
US11057604B2 (en) Image processing method and device
WO2019113966A1 (zh) Obstacle avoidance method and device, and unmanned aerial vehicle
CN108363946B (zh) Unmanned-aerial-vehicle-based face tracking system and method
WO2018210078A1 (zh) Distance measurement method for unmanned aerial vehicle, and unmanned aerial vehicle
WO2020239094A1 (zh) Focusing method and device, aerial camera, and unmanned aerial vehicle
WO2019126930A1 (zh) Ranging method and device, and unmanned aerial vehicle
WO2019127518A1 (zh) Obstacle avoidance method and device, and movable platform
JP2019008676A (ja) Control device, flying object, and control program
JP2018004420A (ja) Device, mobile object device, positional deviation detection method, and ranging method
WO2020198963A1 (zh) Data processing method and apparatus for a photographing device, and image processing device
WO2019183789A1 (zh) Control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle
WO2021258251A1 (zh) Surveying and mapping method for a movable platform, movable platform, and storage medium
WO2021217371A1 (zh) Control method and apparatus for a movable platform
WO2021056139A1 (zh) Method and device for acquiring landing position, unmanned aerial vehicle, system, and storage medium
WO2022193081A1 (zh) Control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle
WO2020019175A1 (zh) Image processing method and device, camera device, and unmanned aerial vehicle
WO2020019130A1 (zh) Motion estimation method and movable device
WO2019205103A1 (zh) Gimbal attitude correction method, gimbal attitude correction device, gimbal, gimbal system, and unmanned aerial vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19905518

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019905518

Country of ref document: EP

Effective date: 20210726