CN110799801A - Unmanned aerial vehicle-based distance measurement method and device and unmanned aerial vehicle - Google Patents


Info

Publication number
CN110799801A
CN110799801A (application CN201880039803.4A)
Authority
CN
China
Prior art keywords
information
image
unmanned aerial vehicle
preset
Prior art date
Legal status
Pending
Application number
CN201880039803.4A
Other languages
Chinese (zh)
Inventor
翁超
熊川樘
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN110799801A publication Critical patent/CN110799801A/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C11/04 Interpretation of pictures
    • G01C11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/08 Interpretation of pictures by comparison of two or more pictures of the same area the pictures not being supported in the same relative position as when they were taken

Abstract

A distance measurement method and device based on an unmanned aerial vehicle, and an unmanned aerial vehicle. The method comprises: acquiring a first image captured by a photographing device (120) at a first position with respect to a target; controlling the photographing device to move from the first position to a second position according to the position information of the first position; acquiring a second image captured by the photographing device (120) at the second position with respect to the target; and determining depth information of the target according to the distance between the first position and the second position, the first image, and the second image. The first position and the second position are equal in height, and the line connecting them is parallel to the photographing plane imaged for the target when the photographing device is at the first position. Images that in the prior art require two photographing devices at two shooting positions are thus captured by a single photographing device (120), achieving the same effect as binocular ranging while saving ranging cost, so the method can be applied on small unmanned aerial vehicles.

Description

Unmanned aerial vehicle-based distance measurement method and device and unmanned aerial vehicle
Technical Field
The invention relates to the field of distance measurement, in particular to a distance measurement method and device based on an unmanned aerial vehicle and the unmanned aerial vehicle.
Background
In the prior art, a binocular photographing device is used to acquire the depth information of a target. Under the trade-off between image resolution and object-distance accuracy, the baseline between the binocular cameras inevitably grows larger, yet the limited dimensions of a small unmanned aerial vehicle severely restrict where a binocular camera can be mounted. The industry therefore mostly uses large unmanned aerial vehicles for related surveying and mapping, which is costly and hinders adoption. In addition, a binocular arrangement requires two photographing devices to be installed on the unmanned aerial vehicle, which further raises the cost.
Disclosure of Invention
The invention provides a distance measuring method and device based on an unmanned aerial vehicle and the unmanned aerial vehicle.
According to a first aspect of the present invention, there is provided a ranging method based on an unmanned aerial vehicle having a camera mounted thereon, the method comprising:
acquiring a first image shot by the shooting device at a first position aiming at a target;
controlling the shooting device to move from the first position to a second position according to the position information of the first position;
acquiring a second image shot by the shooting device at the second position aiming at the target;
determining depth information of the target according to the distance between the first position and the second position, the first image and the second image;
wherein the first position and the second position are equal in height, and the line connecting the first position and the second position is parallel to the photographing plane imaged for the target when the photographing device is at the first position.
According to a second aspect of the present invention, there is provided a ranging apparatus based on a drone, including a camera mounted on the drone and a processor communicatively connected to the camera, the processor being configured to:
acquiring a first image shot by the shooting device at a first position aiming at a target;
controlling the shooting device to move from the first position to a second position according to the position information of the first position;
acquiring a second image shot by the shooting device at the second position aiming at the target;
determining depth information of the target according to the distance between the first position and the second position, the first image and the second image;
wherein the first position and the second position are equal in height, and the line connecting the first position and the second position is parallel to the photographing plane imaged for the target when the photographing device is at the first position.
According to a third aspect of the invention, there is provided a drone comprising:
a body;
a camera mounted on the body; and
a processor communicatively coupled to the camera, the processor configured to:
acquiring a first image shot by the shooting device at a first position aiming at a target;
controlling the shooting device to move from the first position to a second position according to the position information of the first position;
acquiring a second image shot by the shooting device at the second position aiming at the target;
determining depth information of the target according to the distance between the first position and the second position, the first image and the second image;
wherein the first position and the second position are equal in height, and the line connecting the first position and the second position is parallel to the photographing plane imaged for the target when the photographing device is at the first position.
According to the technical solution provided by the embodiments of the invention, a single photographing device is controlled to move to two shooting positions in turn, an image is acquired at each position, and the depth information of the target is calculated from the two images. What the prior art accomplishes with two photographing devices at two shooting positions is thus accomplished with a single photographing device, achieving the same effect as binocular ranging while saving ranging cost. The ranging method of the embodiments can therefore be applied on small unmanned aerial vehicles and meets the needs of most application scenarios.
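The depth-recovery step described above follows the standard binocular triangulation relation, with the distance between the two shooting positions playing the role of the baseline. As a minimal sketch (the function and parameter names are illustrative, not from the patent), the depth of a matched point is Z = f * B / d:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic stereo triangulation: Z = f * B / d.

    focal_px     -- camera focal length, in pixels
    baseline_m   -- distance between the first and second shooting positions, in metres
    disparity_px -- horizontal pixel shift of the target between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive; zero disparity means the point is at infinity")
    return focal_px * baseline_m / disparity_px

# e.g. an 800 px focal length, a 0.5 m baseline and a 40 px disparity
# give a depth of 800 * 0.5 / 40 = 10 m
```

Note the trade-off the Background section mentions: for a fixed focal length and disparity resolution, a shorter baseline degrades depth accuracy, which is why moving a single camera (allowing an arbitrary baseline) is attractive on a small airframe.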
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings required for describing the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 2 is a flowchart of a method of a ranging method based on an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 3 is a flowchart of a specific method of a ranging method based on an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a position relationship between a target and a drone provided by an embodiment of the present invention;
fig. 5 is a flowchart of another specific method of the unmanned aerial vehicle-based ranging method according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a position relationship between a target and a pan/tilt head according to an embodiment of the present invention;
fig. 7 is a block diagram of a ranging apparatus based on an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 8 is a block diagram of an unmanned aerial vehicle according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The following describes in detail a ranging method and apparatus based on an unmanned aerial vehicle, and an unmanned aerial vehicle according to the present invention, with reference to the accompanying drawings. The features of the following examples and embodiments may be combined with each other without conflict.
Fig. 1 is a schematic view of an unmanned aerial vehicle according to an embodiment of the present invention. The drone 100 may include a carrier 110 and a load 120. In some embodiments, the load 120 may be located directly on the drone 100 without the carrier 110. In this embodiment, the carrier 110 is a gimbal (pan/tilt head), such as a two-axis or three-axis gimbal. The load 120 may be an image capturing device or camera device (e.g., a camera, a camcorder, an infrared camera device, an ultraviolet camera device, or the like), and may provide static sensing data (e.g., pictures) or dynamic sensing data (e.g., videos). The load 120 is mounted on the carrier 110, so that rotation of the load 120 is controlled through the carrier 110.
Further, the drone 100 may include a power mechanism 130, a sensing system 140, and a communication system 150. The power mechanism 130 may include one or more rotors, propellers, blades, motors, electronic speed controllers, and the like. For example, the rotor of the power mechanism may be a self-tightening rotor, a rotor assembly, or another rotor power unit. The drone 100 may have one or more power mechanisms, all of the same type; alternatively, one or more of the power mechanisms may be of a different type. The power mechanism 130 may be mounted on the drone by suitable means, such as through a support element (e.g., a drive shaft), and at any suitable location on the drone 100, such as the top, bottom, front, back, sides, or any combination thereof. The flight of the drone 100 is controlled by controlling one or more of the power mechanisms 130.
The sensing system 140 may include one or more sensors to sense spatial orientation, velocity, and/or acceleration (e.g., rotation and translation with respect to up to three degrees of freedom) of the drone 100. The one or more sensors may include a GPS sensor, a motion sensor, an inertial sensor, a proximity sensor, or an image sensor. The sensed data provided by the sensing system 140 may be used to track the spatial orientation, velocity and/or acceleration of the target (using a suitable processing unit and/or control unit, as described below). Optionally, the sensing system 140 may be used to collect environmental data of the drone, such as climate conditions, potential obstacles to approach, location of geographic features, location of man-made structures, and the like.
The communication system 150 can communicate via wireless signals with a terminal 160 that has a communication module. The communication system 150 and the communication module may each include any number of transmitters, receivers, and/or transceivers for wireless communication. The communication may be one-way, so that data is transmitted in only one direction; for example, only the drone 100 transmits data to the terminal 160, or vice versa. Alternatively, the communication may be two-way, so that data is transmitted in both directions between the drone 100 and the terminal 160: one or more transmitters of the communication system 150 may transmit data to one or more receivers of the communication module, and vice versa.
In some embodiments, the terminal 160 may provide control data to one or more of the drone 100, the carrier 110, and the load 120, and receive information from one or more of the drone 100, the carrier 110, and the load 120 (e.g., position and/or motion information of the drone, the carrier, or the load, load-sensed data, such as image data captured by a camera).
In some embodiments, the drone 100 may communicate with other remote devices than the terminal 160, and the terminal 160 may also communicate with other remote devices than the drone 100. For example, the drone and/or the terminal 160 may communicate with another drone or a bearer or load of another drone. The additional remote device may be a second terminal or other computing device (such as a computer, desktop, tablet, smartphone, or other mobile device) when desired. The remote device may transmit data to the drone 100, receive data from the drone 100, transmit data to the terminal 160, and/or receive data from the terminal 160. Alternatively, the remote device may be connected to the internet or other telecommunications network to enable data received from the drone 100 and/or the terminal 160 to be uploaded to a website or server.
In some embodiments, the movement of the drone 100, the movement of the carrier 110, and the movement of the load 120 relative to a fixed reference (e.g., the external environment) and/or each other may be controlled by the terminal 160. The terminal 160 may be a remote control terminal located remotely from the drone, carrier and/or load. The terminal 160 may be located on or affixed to a support platform. Alternatively, the terminal 160 may be hand-held or wearable. For example, the terminal 160 may include a smartphone, a tablet, a desktop computer, glasses, gloves, a helmet, a microphone, or any combination thereof. The terminal 160 may include a user interface such as a keyboard, mouse, joystick, touch screen, or display. Any suitable user input may interact with the terminal 160, such as manual input commands, voice control, gesture control, or position control (e.g., through movement, position, or tilt of the terminal 160).
In the following embodiments, the unmanned aerial vehicle-based ranging method and apparatus and the unmanned aerial vehicle are described separately by taking the load 120 including the shooting apparatus as an example.
Example one
Fig. 2 is a flowchart of a method of a ranging method based on an unmanned aerial vehicle according to an embodiment of the present invention. The ranging method of this embodiment is executed by the unmanned aerial vehicle; specifically, it may be executed by one or more of a flight controller, a gimbal controller, and other controllers provided on the unmanned aerial vehicle.
Referring to fig. 2, the drone-based ranging method may include the steps of:
step S201: acquiring a first image captured by a capturing device at a first position with respect to a subject (P in fig. 4 and 6);
in this step, the first position may be a position preset by the user, or may be the current position of the unmanned aerial vehicle, and may specifically be set as required. For example, in one embodiment, the first position is predetermined by a user. In this embodiment, a user sets location information of a first location in advance, where the location information of the first location includes geographic location information (i.e., longitude and latitude) and altitude information. Specifically, the position information of the first position can be preset in various ways, and optionally, before the unmanned aerial vehicle is started to execute the ranging program, the position information of the first position is directly input to the unmanned aerial vehicle through a parameter setting module of the unmanned aerial vehicle. Optionally, before starting the drone to execute the ranging procedure, the location information of the first location is input to the drone through a device (such as a remote controller, a terminal, and the like) that controls the drone. Optionally, the device controlling the unmanned aerial vehicle sends a ranging instruction to the unmanned aerial vehicle, the unmanned aerial vehicle is triggered to start a ranging program, and the position information of the first position is carried in the ranging instruction.
In another embodiment, after receiving the ranging start instruction, the unmanned aerial vehicle obtains the position information of its current position and sets that current position, i.e. where the drone is when it receives the ranging start instruction, as the first position. In this embodiment, the position of the drone may be obtained from at least one sensor of the drone: for example, the geographic location information of the current position may be detected by a GPS or other positioning device, and the altitude information of the current position may be detected by a vision module (VO or VIO), a barometer, or another ranging sensor. The GPS or other positioning device may be provided on the drone, on the gimbal, or on the photographing device; likewise, the vision module, barometer, or other ranging sensor may be provided on the drone, on the gimbal, or on the photographing device. In addition, the ranging start instruction of this implementation may be sent by a device (such as a remote controller, a terminal, etc.) that controls the drone.
Step S202: controlling the shooting device to move from the first position to the second position according to the position information of the first position;
in this embodiment, the first position and the second position need to satisfy the position relationship between the two shooting positions when the binocular shooting device shoots, so as to realize the effect that a single shooting device simulates the binocular shooting device, specifically, the position relationship between the first position and the second position needs to satisfy the following condition: the heights of the first position and the second position are equal, and a connecting line of the first position and the second position is parallel to a shooting plane shot for the object when the shooting device is at the first position. Wherein, the height is unmanned aerial vehicle to ground direction's distance, and the shooting plane of shooting the target at the primary importance to shooting the device is when shooting device shoots at the primary importance, the imaging plane of shooting the target.
In this embodiment, the second position may be located on the left side of the first position, or may be located on the right side of the first position, so that a connection line between the first position and the second position is parallel to a shooting plane for shooting the target when the shooting device is at the first position, and specifically, the orientation of the second position relative to the first position may be selected as needed.
Further, the positional relationship between the first position and the second position must also satisfy: the distance between the first position and the second position is less than or equal to a preset distance threshold. Since the two positions are equal in height, the distance between them is their horizontal distance, that is, the binocular optical-center distance (baseline) used to calculate depth information according to the binocular ranging principle. Optionally, the preset distance threshold is the maximum optical-center distance of the two photographing devices in binocular ranging; when the distance between the first position and the second position exceeds this maximum, the depth information of the target can no longer suitably be calculated by the binocular ranging principle. Optionally, the preset distance threshold is smaller than that maximum optical-center distance. The preset distance threshold can be set as required to meet different accuracy requirements.
The second position may be a position preset by a user, or may be a position determined according to the position information of the first position and a preset distance threshold. For example, in one embodiment, the second position is preset by the user. In this embodiment, the user sets the location information of the second location in advance, where the location information of the second location includes geographic location information (i.e., longitude and latitude) and altitude information. Specifically, the position information of the second position can be preset in multiple modes, and optionally, before the unmanned aerial vehicle is started to execute the ranging program, the position information of the second position is directly input to the unmanned aerial vehicle through a parameter setting module of the unmanned aerial vehicle. Optionally, before starting the drone to execute the ranging procedure, the location information of the second location is input to the drone through a device (such as a remote controller, a terminal, etc.) controlling the drone. Optionally, the device controlling the unmanned aerial vehicle sends a ranging start instruction to the unmanned aerial vehicle, the unmanned aerial vehicle is triggered to start a ranging program, and the position information of the second position is carried in the ranging start instruction.
In another embodiment, the second position is a position determined according to the position information of the first position and a preset distance threshold. Alternatively, after the position information of the first position is determined, a position (hereinafter referred to as a left shooting limit position) which is located on the left side of the first position and a distance to the first position is equal to a preset distance threshold value or any position on a line connecting the first position to the left shooting limit position is selected as the second position. Alternatively, after the position information of the first position is determined, a position (hereinafter referred to as a right shooting limit position) which is located on the right side of the first position and whose distance to the first position is equal to a preset distance threshold or any position on a line connecting the first position to the right shooting limit position is selected as the second position.
In a possible implementation, the first position and the second position are both preset by a user. In another possible implementation manner, the first position is preset by a user, and the second position is a position determined according to the position information of the first position and a preset distance threshold. In another possible implementation manner, the first position is a position at which the unmanned aerial vehicle receives the ranging instruction, and the second position is a position determined according to the position information of the first position and a preset distance threshold.
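In the implementations where the second position is derived rather than preset, it can be obtained by offsetting the first position horizontally, at equal height, by at most the preset distance threshold, along the direction perpendicular to the camera's optical axis (so the connecting line stays parallel to the photographing plane). A minimal sketch under assumed east-north-up coordinates (all names are illustrative, not from the patent):

```python
import math

def second_position(first_pos, yaw_rad, baseline_m, max_baseline_m, left=True):
    """Derive a candidate second shooting position from the first.

    first_pos      -- (east, north, up) of the first position, in metres
    yaw_rad        -- camera heading, measured clockwise from north;
                      the optical axis points along this heading
    baseline_m     -- desired horizontal offset between the two positions
    max_baseline_m -- the preset distance threshold
    left           -- offset to the camera's left if True, right if False
    """
    b = min(baseline_m, max_baseline_m)      # never exceed the threshold
    sign = 1.0 if left else -1.0
    # unit vector perpendicular to the optical axis, in the horizontal plane
    dx = -math.cos(yaw_rad) * sign
    dy = math.sin(yaw_rad) * sign
    e, n, u = first_pos
    return (e + b * dx, n + b * dy, u)       # equal height: u is unchanged
```

For example, with the camera facing north (`yaw_rad = 0`), a 0.5 m left offset moves the drone 0.5 m west while keeping its altitude, satisfying both the equal-height and the parallel-baseline conditions.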
After determining the position information of the first position and the position information of the second position, step S202 may be performed.
After step S202 is performed, the photographing device should ideally be located at the second position. However, owing to limited control accuracy, there may be a deviation between the actual position of the photographing device and the second position. If the deviation is too large, that is, if the horizontal distance and/or height difference between the drone's actual position and the first position is too large, the depth information of the target cannot be calculated. In that case the actual position of the photographing device needs to be adjusted so that its positional relationship with the first position meets the requirements of the depth calculation.
Specifically, in some embodiments, after step S202 is executed, the actual position information of the photographing device at the second position is obtained, then the horizontal distance between the first position and the second position is determined according to the position information of the first position and the actual position information of the second position, and then it is determined whether the horizontal distance between the first position and the second position is greater than the preset distance threshold. When the horizontal distance is larger than a preset distance threshold, controlling the shooting device to translate towards the first position, so that the horizontal distance from the shooting device to the first position is smaller than or equal to the preset distance threshold; when the horizontal distance between the first position and the second position is smaller than or equal to the preset distance threshold, the position of the shooting device in the horizontal direction (vertical to the direction from the shooting device to the ground) does not need to be adjusted.
In some embodiments, after step S202 is performed, the actual position information of the photographing device at the second position is obtained, then the height difference between the first position and the second position is determined according to the position information of the first position and the actual position information of the second position, and then whether the height difference between the first position and the second position is zero is determined. When the height difference between the first position and the second position is not equal to zero, adjusting the height of the shooting device to enable the height of the shooting device to be equal to that of the first position; when the height difference between the first position and the second position is zero, the height of the shooting device does not need to be adjusted. In other embodiments, when the height difference between the first position and the second position is within the preset height difference threshold range, the actual position of the camera after step S202 is performed does not need to be adjusted; when the height difference between the first position and the second position exceeds the preset height difference threshold, the actual position of the camera after the step S202 is performed needs to be adjusted.
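The two checks described above (horizontal distance against the preset distance threshold, height difference against zero or a preset height-difference threshold) can be sketched as a single validation step; the function and parameter names below are illustrative assumptions, not from the patent:

```python
import math

def position_ok(first_pos, actual_pos, max_baseline_m, max_height_diff_m=0.0):
    """Check whether the camera's actual position after the move still
    satisfies the depth-calculation constraints.

    first_pos / actual_pos -- (east, north, up) coordinates in metres
    max_baseline_m         -- the preset distance threshold
    max_height_diff_m      -- preset height-difference threshold (0 = strictly equal height)

    Returns (horizontal_ok, height_ok), so the controller can decide whether
    to translate back toward the first position and/or correct the altitude.
    """
    dx = actual_pos[0] - first_pos[0]
    dy = actual_pos[1] - first_pos[1]
    horizontal = math.hypot(dx, dy)                  # baseline actually flown
    height_diff = abs(actual_pos[2] - first_pos[2])
    return horizontal <= max_baseline_m, height_diff <= max_height_diff_m
```

If `horizontal_ok` is false, the drone translates toward the first position until the baseline is within the threshold; if `height_ok` is false, its altitude is adjusted to match the first position.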
In this embodiment, different implementations may be adopted to control the photographing device to move from the first position to the second position. For example, in one implementation, referring to fig. 3 and 4, the unmanned aerial vehicle (reference numeral 100 in fig. 4) is controlled to move according to the position information of the first position, so that its movement carries the photographing device from the first position to the second position. The drone may be controlled to move in ways that include, but are not limited to, the following:
(1) The current movement speed of the unmanned aerial vehicle is obtained first; the flight direction and flight duration are then determined from the current movement speed and the position information of the first and second positions; finally, the drone is controlled to translate at the current movement speed along that flight direction for that duration. In this mode, the flight direction is determined from the position information of the first and second positions. To determine the flight duration, the distance between the first and second positions is first determined from their position information, and the duration is then determined from that distance and the current movement speed. Optionally, the flight duration equals the distance between the first position and the second position divided by the current movement speed.
(2) The current movement speed of the unmanned aerial vehicle is obtained first; the flight duration is then determined from the preset distance threshold and the current movement speed; the drone is then controlled to translate to the left or right of the first position at the current movement speed for that duration. Optionally, the flight duration equals the preset distance threshold divided by the current movement speed. Optionally, the flight duration is any value greater than zero and less than the preset distance threshold divided by the current movement speed.
(3) The flight direction and flight duration are first determined from a preset speed and the position information of the first and second positions; the drone is then controlled to translate at the preset speed along that flight direction for that duration. In this mode, the flight direction is determined from the position information of the first and second positions. To determine the flight duration, the distance between the first and second positions is first determined from their position information, and the duration is then determined from that distance and the preset speed. Optionally, the flight duration equals the distance between the first position and the second position divided by the preset speed. The preset speed can be set according to actual requirements.
(4) Firstly, determining flight time according to a preset speed and a preset distance threshold, and then controlling the unmanned aerial vehicle to translate towards the left or the right relative to the first position according to the flight time. Optionally, the time of flight is a preset distance threshold/preset speed. Optionally, the flight time is greater than zero and less than any value between the preset distance thresholds/preset speed. In addition, the preset speed can be set according to actual requirements.
(5) And controlling the unmanned aerial vehicle to translate relative to the first position according to the preset speed and the preset duration. The unmanned aerial vehicle can be controlled to translate towards the left or the right relative to the first position according to the preset speed and the preset duration. In this implementation, the distance between the position reached by the unmanned aerial vehicle after the unmanned aerial vehicle translates relative to the first position according to the preset speed and the preset duration and the position before the unmanned aerial vehicle translates is less than or equal to the preset distance threshold.
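The duration arithmetic in modes (1)-(4) above reduces to dividing the (possibly threshold-capped) baseline distance by the chosen speed. The following sketch is purely illustrative and not part of the patent; the horizontal coordinate format and the function name are assumptions:

```python
import math

def flight_plan(first_pos, second_pos, speed, max_distance=None):
    """Return (direction_unit_vector, duration) for a horizontal translation.

    first_pos/second_pos: (x, y) horizontal coordinates in metres (assumed format).
    speed: movement speed in m/s.
    max_distance: optional preset distance threshold in metres.
    """
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    distance = math.hypot(dx, dy)
    if max_distance is not None:
        # keep the baseline within the preset distance threshold
        distance = min(distance, max_distance)
    direction = (dx / distance, dy / distance) if distance > 0 else (0.0, 0.0)
    duration = distance / speed  # flight duration = distance / speed
    return direction, duration
```

Mode (2), for instance, would correspond to calling this with `max_distance` set to the preset distance threshold and `speed` set to the current movement speed.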
In another implementation, the shooting device is mounted on the unmanned aerial vehicle through a cradle head. When the shooting device is controlled to move from the first position to the second position according to the position information of the first position, in combination with fig. 5 and 6, the cradle head (reference numeral 110 in fig. 6) is controlled to move according to the position information of the first position while the unmanned aerial vehicle is in a static state, so that the shooting device moves from the first position to the second position. Specifically, the cradle head is carried on the unmanned aerial vehicle through a power device movable in the yaw direction, and the cradle head is translated by controlling the movement of the power device. In this embodiment, the power device is controlled to move in the yaw direction, so as to move the cradle head in the yaw direction and drive the shooting device from the first position to the second position. The power device can be any power structure, such as a motor.
Further, the manner of controlling the movement of the pan/tilt head may include, but is not limited to, the following manners:
(1) First determine the movement direction and the movement duration of the power device according to a preset speed and the position information of the first position and the second position, and then control the power device to translate at the preset speed in the determined movement direction for the movement duration. In this manner, the movement direction is determined from the position information of the first position and the second position. To determine the movement duration, the distance between the first position and the second position is first determined from their position information, and the movement duration is then determined from that distance and the preset speed. Optionally, the movement duration equals the distance between the first position and the second position divided by the preset speed. In this implementation, the preset speed can be set according to actual requirements.
(2) First determine the movement duration of the power device according to a preset speed and a preset distance threshold, and then control the power device to translate to the left or to the right relative to the first position at the preset speed for the movement duration. Optionally, the movement duration equals the preset distance threshold divided by the preset speed. Optionally, the movement duration is any value greater than zero and less than the preset distance threshold divided by the preset speed. In this implementation, the preset speed can be set according to actual requirements.
(3) Control the power device to translate relative to the first position at a preset speed for a preset duration. For example, the power device may be controlled to translate to the left or to the right relative to the first position at the preset speed for the preset duration. In this implementation, the distance between the position the power device reaches after translating at the preset speed for the preset duration and its position before the translation is less than or equal to the preset distance threshold.
Step S203: acquiring a second image shot by the shooting device at a second position aiming at the target;
after step S203 is executed, a single shooting device has captured the images that the prior art obtains with two shooting devices at two shooting positions, which saves the cost of distance measurement.
Step S204: determining the depth information of the target according to the distance between the first position and the second position, the first image and the second image;
specifically, when step S204 is executed, the focal length of the camera is obtained, the parallax between the first image and the second image is determined according to the first image and the second image, and then the depth information of the target is determined according to the distance between the first position and the second position, the focal length of the camera, and the parallax. In this embodiment, the calculation formula of the depth information Z of the target is as follows:
Z = f × B / X_{R-T}    (1)
In formula (1), f is the focal length of the shooting device, B is the distance between the first position and the second position, and X_{R-T} is the parallax between the first image and the second image.
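As a minimal illustration of formula (1) — the function name and unit choices are assumptions, not part of the patent:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Z = f * B / X_{R-T}: depth of the target per formula (1).

    focal_px: focal length f of the shooting device, in pixels.
    baseline_m: distance B between the first and second positions, in metres.
    disparity_px: parallax X_{R-T} between the two images, in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

For example, with f = 700 px, B = 0.5 m and a parallax of 35 px, the target depth is 10 m; note that a larger parallax corresponds to a nearer target.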
The step of obtaining the focal length of the shooting device and the step of determining the parallax between the first image and the second image according to the first image and the second image can be executed simultaneously or sequentially.
In this embodiment, the focal length of the shooting device is determined by calibrating the shooting device, which is prior art and is not described herein again. In addition, the parallax between the first image and the second image is obtained by performing binocular matching on the first image and the second image, that is, by matching corresponding image points of the same scene in the first image and the second image.
In some embodiments, before binocular matching is performed on the first image and the second image to obtain the parallax between them, the shooting device is calibrated to obtain its internal reference data, and binocular correction processing is then performed on the first image and the second image according to the internal reference data. The binocular correction eliminates the distortion of the first image and the second image and aligns them row by row, so that the coordinates of their imaging origins are consistent, the optical axes of the shooting device at the first position and the second position are parallel, the left and right imaging planes are coplanar, and the epipolar lines are aligned, which facilitates matching the corresponding image points in the first image and the second image. The internal reference data may include the focal length f of the shooting device, the imaging origin, five distortion parameters, and external parameters.
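To make the binocular-matching step concrete, here is a deliberately simple sum-of-absolute-differences (SAD) block matcher for rectified, row-aligned grayscale images. This is an illustrative sketch only — production systems typically use more robust matchers such as semi-global matching, and every name below is assumed:

```python
import numpy as np

def sad_disparity(left, right, patch=3, max_disp=16):
    """Per-pixel disparity of `left` with respect to `right` by SAD block matching.

    Assumes both images are already rectified so that corresponding points
    lie on the same row -- the purpose of the binocular correction above.
    """
    h, w = left.shape
    half = patch // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            block = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.int32)
            best_cost, best_d = None, 0
            # search only along the same row, over candidate disparities d
            for d in range(0, min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(np.int32)
                cost = np.abs(block - cand).sum()
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Because the images are rectified, the search for each match is restricted to a single row, which is exactly the property the binocular correction described above provides.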
Furthermore, a sensing unit is provided on the unmanned aerial vehicle. In this embodiment, the ranging method based on the unmanned aerial vehicle further includes: acquiring the distance information from the target to the unmanned aerial vehicle based on the sensing unit, and adjusting the depth information of the target according to the distance information, thereby improving the ranging precision. Optionally, the distance information and the depth information are fused to determine the final depth information of the target. The fusion may be performed in various manners, for example, taking the average of the distance information and the depth information as the final depth information of the target, or taking a weighted average of the distance information and the depth information as the final depth information of the target.
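The fusion step can be as simple as the (weighted) average the text mentions; a sketch with an assumed weighting parameter:

```python
def fuse_depth(distance_info, depth_info, weight=0.5):
    """Fuse the sensing unit's distance and the stereo depth into final depth.

    weight: confidence assigned to the sensing unit's distance_info
    (an assumed parameter); weight=0.5 reduces to the plain average.
    """
    return weight * distance_info + (1.0 - weight) * depth_info
```

A laser rangefinder with tighter error bounds than the stereo estimate would, for instance, be given a weight above 0.5.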
Further, the depth information determined in step S204 may be verified based on the distance information obtained by the sensing unit. In some embodiments, after the distance information from the target to the unmanned aerial vehicle is acquired based on the sensing unit and before the depth information of the target is adjusted according to the distance information, it is determined whether the difference between the distance information and the depth information of the target is less than or equal to a difference threshold. If so, the depth information of the target (the depth information determined in step S204) is determined to be valid information with a small error, and is adjusted according to the distance information; if not, the depth information of the target (the depth information determined in step S204) is determined to be invalid information with a large error. The sensing unit of this embodiment may include a laser ranging sensor, and may also include other ranging sensors.
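The validity check described here is a plain threshold comparison; an illustrative helper (names assumed, not from the patent):

```python
def depth_is_valid(distance_info, depth_info, diff_threshold):
    """Return True if the stereo depth agrees with the sensing unit's
    distance to within diff_threshold; otherwise the depth is treated
    as invalid information and is not adjusted."""
    return abs(distance_info - depth_info) <= diff_threshold
```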
In the embodiment of the invention, a single shooting device is controlled to move to two shooting positions and capture an image at each, so that the depth information of the target can be calculated. The images that the prior art captures with two shooting devices at two shooting positions are thus obtained by a single shooting device, achieving the same effect as binocular distance measurement while saving cost, so the distance measurement method can be applied to small unmanned aerial vehicles and meets the requirements of most application scenarios.
Example two
Referring to fig. 7, a second embodiment of the present invention provides a ranging device based on an unmanned aerial vehicle, where the ranging device may include a shooting device and a first processor (a single-core or multi-core processor), where the shooting device is mounted on the unmanned aerial vehicle, and the first processor is in communication connection with the shooting device.
The first processor of the present embodiment is configured to execute the drone-based ranging method shown in fig. 2, 3, and 5.
Specifically, the first processor is configured to: the method comprises the steps of acquiring a first image shot by a shooting device at a first position aiming at a target, controlling the shooting device to move from the first position to a second position according to position information of the first position, acquiring a second image shot by the shooting device at the second position aiming at the target, and determining depth information of the target according to the distance between the first position and the second position, the first image and the second image. Wherein the first position and the second position are equal in height, and a line connecting the first position and the second position is parallel to a shooting plane shot for the object when the shooting device is at the first position.
It should be noted that, for the specific implementation of the first processor in the embodiment of the present invention, reference may be made to the description of corresponding contents in the first embodiment, which is not described herein again.
The first processor may comprise a combination of one or more of a flight controller of the drone, a pan-tilt controller, other controllers provided on the drone.
The first processor may be a Central Processing Unit (CPU). The first processor may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
EXAMPLE III
With reference to fig. 1 and 8, a third embodiment of the present invention provides a drone, which may include a main body, a camera mounted on the main body, and a second processor (a single-core or multi-core processor), where the second processor is in communication connection with the camera.
The second processor of the present embodiment is configured to execute the drone-based ranging method shown in fig. 2, 3, and 5.
Specifically, the second processor is configured to: the method comprises the steps of acquiring a first image shot by a shooting device at a first position aiming at a target, controlling the shooting device to move from the first position to a second position according to position information of the first position, acquiring a second image shot by the shooting device at the second position aiming at the target, and determining depth information of the target according to the distance between the first position and the second position, the first image and the second image. Wherein the first position and the second position are equal in height, and a line connecting the first position and the second position is parallel to a shooting plane shot for the object when the shooting device is at the first position.
It should be noted that, for the specific implementation of the second processor in the embodiment of the present invention, reference may be made to the description of corresponding contents in the first embodiment, which is not described herein again.
The second processor may comprise a combination of one or more of a flight controller of the drone, a pan-tilt controller, other controllers provided on the drone.
Optionally, the shooting device is mounted on the unmanned aerial vehicle through the cradle head. When the second processor is the flight controller, the first image and the second image acquired by the shooting device are sent directly to the flight controller; of course, the first image and the second image acquired by the shooting device can also be forwarded to the flight controller through the cradle head.
The second processor may be a Central Processing Unit (CPU). The second processor may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
In addition, the unmanned aerial vehicle of this embodiment further includes a sensing unit, which may include a laser ranging sensor or other ranging sensors. The depth information of the target can be verified according to the distance information obtained by the sensing unit, improving the ranging precision.
The unmanned aerial vehicle of this embodiment may be a multi-rotor unmanned aerial vehicle, or may be a rotorless unmanned aerial vehicle.
Example four
A fourth embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the unmanned aerial vehicle-based ranging method described in the first embodiment.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above disclosure is intended to be illustrative of only some embodiments of the invention, and is not intended to limit the scope of the invention.

Claims (69)

1. A distance measurement method based on an unmanned aerial vehicle, wherein a shooting device is carried on the unmanned aerial vehicle, the method comprising:
acquiring a first image shot by the shooting device at a first position aiming at a target;
controlling the shooting device to move from the first position to a second position according to the position information of the first position;
acquiring a second image shot by the shooting device at the second position aiming at the target;
determining depth information of the target according to the distance between the first position and the second position, the first image and the second image;
wherein the first position and the second position are equal in height, and a line connecting the first position and the second position is parallel to a photographing plane photographed by the photographing apparatus for the target when in the first position.
2. The method of claim 1, wherein the first location is a user-preset location; alternatively,
the first position is a current position of the drone.
3. The method of claim 1, wherein the location information of the first location comprises:
geographic location information based on GPS detection; and
based on altitude information detected by a vision module or barometer.
4. The method of claim 3, wherein the camera is mounted on an unmanned aerial vehicle via a pan-tilt head, and the GPS is located on the unmanned aerial vehicle, the pan-tilt head, or the camera; and/or,
the vision module or the barometer is arranged on the unmanned aerial vehicle, the holder or the shooting device.
5. The method of claim 1, wherein a distance between the first location and the second location is less than or equal to a preset distance threshold.
6. The method of claim 5, wherein the second location is a user-preset location; alternatively,
the second position is determined according to the position information of the first position and the preset distance threshold.
7. The method according to claim 6, wherein after controlling the camera to move from the first position to the second position according to the position information of the first position, the method further comprises:
acquiring actual position information of the shooting device at the second position;
determining a horizontal distance between the first position and the second position according to the position information of the first position and the actual position information of the second position;
when the horizontal distance is larger than the preset distance threshold, controlling the shooting device to translate towards the first position, so that the horizontal distance from the shooting device to the first position is smaller than or equal to the preset distance threshold.
8. The method according to claim 6, wherein after controlling the camera to move from the first position to the second position according to the position information of the first position, the method further comprises:
acquiring actual position information of the shooting device at the second position;
determining a height difference between the first position and the second position according to the position information of the first position and the actual position information of the second position;
when the height difference is not equal to zero, adjusting the height of the shooting device so that the height of the shooting device is equal to that of the first position.
9. The method of claim 1 or 2, wherein the second position is located to the left or right of the first position.
10. The method of claim 1, wherein the controlling the camera to move from the first position to a second position according to the position information of the first position comprises:
according to the position information of the first position, the unmanned aerial vehicle is controlled to move, so that the shooting device moves from the first position to the second position.
11. The method of claim 10, wherein controlling the drone to move according to the location information of the first location comprises:
acquiring the current movement speed of the unmanned aerial vehicle;
determining a flight direction and a flight duration according to the current movement speed and the position information of the first position and the second position;
controlling the unmanned aerial vehicle to translate according to the current movement speed according to the flight direction and the flight duration;
alternatively,
acquiring the current movement speed of the unmanned aerial vehicle;
determining flight time according to the current movement speed and a preset distance threshold;
controlling the unmanned aerial vehicle to translate towards the left or the right relative to the first position according to the flight time length and the current movement speed;
alternatively,
determining a flight direction and a flight duration according to a preset speed and the position information of the first position and the second position;
controlling the unmanned aerial vehicle to translate according to the preset speed according to the flight direction and the flight duration;
alternatively,
determining flight time according to a preset speed and a preset distance threshold;
controlling the unmanned aerial vehicle to translate towards the left or the right relative to the first position according to the preset speed according to the flight time;
alternatively,
and controlling the unmanned aerial vehicle to translate relative to the first position according to a preset speed and a preset duration.
12. The method of claim 1, wherein the camera is mounted on the drone through a pan-tilt head;
the controlling the photographing device to move from the first position to the second position according to the position information of the first position comprises:
when the unmanned aerial vehicle is in a static state, the cradle head is controlled to move according to the position information of the first position, so that the shooting device moves from the first position to the second position.
13. The method of claim 12, wherein said pan/tilt head is carried on said drone by a power device movable in yaw;
the controlling the pan-tilt movement includes:
and controlling the power device to move so as to control the holder to translate.
14. The method of claim 13, wherein controlling the pan/tilt head movement according to the position information of the first position comprises:
determining the movement direction and the movement duration of the power device according to the preset speed and the position information of the first position and the second position;
controlling the power device to translate according to the preset speed according to the movement direction and the movement duration;
alternatively,
determining the movement time length of the power device according to a preset speed and a preset distance threshold;
controlling the power device to translate towards the left or the right relative to the first position according to the preset speed according to the movement duration;
alternatively,
and controlling the power device to translate relative to the first position according to a preset speed and a preset time length.
15. The method according to claim 10 or 12, wherein a sensing unit is provided on the drone; the method further comprises the following steps:
acquiring distance information from the target to the unmanned aerial vehicle based on the sensing unit;
and adjusting the depth information of the target according to the distance information.
16. The method of claim 15, wherein before adjusting the depth information of the target according to the distance information, further comprising:
and determining that the difference between the distance information and the depth information of the target is smaller than or equal to a preset difference threshold.
17. The method of claim 15, wherein after the obtaining the distance information of the target to the drone based on the sensing unit, further comprising:
and when the difference between the distance information and the depth information of the target is determined to be larger than a preset difference threshold value, determining the depth information of the target to be invalid information.
18. The method of claim 15, wherein the adjusting the depth information of the target according to the distance information comprises:
and carrying out fusion processing on the distance information and the depth information to determine final depth information of the target.
19. The method of claim 15, wherein the sensing unit comprises a laser ranging sensor.
20. The method of claim 1, wherein determining the depth information of the target according to the distance between the first location and the second location and the first image and the second image comprises:
acquiring the focal length of the shooting device;
determining a disparity between the first image and the second image from the first image and the second image;
and determining the depth information of the target according to the distance between the first position and the second position, the focal length of the shooting device and the parallax.
21. The method of claim 20, wherein the obtaining the focal length of the camera comprises:
and calibrating the shooting device, and determining the focal length of the shooting device.
22. The method of claim 20, wherein determining the disparity between the first image and the second image from the first image and the second image comprises:
and carrying out binocular matching on the first image and the second image to obtain the parallax between the first image and the second image.
23. The method of claim 22, wherein the binocular matching the first image and the second image, prior to obtaining the disparity between the first image and the second image, comprises:
calibrating the shooting device to obtain internal reference data of the shooting device;
and performing binocular correction processing on the first image and the second image according to the internal reference data.
24. A distance measuring device based on an unmanned aerial vehicle, comprising a shooting device and a processor, wherein the shooting device is carried on the unmanned aerial vehicle, the processor is in communication connection with the shooting device, and the processor is configured to:
acquiring a first image shot by the shooting device at a first position aiming at a target;
controlling the shooting device to move from the first position to a second position according to the position information of the first position;
acquiring a second image shot by the shooting device at the second position aiming at the target;
determining depth information of the target according to the distance between the first position and the second position, the first image and the second image;
wherein the first position and the second position are equal in height, and a line connecting the first position and the second position is parallel to a photographing plane photographed by the photographing apparatus for the target when in the first position.
25. The apparatus of claim 24, wherein the first position is a user preset position; alternatively,
the first position is a current position of the drone.
26. The apparatus of claim 24, wherein the location information of the first location comprises:
geographic location information based on GPS detection; and
based on altitude information detected by a vision module or barometer.
27. The device of claim 26, wherein the camera is mounted on an unmanned aerial vehicle via a cradle head, and the GPS is provided on the unmanned aerial vehicle, the cradle head, or the camera; and/or,
the vision module or the barometer is arranged on the unmanned aerial vehicle, the holder or the shooting device.
28. The apparatus of claim 24, wherein a distance between the first position and the second position is less than or equal to a preset distance threshold.
29. The apparatus of claim 28, wherein the second position is a user preset position; alternatively,
the second position is determined according to the position information of the first position and the preset distance threshold.
30. The apparatus of claim 29, wherein the processor is further configured to, after controlling the camera to move from the first position to the second position according to the position information of the first position:
acquiring actual position information of the shooting device at the second position;
determining a horizontal distance between the first position and the second position according to the position information of the first position and the actual position information of the second position;
when the horizontal distance is larger than the preset distance threshold, controlling the shooting device to translate towards the first position, so that the horizontal distance from the shooting device to the first position is smaller than or equal to the preset distance threshold.
31. The apparatus of claim 29, wherein the processor is further configured to, after controlling the camera to move from the first position to the second position according to the position information of the first position:
acquiring actual position information of the shooting device at the second position;
determining a height difference between the first position and the second position according to the position information of the first position and the actual position information of the second position;
when the height difference is not equal to zero, adjusting the height of the shooting device so that the height of the shooting device is equal to that of the first position.
32. The device of claim 24 or 25, wherein the second position is located to the left or right of the first position.
33. The apparatus of claim 24, wherein the processor is specifically configured to:
according to the position information of the first position, the unmanned aerial vehicle is controlled to move, so that the shooting device moves from the first position to the second position.
34. The apparatus of claim 33, wherein the processor is specifically configured to:
acquiring the current movement speed of the unmanned aerial vehicle;
determining a flight direction and a flight duration according to the current movement speed and the position information of the first position and the second position;
controlling the unmanned aerial vehicle to translate according to the current movement speed according to the flight direction and the flight duration;
alternatively,
acquiring the current movement speed of the unmanned aerial vehicle;
determining flight time according to the current movement speed and a preset distance threshold;
controlling the unmanned aerial vehicle to translate towards the left or the right relative to the first position according to the flight time length and the current movement speed;
alternatively,
determining a flight direction and a flight duration according to a preset speed and the position information of the first position and the second position;
controlling the unmanned aerial vehicle to translate according to the preset speed according to the flight direction and the flight duration;
alternatively,
determining flight time according to a preset speed and a preset distance threshold;
controlling the unmanned aerial vehicle to translate towards the left or the right relative to the first position according to the preset speed according to the flight time;
alternatively,
controlling the unmanned aerial vehicle to translate relative to the first position according to a preset speed and a preset duration.
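The first alternative of claim 34 (deriving a flight direction and duration from the two positions and the current movement speed) can be sketched as follows; the function name and the local-frame coordinate convention are assumptions made for illustration, not taken from the patent:

```python
import math

def plan_translation(first_pos, second_pos, speed_mps):
    """Derive a horizontal flight direction (unit vector) and a flight
    duration from two positions and a movement speed.

    Positions are (x, y) coordinates in metres in a local level frame and
    speed is in metres per second; both conventions are assumptions made
    for this illustration.
    """
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    distance = math.hypot(dx, dy)  # horizontal baseline length
    if distance == 0.0 or speed_mps <= 0.0:
        return (0.0, 0.0), 0.0
    direction = (dx / distance, dy / distance)
    duration = distance / speed_mps
    return direction, duration

# A 5 m baseline flown at 2.5 m/s takes 2 s, in direction (0.6, 0.8).
direction, duration = plan_translation((0.0, 0.0), (3.0, 4.0), 2.5)
```

The remaining alternatives of the claim reduce to the same arithmetic with a preset speed, a preset distance threshold, or a preset duration substituted for the measured quantities.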
35. The apparatus of claim 24, wherein the camera is mounted on the drone through a cradle head;
the processor is specifically configured to:
when the unmanned aerial vehicle is in a static state, the cradle head is controlled to move according to the position information of the first position, so that the shooting device moves from the first position to the second position.
36. The apparatus of claim 35, wherein the cradle head is carried on the drone by a power device movable in the yaw direction;
the processor is specifically configured to:
controlling the power device to move so as to control the cradle head to translate.
37. The apparatus of claim 36, wherein the processor is specifically configured to:
determining the movement direction and the movement duration of the power device according to the preset speed and the position information of the first position and the second position;
controlling the power device to translate according to the preset speed according to the movement direction and the movement duration;
alternatively,
determining the movement time length of the power device according to a preset speed and a preset distance threshold;
controlling the power device to translate towards the left or the right relative to the first position according to the preset speed according to the movement duration;
alternatively,
controlling the power device to translate relative to the first position according to a preset speed and a preset time length.
38. The apparatus of claim 33 or 35, wherein the drone is provided with a sensing unit; the processor is further configured to:
acquiring distance information from the target to the unmanned aerial vehicle based on the sensing unit;
adjusting the depth information of the target according to the distance information.
39. The apparatus of claim 38, wherein the processor, prior to adjusting the depth information of the target based on the distance information, is further configured to:
determining that the difference between the distance information and the depth information of the target is smaller than or equal to a preset difference threshold.
40. The apparatus of claim 38, wherein the processor, after obtaining the distance information from the target to the drone based on the sensing unit, is further configured to:
when the difference between the distance information and the depth information of the target is determined to be greater than a preset difference threshold, determining the depth information of the target to be invalid information.
41. The apparatus of claim 38, wherein the processor is specifically configured to:
carrying out fusion processing on the distance information and the depth information to determine final depth information of the target.
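The patent does not fix a fusion method for claim 41; one common choice is an inverse-variance weighted average, sketched below. The function name and the variance values are assumptions for illustration only:

```python
def fuse_depth(stereo_depth_m, laser_distance_m,
               stereo_var=1.0, laser_var=0.25):
    """Inverse-variance weighted fusion of a stereo depth estimate and a
    laser range reading: the measurement with the smaller assumed
    variance receives the larger weight. The weighting scheme and the
    default variances are illustrative assumptions."""
    w_stereo = 1.0 / stereo_var
    w_laser = 1.0 / laser_var
    return ((w_stereo * stereo_depth_m + w_laser * laser_distance_m)
            / (w_stereo + w_laser))

# The lower-variance laser reading dominates the fused estimate:
# 0.2 * 10.5 + 0.8 * 10.0 = 10.1 m
fused = fuse_depth(10.5, 10.0)
```

This is consistent with claims 39 and 40: fusion is only meaningful when the two measurements agree to within a threshold; otherwise the stereo estimate is discarded as invalid.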
42. The apparatus of claim 38, wherein the sensing unit comprises a laser ranging sensor.
43. The apparatus of claim 24, wherein the processor is specifically configured to:
acquiring the focal length of the shooting device;
determining a disparity between the first image and the second image from the first image and the second image;
determining the depth information of the target according to the distance between the first position and the second position, the focal length of the shooting device, and the parallax.
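The determination in claim 43 corresponds to the standard pinhole stereo triangulation relation Z = B·f/d (depth = baseline × focal length / disparity). A minimal sketch, with a hypothetical function name not taken from the patent:

```python
def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Standard stereo triangulation: Z = B * f / d, where B is the
    distance between the two shooting positions in metres, f the focal
    length in pixels and d the disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

# 0.5 m baseline, 1000 px focal length, 25 px disparity -> 20 m depth.
z = depth_from_disparity(0.5, 1000.0, 25.0)
```

The relation also explains the preset distance threshold of claim 28/51: a short baseline keeps the two views overlapping, at the cost of depth resolution at long range.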
44. The apparatus of claim 43, wherein the processor is specifically configured to:
calibrating the shooting device and determining the focal length of the shooting device.
45. The apparatus of claim 43, wherein the processor is specifically configured to:
carrying out binocular matching on the first image and the second image to obtain the parallax between the first image and the second image.
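The binocular matching of claim 45 can be illustrated with a toy sum-of-absolute-differences (SAD) block matcher on a single rectified scanline; production matchers operate on 2-D windows of rectified image pairs, and every name below is an assumption for illustration:

```python
def row_disparity(left_row, right_row, window=1, max_disp=4):
    """Toy block matching on one rectified scanline: for each left-row
    pixel, search the horizontal shift into the right row that minimises
    the sum of absolute differences (SAD) over a small window. A sketch
    of the principle only, not a production stereo matcher."""
    n = len(left_row)
    disparities = []
    for x in range(window, n - window):
        best_d, best_cost = 0, float("inf")
        for d in range(0, min(max_disp, x - window) + 1):
            cost = sum(abs(left_row[x + k] - right_row[x - d + k])
                       for k in range(-window, window + 1))
            if cost < best_cost:
                best_cost, best_d = cost, d
        disparities.append(best_d)
    return disparities

# A step edge at index 4 in the left row appears at index 2 in the
# right row, i.e. a disparity of 2 pixels at the edge.
left = [0, 0, 0, 0, 9, 9, 9, 9]
right = [0, 0, 9, 9, 9, 9, 9, 9]
disps = row_disparity(left, right)
```

Textureless regions produce ambiguous matches even in this toy example, which is why claim 46 requires calibration and binocular rectification before matching.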
46. The apparatus of claim 45, wherein the processor is configured to perform binocular matching on the first image and the second image, and prior to obtaining the disparity between the first image and the second image, to:
calibrating the shooting device to obtain internal reference data of the shooting device;
performing binocular correction processing on the first image and the second image according to the internal reference data.
47. An unmanned aerial vehicle, comprising:
a body;
a camera mounted on the body; and
a processor communicatively coupled to the camera, the processor configured to:
acquiring a first image shot by the shooting device at a first position aiming at a target;
controlling the shooting device to move from the first position to a second position according to the position information of the first position;
acquiring a second image shot by the shooting device at the second position aiming at the target;
determining depth information of the target according to the distance between the first position and the second position, the first image and the second image;
wherein the first position and the second position are at the same height, and a line connecting the first position and the second position is parallel to the shooting plane of the shooting device when shooting the target at the first position.
48. A drone as claimed in claim 47, wherein the first position is a user preset position; alternatively,
the first position is a current position of the drone.
49. A drone as claimed in claim 47, wherein the location information for the first location includes:
geographic location information based on GPS detection; and
altitude information based on detection by a vision module or a barometer.
50. The unmanned aerial vehicle of claim 49, wherein the camera is mounted on the unmanned aerial vehicle via a cradle head, and the GPS is provided on the unmanned aerial vehicle, the cradle head, or the camera; and/or,
the vision module or the barometer is arranged on the unmanned aerial vehicle, the cradle head, or the shooting device.
51. A drone according to claim 47, wherein the distance between the first and second positions is less than or equal to a preset distance threshold.
52. A drone as claimed in claim 51, wherein the second position is a user preset position; alternatively,
the second position is determined according to the position information of the first position and the preset distance threshold.
53. A drone as claimed in claim 52, wherein the processor is further configured to, after controlling the camera to move from the first position to the second position based on the position information of the first position:
acquiring actual position information of the shooting device at the second position;
determining a horizontal distance between the first position and the second position according to the position information of the first position and the actual position information of the second position;
when the horizontal distance is larger than the preset distance threshold, controlling the shooting device to translate towards the first position, so that the horizontal distance from the shooting device to the first position is smaller than or equal to the preset distance threshold.
54. A drone as claimed in claim 52, wherein the processor is further configured to, after controlling the camera to move from the first position to the second position based on the position information of the first position:
acquiring actual position information of the shooting device at the second position;
determining a height difference between the first position and the second position according to the position information of the first position and the actual position information of the second position;
when the height difference is not equal to zero, adjusting the height of the shooting device so that the height of the shooting device is equal to that of the first position.
55. A drone as claimed in claim 47 or 48, wherein the second location is to the left or right of the first location.
56. A drone as claimed in claim 47, wherein the processor is specifically configured to:
according to the position information of the first position, the unmanned aerial vehicle is controlled to move, so that the shooting device moves from the first position to the second position.
57. A drone as claimed in claim 56, wherein the processor is specifically configured to:
acquiring the current movement speed of the unmanned aerial vehicle;
determining a flight direction and a flight duration according to the current movement speed and the position information of the first position and the second position;
controlling the unmanned aerial vehicle to translate according to the current movement speed according to the flight direction and the flight duration;
alternatively,
acquiring the current movement speed of the unmanned aerial vehicle;
determining flight time according to the current movement speed and a preset distance threshold;
controlling the unmanned aerial vehicle to translate towards the left or the right relative to the first position according to the flight time length and the current movement speed;
alternatively,
determining a flight direction and a flight duration according to a preset speed and the position information of the first position and the second position;
controlling the unmanned aerial vehicle to translate according to the preset speed according to the flight direction and the flight duration;
alternatively,
determining flight time according to a preset speed and a preset distance threshold;
controlling the unmanned aerial vehicle to translate towards the left or the right relative to the first position according to the preset speed according to the flight time;
alternatively,
controlling the unmanned aerial vehicle to translate relative to the first position according to a preset speed and a preset duration.
58. The drone of claim 47, wherein the camera is mounted on the drone by a cradle head;
the processor is specifically configured to:
when the unmanned aerial vehicle is in a static state, the cradle head is controlled to move according to the position information of the first position, so that the shooting device moves from the first position to the second position.
59. A drone according to claim 58, wherein the cradle head is carried on the drone by a power device movable in the yaw direction;
the processor is specifically configured to:
controlling the power device to move so as to control the cradle head to translate.
60. A drone as claimed in claim 59, wherein the processor is specifically configured to:
determining the movement direction and the movement duration of the power device according to the preset speed and the position information of the first position and the second position;
controlling the power device to translate according to the preset speed according to the movement direction and the movement duration;
alternatively,
determining the movement time length of the power device according to a preset speed and a preset distance threshold;
controlling the power device to translate towards the left or the right relative to the first position according to the preset speed according to the movement duration;
alternatively,
controlling the power device to translate relative to the first position according to a preset speed and a preset time length.
61. A drone according to claim 56 or 58, characterised in that the drone is provided with a sensing unit; the processor is further configured to:
acquiring distance information from the target to the unmanned aerial vehicle based on the sensing unit;
adjusting the depth information of the target according to the distance information.
62. A drone of claim 61, wherein the processor, prior to adjusting the depth information of the target based on the distance information, is further configured to:
determining that the difference between the distance information and the depth information of the target is smaller than or equal to a preset difference threshold.
63. A drone according to claim 61, wherein the processor, after obtaining the distance information of the target to the drone based on the sensing unit, is further configured to:
when the difference between the distance information and the depth information of the target is determined to be greater than a preset difference threshold, determining the depth information of the target to be invalid information.
64. A drone as claimed in claim 61, wherein the processor is specifically configured to:
carrying out fusion processing on the distance information and the depth information to determine final depth information of the target.
65. A drone according to claim 61, wherein the sensing unit includes a laser ranging sensor.
66. A drone as claimed in claim 47, wherein the processor is specifically configured to:
acquiring the focal length of the shooting device;
determining a disparity between the first image and the second image from the first image and the second image;
determining the depth information of the target according to the distance between the first position and the second position, the focal length of the shooting device, and the parallax.
67. A drone as claimed in claim 66, wherein the processor is specifically configured to:
calibrating the shooting device and determining the focal length of the shooting device.
68. A drone as claimed in claim 66, wherein the processor is specifically configured to:
carrying out binocular matching on the first image and the second image to obtain the parallax between the first image and the second image.
69. A drone as claimed in claim 68, wherein the processor, prior to binocular matching the first and second images, is configured to:
calibrating the shooting device to obtain internal reference data of the shooting device;
performing binocular correction processing on the first image and the second image according to the internal reference data.
CN201880039803.4A 2018-09-28 2018-09-28 Unmanned aerial vehicle-based distance measurement method and device and unmanned aerial vehicle Pending CN110799801A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/108335 WO2020062024A1 (en) 2018-09-28 2018-09-28 Distance measurement method and device based on unmanned aerial vehicle and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN110799801A true CN110799801A (en) 2020-02-14

Family

ID=69426628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880039803.4A Pending CN110799801A (en) 2018-09-28 2018-09-28 Unmanned aerial vehicle-based distance measurement method and device and unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN110799801A (en)
WO (1) WO2020062024A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111309050A (en) * 2020-03-04 2020-06-19 桂林航天工业学院 Unmanned aerial vehicle target identification and positioning method
WO2021258251A1 (en) * 2020-06-22 2021-12-30 深圳市大疆创新科技有限公司 Surveying and mapping method for movable platform, and movable platform and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180004232A1 (en) * 2015-07-08 2018-01-04 SZ DJI Technology Co., Ltd Camera configuration on movable objects
CN107687841A (en) * 2017-09-27 2018-02-13 中科创达软件股份有限公司 A kind of distance-finding method and device
CN107729878A (en) * 2017-11-14 2018-02-23 智车优行科技(北京)有限公司 Obstacle detection method and device, equipment, vehicle, program and storage medium
CN108037768A (en) * 2017-12-13 2018-05-15 常州工学院 Unmanned plane obstruction-avoiding control system, avoidance obstacle method and unmanned plane

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105376484A (en) * 2015-11-04 2016-03-02 深圳市金立通信设备有限公司 Image processing method and terminal
CN106291521A (en) * 2016-07-29 2017-01-04 广东欧珀移动通信有限公司 Distance-finding method, device and the mobile terminal moved based on MEMS
JP6463319B2 (en) * 2016-10-19 2019-01-30 株式会社Subaru Stereo distance measuring device, stereo distance measuring method, and stereo distance measuring program
WO2018214077A1 (en) * 2017-05-24 2018-11-29 深圳市大疆创新科技有限公司 Photographing method and apparatus, and image processing method and apparatus
CN207585606U (en) * 2017-09-29 2018-07-06 山东茁恩航空技术发展有限公司 A kind of unmanned plane aeroplane photography ranging and drafting system
CN107967701B (en) * 2017-12-18 2021-10-15 信利光电股份有限公司 Calibration method, device and equipment of depth camera equipment


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111309050A (en) * 2020-03-04 2020-06-19 桂林航天工业学院 Unmanned aerial vehicle target identification and positioning method
CN111309050B (en) * 2020-03-04 2023-07-04 桂林航天工业学院 Unmanned aerial vehicle target identification positioning method
WO2021258251A1 (en) * 2020-06-22 2021-12-30 深圳市大疆创新科技有限公司 Surveying and mapping method for movable platform, and movable platform and storage medium

Also Published As

Publication number Publication date
WO2020062024A1 (en) 2020-04-02

Similar Documents

Publication Publication Date Title
US11039086B2 (en) Dual lens system having a light splitter
US10771699B2 (en) Systems and methods for rolling shutter correction
CN108351574B (en) System, method and apparatus for setting camera parameters
CN109219785B (en) Multi-sensor calibration method and system
US20190096069A1 (en) Systems and methods for visual target tracking
CN105959625B (en) Method and device for controlling unmanned aerial vehicle to track and shoot
US11353891B2 (en) Target tracking method and apparatus
US20200191556A1 (en) Distance mesurement method by an unmanned aerial vehicle (uav) and uav
CN110366670B (en) Three-dimensional shape estimation method, flight vehicle, mobile platform, program, and recording medium
CN108886572B (en) Method and system for adjusting image focus
WO2018178756A1 (en) System and method for providing autonomous photography and videography
JP6765512B2 (en) Flight path generation method, information processing device, flight path generation system, program and recording medium
US11105622B2 (en) Dual barometer systems for improved altitude estimation
CN109983468A (en) Use the method and system of characteristic point detection and tracking object
CN105045293A (en) Cradle head control method, external carrier control method and cradle head
CN109076206B (en) Three-dimensional imaging method and device based on unmanned aerial vehicle
WO2020048365A1 (en) Flight control method and device for aircraft, and terminal device and flight control system
CN111213107B (en) Information processing device, imaging control method, program, and recording medium
CN110799801A (en) Unmanned aerial vehicle-based distance measurement method and device and unmanned aerial vehicle
WO2020019175A1 (en) Image processing method and apparatus, and photographing device and unmanned aerial vehicle
US20200217665A1 (en) Mobile platform, image capture path generation method, program, and recording medium
WO2021056503A1 (en) Positioning method and apparatus for movable platform, movable platform, and storage medium
WO2022109860A1 (en) Target object tracking method and gimbal
WO2021217371A1 (en) Control method and apparatus for movable platform
CN113824885B (en) System, method and related unmanned aerial vehicle for optical path adjustment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200214