CN110645960A - Distance measurement method, terrain following distance measurement method, obstacle avoidance distance measurement method and device - Google Patents
- Publication number: CN110645960A
- Application number: CN201810665952.3A
- Authority
- CN
- China
- Prior art keywords
- area
- aircraft
- dimensional
- detected
- point cloud
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G01C11/02 — Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
- G01C11/16 — Interpretation of pictures by comparison of two or more pictures of the same area, the pictures being supported in the same relative position as when they were taken, with optical projection in a common plane
- G01C11/28 — Interpretation of pictures by comparison of two or more pictures of the same area; special adaptation for recording picture point data, e.g. for profiles
Abstract
The embodiments of the invention provide a distance measurement method, a distance measurement method applied to terrain following, a distance measurement method applied to obstacle avoidance, and a corresponding device. The distance measurement method comprises the following steps: selecting, according to appearance information of an aircraft, a visible area from a three-dimensional point cloud constructed based on an area to be measured, wherein the area to be measured is the approach area of the aircraft, the three-dimensional point cloud is a set of a plurality of three-dimensional coordinate points in a three-dimensional space constructed based on the area to be measured, each of the three-dimensional coordinate points corresponds one-to-one to a sub-area of the area to be measured, and the coordinates of each three-dimensional coordinate point indicate the distance between the corresponding sub-area and the aircraft; and determining, according to the visible area, the minimum distance between the area to be measured and the aircraft, wherein the minimum distance is used to determine the flight trajectory of the aircraft.
Description
Technical Field
The embodiments of the invention relate to the technical field of image processing, and in particular to a distance measurement method, system, apparatus, medium and computing device applied to terrain following, and a distance measurement method, system, apparatus, medium and computing device applied to obstacle avoidance.
Background
This section is intended to provide a background or context to the embodiments of the invention that are recited in the claims. The description herein is not admitted to be prior art by inclusion in this section.
Unmanned aerial vehicles, such as power-line inspection drones and agricultural plant-protection drones, are widely used in fields such as agriculture, energy, the military and geographic surveying and mapping. In application scenarios such as topographic mapping, power-line inspection, oil and gas pipeline inspection and agricultural planting, the unmanned aerial vehicle usually needs to maintain a low flying height to complete its task. However, because the terrain undulates beneath it, the unmanned aerial vehicle needs real-time distance measurement to the ground and to obstacles in order to avoid a crash caused by flying too low. At present, the existing distance measurement schemes mainly include the following:
according to the technical scheme, the distance measurement of the ground and the obstacle is carried out by using laser distance measurement equipment. The laser ranging device is limited by the limitation of the unmanned aerial vehicle on the volume and the weight, the mechanical rotation auxiliary device of the laser ranging device cannot be additionally arranged on the unmanned aerial vehicle, the laser ranging device can only measure the distance between the aircraft and an obstacle or a certain point on the ground under the condition that the mechanical rotation auxiliary device is not arranged, and most of the projection of the aircraft is massive, so that a measurement blind spot exists in the scheme, and the accurate ranging result cannot be obtained. In addition, the distance measurement algorithms adopted by the laser distance measurement equipment are generally a TOF (time of flight) method and a triangulation method, the measurement distance of the triangulation method is generally within 20 meters, the measurement distance of the TOF can reach hundreds of meters, but both distance measurement algorithms are limited by the surface reflectivity of the measured object, and the situation of measurement failure may occur when the reflectivity of the object is low.
In the second technical scheme, based on depth information acquired by a depth sensor, a VSLAM (visual simultaneous localization and mapping) algorithm is used to plan the flight path of the aircraft, so that the aircraft avoids crashing while keeping a low flight height. Although this scheme can reconstruct the environment of the aircraft and plan a flight path for it, the computing resources and energy it requires are enormous and beyond what an ordinary aircraft can carry. Moreover, it is difficult for this scheme to meet the real-time requirement of aircraft ranging.
In the third technical scheme, terrain data of the operation site is imported in advance, and the relative distance between the aircraft and the ground or obstacles is obtained in combination with GPS positioning. The terrain data must be elevation grid data; although mainstream map software can provide elevation grid data for some areas, coverage is incomplete and grid precision is low, so the data is of limited use and further high-precision terrain surveying is needed, which makes this scheme difficult and costly to implement.
In conclusion, none of the existing distance measurement schemes can achieve good real-time distance measurement from the aircraft to the ground and obstacles.
Disclosure of Invention
The embodiments of the invention provide a distance measurement method, apparatus, medium and computing device, to solve the problem that existing distance measurement schemes cannot achieve good real-time distance measurement from an aircraft to the ground and obstacles.
In a first aspect, the present invention provides a ranging method, including: selecting, according to appearance information of an aircraft, a visible area from a three-dimensional point cloud constructed based on an area to be measured, wherein the area to be measured is the approach area of the aircraft, the three-dimensional point cloud is a set of a plurality of three-dimensional coordinate points in a three-dimensional space constructed based on the area to be measured, each of the three-dimensional coordinate points corresponds one-to-one to a sub-area of the area to be measured, and the coordinates of each three-dimensional coordinate point indicate the distance between the corresponding sub-area and the aircraft; and determining, according to the visible area, the minimum distance between the area to be measured and the aircraft, wherein the minimum distance is used to determine the flight trajectory of the aircraft.
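As a rough illustration of this first aspect, the following sketch selects a visible area from a point cloud and takes the smallest indicated distance. It is a minimal Python example under stated assumptions, not the patented implementation: the (N, 3) array layout, the use of the z coordinate as the indicated distance, and the centered-window selection rule are all hypothetical.

```python
import numpy as np

def select_visible_area(cloud, ratio):
    """Keep the central fraction `ratio` of the cloud's lateral extent.

    cloud: (N, 3) array of (x, y, z) points; z indicates the distance
    between the corresponding sub-area and the aircraft (assumed layout).
    """
    x, y = cloud[:, 0], cloud[:, 1]
    cx, cy = (x.max() + x.min()) / 2, (y.max() + y.min()) / 2
    hx = (x.max() - x.min()) * ratio / 2
    hy = (y.max() - y.min()) * ratio / 2
    mask = (np.abs(x - cx) <= hx) & (np.abs(y - cy) <= hy)
    return cloud[mask]

def minimum_distance(visible):
    """Smallest distance indicated by any point of the visible area."""
    return float(visible[:, 2].min())
```

On a 5x5 grid of points 10 m away with one closer point at the centre, `select_visible_area(cloud, 0.5)` keeps only the central window, and `minimum_distance` returns that point's distance.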
In one possible implementation, determining the minimum distance between the area to be measured and the aircraft according to the visible area includes: projecting the three-dimensional point cloud in the visible area to a two-dimensional plane to obtain a two-dimensional point cloud corresponding to the visible area, wherein the three-dimensional point cloud in the visible area is a subset of the three-dimensional point cloud, a plurality of two-dimensional coordinate points in the two-dimensional point cloud correspond to a plurality of three-dimensional coordinate points in the three-dimensional point cloud in the visible area, and the coordinate of each two-dimensional coordinate point in the plurality of two-dimensional coordinate points is used for indicating the distance between each corresponding sub-area and the aircraft; and determining the minimum distance between the area to be measured and the aircraft according to the two-dimensional point cloud of the visible area.
In one possible implementation, determining the minimum distance between the area to be measured and the aircraft according to the two-dimensional point cloud of the visible area includes: performing discrete-point removal on the two-dimensional point cloud of the visible area; and selecting, from the processed two-dimensional point cloud of the visible area, the two-dimensional coordinate point with the smallest indicated distance, and taking the distance indicated by that point as the minimum distance between the area to be measured and the aircraft.
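One simple way to realize the discrete-point removal described above is a robust outlier test before taking the minimum. The median-absolute-deviation threshold below is an illustrative assumption; the patent does not specify a particular outlier test.

```python
import numpy as np

def robust_min_distance(points_2d, k=3.0):
    """points_2d: (N, 2) array whose second column indicates distance.

    Discard 'discrete' points whose distance deviates from the median
    by more than k scaled median absolute deviations, then return the
    smallest remaining distance.
    """
    d = points_2d[:, 1]
    med = np.median(d)
    mad = 1.4826 * np.median(np.abs(d - med))  # scaled MAD
    keep = np.abs(d - med) <= k * max(mad, 1e-9)
    return float(d[keep].min())
```

A spurious near-zero reading (e.g. a stereo-matching artifact) is rejected, so the returned minimum reflects the real closest terrain point rather than noise.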
In one possible implementation, the profile information of the aircraft includes a volume of the aircraft.
In a possible implementation manner, before selecting the visible area from the three-dimensional point cloud constructed based on the area to be measured according to the appearance information of the aircraft, the method further includes: setting a proportional relation between the visible area and the three-dimensional point cloud constructed based on the area to be measured, according to the volume of the aircraft and the preset flight height of the aircraft. Selecting the visible area from the three-dimensional point cloud according to the appearance information of the aircraft then includes: selecting the visible area from the three-dimensional point cloud constructed based on the area to be measured according to the proportional relation.
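A hypothetical sizing rule for such a proportional relation is sketched below. The half field-of-view parameter and the footprint-coverage criterion are illustrative assumptions, not taken from the patent; the point is only that the ratio grows with the aircraft's size and shrinks with the preset flight height.

```python
import math

def visible_area_ratio(aircraft_radius_m, flight_height_m, half_fov_deg=30.0):
    """Fraction of the point cloud's lateral extent to keep, so the
    visible area just covers the aircraft's footprint at the preset
    flight height, assuming a pinhole camera with the given half FOV."""
    ground_half_width = flight_height_m * math.tan(math.radians(half_fov_deg))
    return min(1.0, aircraft_radius_m / ground_half_width)
```

For a 1 m aircraft radius at 10 m preset height, roughly a sixth of the lateral extent would be kept; a large aircraft very close to the ground would need the whole cloud.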
In one possible implementation manner, selecting a visible area from the three-dimensional point cloud constructed based on the area to be measured according to the appearance information of the aircraft includes: dynamically selecting the visible area from the three-dimensional point cloud according to the volume of the aircraft and the current flight height of the aircraft.
In a possible implementation manner, before selecting a visible area according to the appearance information of the aircraft in the three-dimensional point cloud constructed based on the area to be measured, the method further includes: acquiring a depth image of a region to be detected; constructing a three-dimensional point cloud of the region to be detected based on the depth image of the region to be detected; the coordinates of each pixel point in the depth image are used for indicating the distance between each terrain area in the area to be measured corresponding to each pixel point and the aircraft, and the pixel points in the depth image correspond to the three-dimensional coordinate points in the three-dimensional point cloud in a one-to-one mode.
In a possible implementation manner, before constructing the three-dimensional point cloud of the area to be measured based on the depth image of the area to be measured, the method further includes: acquiring attitude information of the aircraft. Constructing the three-dimensional point cloud of the area to be measured based on the depth image then includes: projecting the depth image of the area to be measured into three-dimensional space to form an initial three-dimensional point cloud; judging whether the attitude information of the aircraft contains non-zero attitude angles; and if so, rotating the coordinates of each three-dimensional coordinate point in the initial three-dimensional point cloud in reverse by the same angles, to obtain the three-dimensional point cloud of the area to be measured.
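The reverse rotation described in this step can be sketched as follows. The axis conventions and the roll-then-pitch composition order are assumptions for illustration; the actual frame definitions depend on how the aircraft and camera are mounted.

```python
import numpy as np

def level_cloud(points, roll, pitch):
    """Rotate each point back by the aircraft's roll and pitch angles
    (radians) so the point cloud is expressed in a level reference frame."""
    cr, sr = np.cos(-roll), np.sin(-roll)
    cp, sp = np.cos(-pitch), np.sin(-pitch)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    return points @ (ry @ rx).T
```

Applying the forward roll rotation to a point and then calling `level_cloud` with the same angle recovers the original coordinates, which is exactly the "reverse rotation by the same angle" the text describes.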
In one possible implementation, acquiring a depth image of a region to be measured includes: acquiring at least two frames of image information of a region to be detected through binocular camera equipment; and performing parallax calculation on the at least two frames of image information to obtain a depth image of the region to be measured.
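The parallax (disparity) calculation mentioned here ultimately relies on the standard stereo relation depth = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity. A minimal conversion sketch (the focal length and baseline values in the example are placeholders):

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a disparity map to a depth image using depth = f * B / d.

    Pixels with non-positive disparity (no stereo match) become inf.
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0
    depth[valid] = focal_px * baseline_m / disparity_px[valid]
    return depth
```

With a 700 px focal length and a 0.1 m baseline, a 50 px disparity corresponds to a depth of 1.4 m; unmatched pixels stay at infinity and are naturally ignored when taking a minimum distance.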
In a possible implementation manner, before acquiring at least two frames of image information of a region to be detected by using a binocular camera device, the method further includes: and carrying out frame synchronization on the binocular camera equipment.
In a possible implementation manner, after determining the minimum distance between the area to be measured and the aircraft according to the visible area, the method further includes: judging whether the minimum distance deviates from a preset range; if the minimum distance does not deviate from the preset range, indicating the aircraft to keep the flying height of the aircraft unchanged; or if the minimum distance deviates from the preset range, the aircraft is instructed to adjust the flight height of the aircraft.
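The keep-or-adjust rule above can be written compactly; the target clearance and tolerance are illustrative parameters standing in for the "preset range" of the text.

```python
def altitude_command(min_distance_m, target_m, tolerance_m):
    """Terrain-following rule from the text: hold height while the
    measured minimum clearance stays within the preset range, else
    climb or descend to restore it."""
    if abs(min_distance_m - target_m) <= tolerance_m:
        return "hold"
    return "ascend" if min_distance_m < target_m else "descend"
```

For a 5 m target with 0.5 m tolerance, a 3 m clearance yields "ascend" and an 8 m clearance yields "descend"; how the flight controller realizes these commands is outside the scope of this sketch.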
In a second aspect, the present invention further provides a ranging method applied to terrain following, including: selecting a visible area from three-dimensional point clouds constructed on the basis of the terrain to be detected according to the appearance information of the aircraft, wherein the terrain to be detected is the terrain of an aircraft approach area, the three-dimensional point clouds constructed on the basis of the terrain to be detected is a set of a plurality of three-dimensional coordinate points in a three-dimensional space constructed on the basis of the terrain to be detected, each three-dimensional coordinate point in the three-dimensional coordinate points corresponds to each sub-area in the terrain to be detected one by one, and the coordinate of each three-dimensional coordinate point is used for indicating the distance between each corresponding sub-area and the aircraft; determining the minimum distance between the terrain to be measured and the aircraft according to the visible area; and determining the flight track of the aircraft in terrain following according to the minimum distance.
In one possible implementation, determining a flight trajectory of the aircraft for terrain following according to the minimum distance includes: judging whether the minimum distance deviates from a preset range; if the minimum distance does not deviate from the preset range, indicating the aircraft to keep the flying height of the aircraft unchanged; or if the minimum distance deviates from the preset range, the aircraft is instructed to adjust the flight height of the aircraft.
In one possible implementation, the method of any one of the first aspect is performed.
In a third aspect, the present invention further provides a distance measurement method applied to obstacle avoidance, including: selecting, according to appearance information of an aircraft, a visible area from a three-dimensional point cloud constructed based on an area to be measured, wherein the area to be measured is the approach area of the aircraft, the three-dimensional point cloud is a set of a plurality of three-dimensional coordinate points in a three-dimensional space constructed based on the area to be measured, each of the three-dimensional coordinate points corresponds one-to-one to a sub-area of the area to be measured, and the coordinates of each three-dimensional coordinate point indicate the distance between the corresponding sub-area and the aircraft; determining, according to the visible area, the minimum distance between the area to be measured and the aircraft, wherein the area to be measured includes the obstacles within it; and determining, according to the minimum distance, the flight trajectory of the aircraft during obstacle avoidance.
In a possible implementation manner, determining a flight trajectory of the aircraft in obstacle avoidance according to the minimum distance includes: judging whether the minimum distance deviates from a preset range; if the minimum distance is not smaller than the preset range, indicating the aircraft to keep the flight direction of the aircraft unchanged; or if the minimum distance is smaller than the preset range, the aircraft is instructed to adjust the flight direction of the aircraft.
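The obstacle-avoidance counterpart of the altitude rule compares the forward clearance to a safe-distance threshold. In this sketch the "adjust" action is left abstract; which new direction is chosen depends on the flight controller and is not specified here.

```python
def heading_command(min_distance_m, safe_distance_m):
    """Hold the current flight direction while the minimum distance to
    the area to be measured is at least the preset safe distance;
    otherwise ask the flight controller to change direction."""
    return "hold" if min_distance_m >= safe_distance_m else "adjust"
```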
In one possible implementation, the method of any one of the first aspect is performed.
In a fourth aspect, the present invention further provides a distance measuring apparatus, including a binocular camera device, and
the device comprises a selecting unit, a judging unit and a judging unit, wherein the selecting unit is used for selecting a visible area from three-dimensional point clouds constructed on the basis of an area to be detected according to the appearance information of an aircraft, the area to be detected is an aircraft approach area, the three-dimensional point clouds constructed on the basis of the area to be detected is a set of a plurality of three-dimensional coordinate points in a three-dimensional space constructed on the basis of the area to be detected, each three-dimensional coordinate point in the three-dimensional coordinate points corresponds to each sub-area in the area to be detected one by one, and the coordinate of each three-dimensional coordinate point;
and the determining unit is used for determining the minimum distance between the area to be measured and the aircraft according to the visible area, and the minimum distance is used for determining the flight track of the aircraft.
In a possible implementation manner, the determining unit is specifically configured to: projecting the three-dimensional point cloud in the visible area to a two-dimensional plane to obtain a two-dimensional point cloud corresponding to the visible area, wherein the three-dimensional point cloud in the visible area is a subset of the three-dimensional point cloud, a plurality of two-dimensional coordinate points in the two-dimensional point cloud correspond to a plurality of three-dimensional coordinate points in the three-dimensional point cloud in the visible area, and the coordinate of each two-dimensional coordinate point in the plurality of two-dimensional coordinate points is used for indicating the distance between each corresponding sub-area and the aircraft; and determining the minimum distance between the area to be measured and the aircraft according to the two-dimensional point cloud of the visible area.
In a possible implementation manner, the determining unit, when determining the minimum distance between the area to be measured and the aircraft according to the two-dimensional point cloud of the visible area, is specifically configured to: perform discrete-point removal on the two-dimensional point cloud of the visible area; and select, from the processed two-dimensional point cloud of the visible area, the two-dimensional coordinate point with the smallest indicated distance, and take the distance indicated by that point as the minimum distance between the area to be measured and the aircraft.
In one possible implementation, the profile information of the aircraft includes a volume of the aircraft.
Accordingly, in one possible implementation, the selecting unit is further configured to: before the visible area is selected from the three-dimensional point cloud constructed based on the area to be measured according to the appearance information of the aircraft, set the proportional relation between the visible area and the three-dimensional point cloud according to the volume of the aircraft and the preset flight height of the aircraft. The selecting unit is then specifically configured to: select the visible area from the three-dimensional point cloud constructed based on the area to be measured according to the proportional relation.
Correspondingly, in a possible implementation manner, the selecting unit is specifically configured to: dynamically select the visible area from the three-dimensional point cloud constructed based on the area to be measured according to the volume of the aircraft and the flight height of the aircraft.
In a possible implementation manner, the apparatus further includes a construction unit configured to: before the selecting unit selects the visible area from the three-dimensional point cloud constructed based on the area to be measured according to the appearance information of the aircraft, acquire a depth image of the area to be measured; and construct the three-dimensional point cloud of the area to be measured based on the depth image, wherein the coordinates of each pixel in the depth image indicate the distance between the corresponding terrain area in the area to be measured and the aircraft, and the pixels of the depth image correspond one-to-one to the three-dimensional coordinate points of the three-dimensional point cloud.
In a possible implementation, the construction unit is further configured to: before the three-dimensional point cloud of the region to be detected is constructed based on the depth image of the region to be detected, attitude information of the aircraft is obtained. When the building unit builds the three-dimensional point cloud of the region to be measured based on the depth image of the region to be measured, the building unit is specifically configured to: projecting the depth image of the region to be measured into a three-dimensional space to form an initial three-dimensional point cloud; judging whether included angles exist in the attitude information of the aircraft or not; and if the attitude information has an included angle, reversely rotating the coordinates of each three-dimensional coordinate point in the initial three-dimensional point cloud by the angle same as the included angle to obtain the three-dimensional point cloud of the area to be measured.
In a possible implementation manner, when the construction unit obtains the depth image of the region to be measured, the construction unit is specifically configured to: acquiring at least two frames of image information of a region to be detected through binocular camera equipment; and performing parallax calculation on the at least two frames of image information to obtain a depth image of the region to be measured.
In a possible implementation, the construction unit is further configured to: before at least two frames of image information of the area to be detected are acquired through the binocular camera equipment, frame synchronization is carried out on the binocular camera equipment.
In a possible implementation manner, the system further comprises an adjusting unit, configured to determine whether the minimum distance deviates from a preset range after the determining unit determines the minimum distance between the area to be measured and the aircraft according to the visible area; if the minimum distance does not deviate from the preset range, indicating the aircraft to keep the flying height of the aircraft unchanged; or if the minimum distance deviates from the preset range, the aircraft is instructed to adjust the flight height of the aircraft.
In a fifth aspect, the present invention further provides a terrain following sensor, including a binocular camera device, a data interface, and a processor, where the processor is configured to execute the method in any one of the first aspect, and the binocular camera device is configured to acquire image information of a terrain to be detected.
In a sixth aspect, the present invention further provides a terrain following apparatus, which includes a binocular camera device, a processor, a memory, and a transceiver;
the binocular camera equipment is used for acquiring image information of a terrain to be measured;
a memory for storing a program executed by the processor;
a processor for performing the method of any one of the first aspect in accordance with the program stored in the memory and the image information of the terrain to be measured;
a transceiver for receiving or transmitting data under the control of the processor.
In a seventh aspect, the present invention further provides an obstacle avoidance sensor, including a binocular camera device, a data interface, and a processor, where the processor is configured to execute the method in any one of the first aspect, the binocular camera device is configured to acquire image information of a region to be detected, and the region to be detected includes an obstacle of the region to be detected.
In an eighth aspect, the invention further provides an obstacle avoidance device, which includes binocular camera equipment, a processor, a memory and a transceiver;
the binocular camera equipment is used for acquiring image information of a to-be-detected area, and the to-be-detected area comprises obstacles of the to-be-detected area;
a memory for storing a program executed by the processor;
a processor for performing the method of any one of the first aspect in accordance with the program stored in the memory and the image information of the area to be detected;
a transceiver for receiving or transmitting data under the control of the processor.
In a ninth aspect, the invention also proposes a system comprising an aircraft and at least one terrain following device, the at least one terrain following device being mounted on the outside of the aircraft and used to perform the method of any one of the first aspect.
In a tenth aspect, the present invention further provides a system comprising an aircraft and at least one obstacle avoidance device, the at least one obstacle avoidance device being mounted on the outside of the aircraft and used to perform the method of any one of the first aspect.
In an eleventh aspect, the present invention also proposes a medium storing computer-executable instructions for causing a computer to perform the method of any one of the first aspects.
In a twelfth aspect, the present invention also provides a computing device having stored thereon executable instructions for causing the computing device to perform the method of any of the first aspects.
According to the technical scheme provided by the invention, a visible area can be selected from the three-dimensional point cloud constructed based on the area to be measured, so that the minimum distance between the area to be measured and the aircraft can be determined from the visible area alone. This avoids the high computing-resource demand and energy consumption of calculating the distance between the aircraft and every point in the whole area to be measured, greatly reduces the amount of calculation in the ranging process, and lowers the difficulty and cost of implementation.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present invention will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
fig. 1 schematically shows a flow chart of a ranging method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating a process for constructing a three-dimensional point cloud according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of another process for constructing a three-dimensional point cloud according to an embodiment of the invention;
FIG. 4 is a schematic diagram illustrating a ranging scenario according to an embodiment of the present invention;
fig. 5 is a schematic flow chart illustrating a distance measuring method applied to terrain following according to an embodiment of the present invention;
fig. 6 is a schematic flow chart illustrating a distance measurement method applied to obstacle avoidance according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a distance measuring device according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a distance measuring device based on a binocular camera according to an embodiment of the invention;
FIG. 9 is a schematic diagram of a medium according to an embodiment of the present invention;
fig. 10 schematically shows a structural diagram of a computing device according to an embodiment of the present invention.
In the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Detailed Description
The principles and spirit of the present invention will be described with reference to a number of exemplary embodiments. It is understood that these embodiments are given solely for the purpose of enabling those skilled in the art to better understand and to practice the invention, and are not intended to limit the scope of the invention in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As will be appreciated by one of skill in the art, embodiments of the present invention may be embodied as a method, system, apparatus, device, or computer program product for ranging. Accordingly, the present disclosure may be embodied in the form of: entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software.
According to the embodiments of the invention, a ranging method, system, apparatus, device, sensor and computer program product are provided, and the ranging method can be applied to terrain following and obstacle avoidance.
In this document, it is to be understood that any number of elements in the figures are provided by way of illustration and not limitation, and any nomenclature is used for differentiation only and not in any limiting sense.
The principles and spirit of the present invention are explained in detail below with reference to several representative embodiments of the invention.
Application scene overview
The embodiment of the invention can be applied to ranging scenes in various fields, in particular to a ranging scene based on terrain following and a ranging scene based on obstacle avoidance. Several scenarios will be exemplified below:
Take an industrial unmanned aerial vehicle (UAV) in a power line inspection scene as an example. The flight distance when performing a power line inspection task is often long, which means the line crosses terrain with large undulations. For a UAV flying at ultra-low altitude, the ranging method provided by the embodiment of the invention can be used to determine the distance between the UAV and the ground, so that the UAV can hold that distance and complete the task safely.
Take an agricultural plant protection drone in a farming scene as an example. The distance between the drone and the ground during spraying is usually only 2 to 5 meters, and when facing sloping farmland the drone must be able to sense its relative height above the ground in order to hold that spraying height; the ranging method provided by the embodiment of the invention gives it this sensing capability.
Take a photographing UAV that tracks and shoots a target as an example. When shooting in the field over complex terrain such as hills and canyons, the UAV must have keen environment-sensing capability, including sensing its relative height above the ground and the distance to surrounding obstacles. In this scene, the ranging method provided by the embodiment of the invention can determine both, so that collisions and crashes are avoided.
Take a manned aircraft as an example. When flying at a low height, the ranging method provided by the embodiment of the invention can determine the relative height between the aircraft and the ground and the distance to surrounding obstacles, ensuring the safety of the pilot and the aircraft and reducing the difficulty of controlling it.
It should be noted that the above application scenarios are merely illustrated for the convenience of understanding the spirit and principles of the present invention, and the embodiments of the present invention are not limited in this respect. Rather, embodiments of the present invention may be applied to any ranging scenario.
Exemplary method
A ranging method according to an exemplary embodiment of the present invention is described below with reference to fig. 1 in conjunction with the application scenarios above. The embodiments of the present invention are not limited to those scenarios and may be applied to any scenario where a ranging method is applicable.
The invention provides a distance measuring method, which comprises the following steps:
S101, selecting a visible area from the three-dimensional point cloud constructed based on the area to be detected according to the appearance information of the aircraft.
S102, determining the minimum distance between the area to be measured and the aircraft according to the visible area, wherein the minimum distance is used for determining the flight track of the aircraft.
In the embodiment of the invention, the area to be detected is the area the aircraft is approaching. The three-dimensional point cloud constructed based on the region to be measured is a set of three-dimensional coordinate points in a three-dimensional space, the coordinates of each point being the position of that point in actual space. Each three-dimensional coordinate point corresponds one-to-one to a sub-area of the area to be measured, and its coordinates indicate the distance between that sub-area and the aircraft.
By the method, a visual area can be selected from the three-dimensional point cloud constructed based on the area to be measured, and the minimum distance between the area to be measured and the aircraft can then be determined from that visual area alone. This avoids the high demand on computing resources and the high energy consumption of computing the distance between the aircraft and every point in the whole area, greatly reduces the amount of calculation in the ranging process, and lowers the difficulty and cost of implementation.
The following will explain the ranging method provided by the embodiment of the present invention in detail:
in the embodiment of the invention, the external shape information of the aircraft includes, but is not limited to, the volume of the aircraft, the size of the aircraft and the external shape of the aircraft.
In S101, there are various ways to select a visible region from the three-dimensional point cloud constructed based on the region to be detected according to the profile information of the aircraft.
One implementation of S101 is: before the visible area is selected from the three-dimensional point cloud constructed based on the area to be detected according to the appearance information of the aircraft, the proportional relation between the visible area and the three-dimensional point cloud constructed based on the area to be detected can be set according to the volume of the aircraft and the preset flying height of the aircraft. And then, in S101, selecting a visible area from the three-dimensional point cloud constructed based on the area to be detected according to the proportional relation.
Another implementation manner of S101 is: and dynamically selecting a visual area from the three-dimensional point cloud constructed based on the area to be detected according to the volume of the aircraft and the flight height of the aircraft.
Through the above implementations, the visual area is selected from the three-dimensional point cloud constructed for the whole area to be measured, so subsequent calculation is carried out on the visual area only. This avoids the computing-resource pressure and high energy consumption of calculating over the whole area, reduces the difficulty, energy consumption and computing-resource demand of the subsequent calculation, and makes the ranging easier to realize on an aircraft, especially a small one.
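As a sketch of the selection described above — assuming the visual area is an axis-aligned window in the point cloud whose half-extents are scaled from the aircraft's envelope and preset flight height (the patent does not fix the exact proportional rule, so the function name and scaling are illustrative) — the selection might look like:

```python
import numpy as np

def select_visible_region(cloud, aircraft_size, flight_height, scale=1.5):
    """Crop an axis-aligned window of the point cloud around the flight axis.

    cloud:         (N, 3) array of 3D points in an aircraft-centred frame
    aircraft_size: (width, height) envelope of the aircraft in metres
    flight_height: preset flight height, used to widen the window (assumption)
    scale:         safety margin factor (illustrative value)
    """
    half_w = scale * aircraft_size[0] / 2.0
    half_h = scale * aircraft_size[1] / 2.0 + flight_height
    # keep only points whose lateral coordinates fall inside the window
    mask = (np.abs(cloud[:, 0]) <= half_w) & (np.abs(cloud[:, 1]) <= half_h)
    return cloud[mask]
```

Points outside the window are dropped before any distance computation, which is exactly where the calculation savings come from.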
In the embodiment of the invention, before S101, a three-dimensional point cloud may be constructed based on the region to be measured. Optionally, the step of constructing the three-dimensional point cloud as shown in fig. 2 is as follows:
step 201: and acquiring a depth image of the region to be detected.
And the coordinates of each pixel point in the depth image are used for indicating the distance between each terrain area in the area to be measured corresponding to each pixel point and the aircraft.
One method for obtaining the depth image in step 201 may be: acquire at least two frames of image information of the area to be measured through a binocular camera device, and perform parallax calculation on them to obtain the depth image of the area to be measured. Preferably, the frames are acquired at the same moment, so that the subsequent parallax calculation is accurate and the difficulty of registering the frames is reduced.
Optionally, before the binocular camera device acquires at least two frames of image information of the area to be detected, frame synchronization is performed on the binocular camera device, so that the binocular camera device can acquire the image information at the same time.
It should be noted that, in addition to the foregoing implementation, a depth image of the region to be measured may be acquired by a monocular camera device, or a depth image of the region to be measured may be acquired by another depth acquisition device, which is not limited in the embodiment of the present invention.
Step 202: and constructing a three-dimensional point cloud of the region to be detected based on the depth image of the region to be detected.
And the pixel points in the depth image correspond to the three-dimensional coordinate points in the three-dimensional point cloud one to one. Optionally, before the three-dimensional point cloud of the region to be measured is constructed based on the depth image of the region to be measured, attitude information of the aircraft is acquired, and the attitude information includes, but is not limited to, euler angles. The method for acquiring the attitude information of the aircraft is similar to the prior art, and is not described in detail here.
There are various ways to construct the three-dimensional point cloud of the region to be measured based on its depth image in step 202. Take a depth map collected by a binocular camera device as an example: the parallax is the difference between the imaging positions of the same point on the left and right imaging planes, and with the baseline length, camera focal length and parallax value known (the first two are obtained by calibrating the cameras), the triangle-similarity principle gives:
depth value = camera focal length × baseline length / parallax value
Therefore, by traversing all pixel points that can be matched one-to-one between the left and right images collected by the binocular camera device, the depth value of every pixel is obtained according to this formula and stored as an image, namely the depth map.
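The formula above can be applied per pixel to turn a disparity map into a depth map. A minimal sketch (the function name and the convention that zero disparity means "no match" are illustrative, not from the patent):

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """depth = camera focal length x baseline length / parallax value.

    disparity:  (H, W) array of disparity values in pixels; zeros mean "no match"
    focal_px:   camera focal length in pixels (from calibration)
    baseline_m: distance between the two optical centres in metres (from calibration)
    """
    depth = np.zeros_like(disparity, dtype=float)
    valid = disparity > 0                       # avoid division by zero
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth
```

Larger disparities map to nearer points, which matches the inverse relationship in the triangle-similarity formula.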
To ensure the accuracy of the subsequent ranging, the coordinates of each three-dimensional coordinate point in the point cloud must be that point's coordinates in real three-dimensional space. However, when the depth acquisition device (for example, a binocular camera device mounted on the aircraft) is at an angle to the horizontal plane, the point cloud obtained directly from the depth image is at the same angle to the horizontal plane and therefore does not meet this requirement. In that case, step 202 also needs to apply a corresponding angle adjustment to every three-dimensional coordinate point in the point cloud.
One implementation of step 202 is shown in FIG. 3, and includes the following steps:
step 301: and projecting the depth image of the area to be measured to a three-dimensional space to form an initial three-dimensional point cloud. The three-dimensional space may be a three-dimensional space based on a world coordinate system, or may be a three-dimensional space based on another coordinate system, which is not limited in the embodiment of the present invention.
Step 302: and judging whether the attitude information of the aircraft has an included angle. Attitude information of the aircraft includes, but is not limited to, euler angles.
Step 303: and if the attitude information has an included angle, reversely rotating the coordinates of each three-dimensional coordinate point in the initial three-dimensional point cloud by the angle which is the same as the included angle to obtain the three-dimensional point cloud constructed based on the area to be measured.
Taking the ranging scene shown in fig. 4 as an example, if the flight attitude of the aircraft is tilted, that is, the attitude information fed back by the flight control includes an included angle α, the coordinates of each three-dimensional coordinate point in the initial three-dimensional point cloud are rotated back by the same angle α to obtain the three-dimensional point cloud constructed based on the region to be measured.
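A sketch of the reverse rotation in step 303, under the simplifying assumption that the only included angle is a pitch angle α about the x axis (a full implementation would undo all three Euler angles fed back by flight control):

```python
import math
import numpy as np

def derotate_pitch(cloud, alpha_rad):
    """Rotate the initial point cloud back by the pitch angle alpha so that
    the coordinates are expressed relative to the horizontal plane.

    cloud:     (N, 3) points of the initial three-dimensional point cloud
    alpha_rad: included angle fed back by flight control, in radians
               (assumption: a single rotation about the x axis)
    """
    c, s = math.cos(-alpha_rad), math.sin(-alpha_rad)   # reverse rotation
    rot_x = np.array([[1.0, 0.0, 0.0],
                      [0.0,   c,  -s],
                      [0.0,   s,   c]])
    return cloud @ rot_x.T
```

When α is zero the rotation matrix is the identity and the cloud is returned unchanged, matching the "no included angle" branch of step 302.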
Another implementation of step 202 is: and traversing each pixel point in the depth image, and taking the depth value of each pixel point as the coordinate of the pixel point on the third dimension, so that the original two-dimensional coordinate and the coordinate on the third dimension jointly form a three-dimensional coordinate, and the three-dimensional point cloud of the area to be detected can be obtained.
Through the implementations above, angle correction is carried out on each three-dimensional coordinate point of the initial three-dimensional point cloud where needed, avoiding the influence of the aircraft's attitude on the points of the three-dimensional point cloud.
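The pixel-traversal variant described above can be sketched as follows (a metric point cloud would additionally apply the camera intrinsics; this sketch keeps each pixel's raw column/row position as the first two coordinates, as the text describes):

```python
import numpy as np

def depth_map_to_cloud(depth):
    """Traverse every pixel of an (H, W) depth map and use its depth value as
    the third coordinate, forming an (H*W, 3) three-dimensional point cloud."""
    h, w = depth.shape
    cols, rows = np.meshgrid(np.arange(w), np.arange(h))
    # (column, row) is the original two-dimensional coordinate; depth is the third
    return np.stack([cols.ravel(), rows.ravel(), depth.ravel()], axis=1)
```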
The specific implementation manner of determining the minimum distance between the region to be measured and the aircraft according to the visible region in S102 may include the following steps, for example:
Step one: projecting the three-dimensional point cloud in the visible area onto a two-dimensional plane to obtain a two-dimensional point cloud corresponding to the visible area.
In the embodiment of the invention, the three-dimensional point cloud in the visible area is a subset of the three-dimensional point cloud, a plurality of two-dimensional coordinate points in the two-dimensional point cloud correspond to a plurality of three-dimensional coordinate points in the three-dimensional point cloud in the visible area, and the coordinate of each two-dimensional coordinate point in the plurality of two-dimensional coordinate points is used for indicating the distance between each corresponding sub-area and the aircraft.
Taking the visible area as a window, and taking as an example that the three-dimensional coordinate points in the window have coordinates on three axes x, y and z, the method of step one may be: for all three-dimensional coordinate points in the window, delete the y-axis coordinate, i.e. form a new two-dimensional coordinate system from the x-axis and z-axis coordinates, so that all points in the window fall on the same two-dimensional plane. This realizes the projection of the window's points onto a two-dimensional plane.
Step two: determining the minimum distance between the area to be measured and the aircraft according to the two-dimensional point cloud of the visible area.
Step two can be implemented in various ways. Optionally, one of them may be: first remove the discrete points from the two-dimensional point cloud of the visible area, then select from the remaining points the two-dimensional coordinate point with the smallest indicated distance, and take that distance as the minimum distance between the area to be measured and the aircraft. The discrete-point removal itself is similar to the prior art and is not described again here.
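Steps one and two together might be sketched as follows. The radius-based neighbour count used here for removing discrete points is one simple choice (the patent leaves the de-noising method open), and the convention that the z coordinate indicates each sub-area's distance to the aircraft is an assumption:

```python
import numpy as np

def min_distance(window_cloud, neighbor_radius=0.2, min_neighbors=2):
    """Project the visible-region cloud onto the x-z plane (drop the y axis),
    remove discrete points, and return the smallest indicated distance.

    window_cloud: (N, 3) points inside the visible area; z is assumed to
    indicate the distance between each sub-area and the aircraft.
    """
    pts2d = window_cloud[:, [0, 2]]                    # step one: drop y
    kept = []
    for i in range(len(pts2d)):
        dists = np.linalg.norm(pts2d - pts2d[i], axis=1)
        # a point is "discrete" if too few other points lie nearby
        if np.count_nonzero(dists < neighbor_radius) - 1 >= min_neighbors:
            kept.append(pts2d[i])
    return float(np.array(kept)[:, 1].min())           # step two: minimum distance
```

Without the discrete-point removal, a single stray depth match could report a spuriously small minimum distance and trigger an unnecessary climb or stop.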
After the minimum distance between the area to be measured and the aircraft has been determined in S102, whether the minimum distance deviates from a preset range is judged. If it does not, the aircraft is instructed to keep its distance from the area to be measured unchanged. If it does, the aircraft is instructed to adjust that distance, for example to stop advancing or to change its flight direction. The aircraft is thereby kept at the preset flight height or heading and is prevented from crashing through contact with the ground or obstacles.
The following describes an application of the ranging method provided by the embodiment of the present invention in different scenarios through two exemplary scenarios:
as shown in fig. 5, the present invention provides a ranging method applied to terrain following, including:
S501, selecting a visible area from the three-dimensional point cloud constructed based on the terrain to be detected according to the appearance information of the aircraft. In the embodiment of the invention, the terrain to be detected is the terrain of the area the aircraft is approaching; the three-dimensional point cloud constructed based on the terrain to be detected is a set of three-dimensional coordinate points in a three-dimensional space, each point corresponding one-to-one to a sub-area of the terrain, with its coordinates indicating the distance between that sub-area and the aircraft.
And S502, determining the minimum distance between the terrain to be measured and the aircraft according to the visible area.
And S503, determining the flight track of the aircraft during terrain following according to the minimum distance.
In S503, the implementation manner of determining the flight trajectory of the aircraft during terrain following according to the minimum distance is as follows: and judging whether the minimum distance deviates from a preset range. If the minimum distance does not deviate from the preset range, indicating the aircraft to keep the flying height of the aircraft unchanged; and if the minimum distance deviates from the preset range, indicating the aircraft to adjust the flight height of the aircraft. Wherein, the flying height is the distance between the aircraft and the terrain to be measured.
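The decision in S503 can be expressed as a small function. The bounds of the preset range are example values, not fixed by the patent:

```python
def terrain_follow_command(min_dist, lower=1.2, upper=1.8):
    """Map the measured minimum distance to a flight-height command.

    lower/upper delimit the preset range (illustrative values).
    Returns 'climb', 'descend' or 'hold'.
    """
    if min_dist < lower:
        return 'climb'      # too close to the terrain: increase flight height
    if min_dist > upper:
        return 'descend'    # too far from the terrain: reduce flight height
    return 'hold'           # within the preset range: keep flight height unchanged
```

The same structure appears in S603 for obstacle avoidance, with the flight direction rather than the flight height being adjusted.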
The above-described ranging method applied to terrain following may be performed by a ranging device, particularly a binocular ranging device or a binocular ranging sensor, mounted on an aircraft, so as to protect the aircraft from a crash due to touchdown.
It should be noted that S501 to S502 are similar to S101 to S102, and refer to the description of S101 to S102, which is not repeated herein.
As shown in fig. 6, the present invention provides a distance measuring method applied to obstacle avoidance, including:
S601, selecting a visible area from the three-dimensional point cloud constructed based on the area to be detected according to the appearance information of the aircraft. The area to be detected is the area the aircraft is approaching; the three-dimensional point cloud constructed based on it is a set of three-dimensional coordinate points in a three-dimensional space, each point corresponding one-to-one to a sub-area of the area to be detected, with its coordinates indicating the distance between that sub-area and the aircraft.
S602, determining the minimum distance between the area to be measured and the aircraft according to the visual area, wherein the area to be measured includes the obstacles within it.
And S603, determining the flight track of the aircraft during obstacle avoidance according to the minimum distance.
In S603, the implementation manner of determining the flight trajectory of the aircraft when avoiding the obstacle according to the minimum distance may be: and judging whether the minimum distance is smaller than a preset range. If the minimum distance is not less than the preset range, the distance between the aircraft and the obstacle is in the safe range, so that the aircraft can be instructed to keep the flight direction of the aircraft unchanged, or the aircraft is not instructed to adjust the flight direction of the aircraft. If the minimum distance is smaller than the preset range, it means that the distance between the aircraft and the obstacle is too small, which may cause the aircraft to collide with the obstacle, and therefore the aircraft needs to be instructed to adjust the flight direction of the aircraft.
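Similarly, the decision in S603 reduces to comparing the minimum distance with a warning value (the 5.0-metre default here is an illustrative figure, not specified by the patent):

```python
def avoid_command(min_dist, warning_dist=5.0):
    """Obstacle-avoidance decision: keep the current heading while the
    minimum distance stays at or above the warning value, otherwise
    stop advancing or change the flight direction."""
    return 'keep_heading' if min_dist >= warning_dist else 'stop_or_turn'
```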
The distance measuring method applied to obstacle avoidance can be performed by a distance measuring device, particularly a binocular distance measuring device or a binocular distance measuring sensor, mounted on an aircraft, so that the aircraft is prevented from being crashed due to contact with an obstacle.
It should be noted that S601 to S602 are similar to S101 to S102, and refer to the related description of S101 to S102, which is not repeated herein.
Several specific ranging scenarios are described below:
Example one
Take a plant protection drone in a farming scene as the aircraft and a slope as the area to be measured, with a binocular ranging device mounted on the outside of the drone. Assume the height the drone must keep above the crops when spraying pesticide over the field is 1.5 meters, its flight speed is 2 meters/second, and its maximum climb or descent rate is 3 meters/second. The binocular ranging device can process 10 frames of images per second, i.e. it can output 10 distance points per second. On encountering a rising slope of 30 degrees, if the drone does not react it will hit the slope after flying on for 3 meters, that is, 1.5 seconds. To avoid the impact, the drone must react on entering the slope. Calculated from the above parameters, 0.1 second after the drone flies into the slope region the binocular ranging device gives a distance point of 1.5 meters, and at that moment the vertical distance between the drone and the slope is 1.35 meters. When the device gives the second distance point, the vertical distance has dropped to 1.2 meters; the distance value given, 1.35 meters, is smaller than 1.5 meters, so the drone must react by climbing.
With a climb rate of 3 m/s, the drone gains 0.3 m of height for every 0.1 s of flight. Since the vertical distance to the slope is 1.35 m, the drone keeps climbing at that rate: after 0.1 s the vertical distance to the slope is 1.5 m, and after another 0.1 s it is 1.65 m. It is thus clear that the above ranging process successfully prevents the plant protection drone from hitting the slope, realizing terrain following for the drone.
It should be noted that the specific data in the first example are exemplary data, and the specific data is not limited in the embodiment of the present invention.
Example two
Take an industrial UAV in a power line inspection scene as the aircraft, with the ground and ground obstacles as the area to be measured. When performing a power inspection flight task, the industrial UAV must be guaranteed not to collide with any ground obstacle; the obstacles include, but are not limited to, power pylons, chimneys, high-rise buildings and cliff walls. For a task like power line inspection, a binocular ranging device containing a binocular camera is generally mounted directly ahead. During flight, the binocular camera outputs 10 distance points per second to the flight control system, and the flight control system stops the UAV's advance, or changes its flight direction, once it finds the distance between the UAV and an obstacle is smaller than a warning value, thereby avoiding a collision.
It should be noted that the specific data in the second example are exemplary data, and the specific data is not limited in the embodiment of the present invention.
According to the technical scheme provided by the invention, a visual area can be selected from the three-dimensional point cloud constructed based on the area to be measured, and the minimum distance between the area to be measured and the aircraft can then be determined from that visual area alone. This avoids the high demand on computing resources and the high energy consumption that would result from calculating the distance between the aircraft and every point in the whole area to be measured, greatly reduces the amount of calculation in the ranging process, and lowers the difficulty and cost of implementing it.
Exemplary devices
Having described the methods of exemplary embodiments of the present invention, exemplary apparatuses of the present invention are described next. The present invention does not limit the mounting manner or mounting position of the ranging device or ranging sensor it provides: the device may be mounted below, in front of, behind, or on the left or right side of the aircraft, and the connection may be rigid or flexible.
Referring to fig. 7, the present invention provides a ranging apparatus that can implement the method of the exemplary embodiment of the present invention corresponding to fig. 1. Referring to fig. 7, the apparatus includes: a selecting unit, a determining unit and a binocular camera device, wherein,
the selecting unit is used for selecting a visible area from the three-dimensional point cloud constructed based on the area to be detected according to the appearance information of the aircraft, wherein the area to be detected is the area the aircraft is approaching, the three-dimensional point cloud constructed based on the area to be detected is a set of three-dimensional coordinate points in a three-dimensional space, each three-dimensional coordinate point corresponds one-to-one to a sub-area of the area to be detected, and the coordinates of each three-dimensional coordinate point are used for indicating the distance between the corresponding sub-area and the aircraft;
and the determining unit is used for determining the minimum distance between the area to be measured and the aircraft according to the visible area, and the minimum distance is used for determining the flight track of the aircraft.
Optionally, the determining unit is specifically configured to: projecting the three-dimensional point cloud in the visible area to a two-dimensional plane to obtain a two-dimensional point cloud corresponding to the visible area, wherein the three-dimensional point cloud in the visible area is a subset of the three-dimensional point cloud, a plurality of two-dimensional coordinate points in the two-dimensional point cloud correspond to a plurality of three-dimensional coordinate points in the three-dimensional point cloud in the visible area, and the coordinate of each two-dimensional coordinate point in the plurality of two-dimensional coordinate points is used for indicating the distance between each corresponding sub-area and the aircraft; and determining the minimum distance between the area to be measured and the aircraft according to the two-dimensional point cloud of the visible area.
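The projection step above can be sketched as follows. This is a minimal illustration under an assumption the patent does not fix: the two-dimensional plane is obtained by simply dropping one horizontal coordinate, so each 2-D point keeps the distance its 3-D point indicated.

```python
def project_to_2d(points_3d):
    """Project the visible-area 3-D point cloud onto a vertical plane
    by dropping the y coordinate (an illustrative choice of plane).
    Each resulting 2-D point corresponds one-to-one to a 3-D point
    and still indicates the sub-area's distance to the aircraft."""
    return [(x, z) for x, _, z in points_3d]
```

A different projection plane would be chosen in practice depending on how the camera frame is oriented relative to the aircraft.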
Optionally, when determining the minimum distance between the area to be measured and the aircraft from the two-dimensional point cloud of the visible area, the determining unit is specifically configured to: remove discrete (outlier) points from the two-dimensional point cloud of the visible area; and select, from the processed two-dimensional point cloud of the visible area, the two-dimensional coordinate point indicating the smallest distance, and take the distance indicated by that point as the minimum distance between the area to be measured and the aircraft.
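The discrete-point removal and minimum-distance selection can be sketched as below. The patent does not specify which outlier test is used; a mean/standard-deviation criterion with an assumed cutoff `k` is one common choice and is shown purely for illustration.

```python
import statistics

def min_distance(points_2d, k=1.5):
    """Return the minimum distance indicated by a 2-D point cloud of
    the visible area, after discarding discrete (outlier) points.

    points_2d: list of (x, d) tuples, d being the distance between
    the point's sub-area and the aircraft. A point is treated as
    discrete if its distance lies more than k standard deviations
    from the mean (k is an assumed parameter, not from the patent).
    """
    dists = [d for _, d in points_2d]
    mean = statistics.fmean(dists)
    sd = statistics.pstdev(dists)
    kept = [d for d in dists if sd == 0 or abs(d - mean) <= k * sd]
    return min(kept)
```

Removing outliers first matters precisely because a single spurious low reading would otherwise become the "minimum distance" and trigger an unnecessary altitude change.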
Optionally, the profile information of the aircraft comprises a volume of the aircraft.
Correspondingly, the selecting unit is further configured to: before the visible area is selected from the three-dimensional point cloud constructed based on the area to be measured according to the profile information of the aircraft, set the proportional relation between the visible area and the three-dimensional point cloud constructed based on the area to be measured according to the volume of the aircraft and the preset flight height of the aircraft. The selecting unit is then specifically configured to select the visible area from the three-dimensional point cloud constructed based on the area to be detected according to that proportional relation.
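One way this proportional relation might be set and applied is sketched below. The patent gives no concrete formula; `base_ratio` and `gain` are invented illustrative parameters, and the crop is done on the horizontal extent of the cloud for simplicity.

```python
def visible_area_ratio(aircraft_volume, flight_height,
                       base_ratio=0.2, gain=0.05):
    """Heuristic ratio of the visible area to the full point cloud.
    base_ratio and gain are assumed values, not from the patent:
    a bulkier aircraft, or a lower flight height, warrants a wider
    visible area. The ratio is capped at 1 (the whole cloud)."""
    ratio = base_ratio + gain * (aircraft_volume / max(flight_height, 1e-6))
    return min(ratio, 1.0)

def select_visible(points, ratio):
    """Keep the points whose horizontal offset from the cloud centre
    falls within the given fraction of the cloud's horizontal extent."""
    xs = [p[0] for p in points]
    cx = (min(xs) + max(xs)) / 2.0
    half = (max(xs) - min(xs)) * ratio / 2.0
    return [p for p in points if abs(p[0] - cx) <= half]
```

The point of the ratio is that only this fraction of the cloud ever enters the distance computation, which is where the savings in calculation and energy consumption come from.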
Alternatively, the selecting unit is specifically configured to dynamically select the visible area from the three-dimensional point cloud constructed based on the area to be detected according to the volume of the aircraft and the current flight height of the aircraft.
Optionally, the apparatus further comprises a construction unit configured to: before the selecting unit selects the visible area from the three-dimensional point cloud constructed based on the area to be detected according to the profile information of the aircraft, acquire a depth image of the area to be detected; and construct the three-dimensional point cloud of the area to be detected based on that depth image. The coordinates of each pixel in the depth image indicate the distance between the aircraft and the terrain area of the area to be detected corresponding to that pixel, and the pixels of the depth image correspond one-to-one to the three-dimensional coordinate points of the three-dimensional point cloud.
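The one-to-one mapping from depth pixels to 3-D points is the standard pinhole back-projection; a minimal sketch, assuming calibrated intrinsics `fx`, `fy`, `cx`, `cy` (parameter names are conventional, not from the patent):

```python
def depth_to_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image into a 3-D point cloud with the
    pinhole model. depth is a row-major 2-D list of metric depths;
    fx, fy, cx, cy are camera intrinsics obtained from calibration.
    Each pixel (u, v) maps to exactly one 3-D point."""
    cloud = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            cloud.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return cloud
```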
Optionally, the construction unit is further configured to acquire attitude information of the aircraft before constructing the three-dimensional point cloud of the area to be detected based on its depth image. When constructing that point cloud, the construction unit is specifically configured to: project the depth image of the area to be detected into three-dimensional space to form an initial three-dimensional point cloud; determine whether the attitude information of the aircraft contains an included angle; and, if it does, rotate the coordinates of each three-dimensional coordinate point in the initial three-dimensional point cloud in reverse by that same angle to obtain the three-dimensional point cloud of the area to be detected.
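The reverse rotation compensates for the fact that the camera tilts with the airframe. A minimal sketch for one axis, assuming the included angle is a pitch angle in radians (roll and yaw would be handled the same way with their own rotation matrices):

```python
import math

def derotate(points, pitch_rad):
    """Rotate each point of the initial point cloud back by the
    pitch angle (rotation about the y axis) so the cloud is
    expressed in a level, gravity-aligned frame."""
    c, s = math.cos(-pitch_rad), math.sin(-pitch_rad)
    out = []
    for x, y, z in points:
        # standard y-axis rotation applied with the negated angle
        out.append((c * x + s * z, y, -s * x + c * z))
    return out
```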
Optionally, when acquiring the depth image of the area to be measured, the construction unit is specifically configured to: acquire at least two frames of image information of the area to be measured through the binocular camera device; and perform disparity calculation on the at least two frames of image information to obtain the depth image of the area to be measured.
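The disparity-to-depth conversion follows the standard rectified-stereo relation depth = f · B / d; a sketch, assuming the focal length in pixels and the camera baseline come from calibration:

```python
def disparity_to_depth(disparity, focal_px, baseline_m):
    """Convert a disparity map (2-D list, in pixels) to a depth map
    (metres) with depth = focal_px * baseline_m / disparity.
    Zero or negative disparity means no valid match at that pixel,
    so the depth is reported as infinite."""
    return [[focal_px * baseline_m / d if d > 0 else float('inf')
             for d in row] for row in disparity]
```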
Optionally, the construction unit is further configured to: before at least two frames of image information of the area to be detected are acquired through the binocular camera equipment, frame synchronization is carried out on the binocular camera equipment.
Optionally, the device further comprises an adjusting unit, configured to determine whether the minimum distance deviates from a preset range after the determining unit determines the minimum distance between the area to be measured and the aircraft according to the visible area; if the minimum distance does not deviate from the preset range, instruct the aircraft to keep its flight height unchanged; or, if the minimum distance deviates from the preset range, instruct the aircraft to adjust its flight height.
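The adjusting unit's decision reduces to a range check; a sketch, where the command names and the `low`/`high` bounds are illustrative (the patent leaves the preset range and the climb/descend step to the flight controller):

```python
def height_command(min_dist, low, high):
    """Return the altitude command the adjusting unit would issue
    given the minimum distance and a preset range [low, high]."""
    if min_dist < low:
        return "climb"    # too close to the terrain or obstacle
    if min_dist > high:
        return "descend"  # farther than terrain following requires
    return "hold"         # within the preset range: keep height
```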
The invention also provides a terrain following sensor which comprises binocular camera equipment, a data interface and a processor, wherein the processor is used for executing the method, and the binocular camera equipment is used for acquiring the image information of the terrain to be detected.
The invention also provides a terrain following device which comprises binocular camera equipment, a processor, a memory and a transceiver. The binocular camera equipment is used for acquiring image information of a terrain to be measured; the memory is used for storing programs executed by the processor; the processor is used for executing the method according to the program stored in the memory and the image information of the terrain to be detected; the transceiver is used to receive or transmit data under the control of the processor.
The invention also provides an obstacle avoidance sensor, which comprises a binocular camera device, a data interface and a processor, wherein the processor is used for executing the method, the binocular camera device is used for collecting image information of the area to be detected, and the area to be detected contains obstacles.
The invention also provides an obstacle avoidance device, which comprises a binocular camera device, a processor, a memory and a transceiver. The binocular camera device is used for acquiring image information of the area to be detected, and the area to be detected contains obstacles; the memory is used for storing a program executed by the processor; the processor is used for executing the method according to the program stored in the memory and the image information of the area to be detected; and the transceiver is used for receiving or transmitting data under the control of the processor.
The distance measuring device in the embodiment of the invention may be a distance measuring device based on a binocular camera device. As shown in fig. 8, one possible structure of such a device includes a left-eye camera, a right-eye camera, a main control chip (i.e., a processor), and a data interface.
It should be noted that any binocular imaging device mentioned in the embodiments of the present invention may be replaced by another imaging device or depth acquisition device, for example by a monocular camera; the embodiments of the present invention are not limited in this respect.
Exemplary Medium
Having described the method and apparatus of the exemplary embodiments of this invention, and referring next to FIG. 9, the present invention provides an exemplary medium having stored thereon computer-executable instructions which, when executed, cause a computer to perform the method of any of the exemplary embodiments of the invention corresponding to FIG. 1.
Exemplary computing device
Having described the method, medium, and apparatus of the exemplary embodiments of this invention, next, with reference to fig. 10, an exemplary computing device provided by the present invention is described, the device comprising a processor, a memory, and a transceiver, wherein the memory stores a program for execution by the processor; the processor is used for executing the method of any one of the corresponding exemplary embodiments of the invention in FIG. 1 according to the program stored in the memory; the transceiver is used for receiving or transmitting data under the control of the processor.
It should be noted that although several units/modules or sub-units/modules of the ranging apparatus are mentioned in the above detailed description, this division is merely exemplary and not mandatory. Indeed, according to embodiments of the invention, the features and functions of two or more of the units/modules described above may be embodied in one unit/module; conversely, the features and functions of one unit/module described above may be further divided among, and embodied by, a plurality of units/modules.
Moreover, while the operations of the method of the invention are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
While the spirit and principles of the invention have been described with reference to several particular embodiments, it is to be understood that the invention is not limited to the disclosed embodiments; the division into aspects is for convenience of description only and does not mean that features of those aspects cannot be combined to advantage. The invention is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
Claims (10)
1. A method of ranging, comprising:
selecting a visual area according to appearance information of an aircraft in three-dimensional point cloud constructed based on an area to be detected, wherein the area to be detected is an aircraft route area, the three-dimensional point cloud constructed based on the area to be detected is a set of a plurality of three-dimensional coordinate points in a three-dimensional space constructed based on the area to be detected, each three-dimensional coordinate point in the plurality of three-dimensional coordinate points corresponds to each sub-area in the area to be detected in a one-to-one mode, and the coordinate of each three-dimensional coordinate point is used for indicating the distance between each corresponding sub-area and the aircraft;
and determining the minimum distance between the area to be measured and the aircraft according to the visual area, wherein the minimum distance is used for determining the flight track of the aircraft.
2. The method of claim 1, wherein said determining a minimum distance between the area under test and the aircraft from the viewable area comprises:
projecting the three-dimensional point cloud in the visible area to a two-dimensional plane to obtain a two-dimensional point cloud corresponding to the visible area, wherein the three-dimensional point cloud in the visible area is a subset of the three-dimensional point cloud, a plurality of two-dimensional coordinate points in the two-dimensional point cloud correspond to a plurality of three-dimensional coordinate points in the three-dimensional point cloud in the visible area, and a coordinate of each two-dimensional coordinate point in the plurality of two-dimensional coordinate points is used for indicating a distance between each corresponding sub-area and the aircraft;
and determining the minimum distance between the area to be measured and the aircraft according to the two-dimensional point cloud of the visible area.
3. The method of claim 2, wherein determining the minimum distance between the area under test and the aircraft from the two-dimensional point cloud of the viewable area comprises:
performing discrete point removing processing on the two-dimensional point cloud of the visible area;
and selecting a two-dimensional coordinate point with the minimum indicated distance from the processed two-dimensional point cloud of the visual area, and taking the distance indicated by the two-dimensional coordinate point as the minimum distance between the area to be measured and the aircraft.
4. The method of claim 1, wherein the profile information for the aircraft comprises a volume of the aircraft.
5. The method of claim 1, wherein before selecting the visible area according to the profile information of the aircraft in the three-dimensional point cloud constructed based on the area to be measured, the method further comprises:
acquiring a depth image of the region to be detected;
constructing a three-dimensional point cloud of the region to be detected based on the depth image of the region to be detected;
the coordinates of each pixel point in the depth image are used for indicating the distance between each terrain area in the area to be measured and the aircraft, which corresponds to each pixel point, and the pixel points in the depth image correspond to the three-dimensional coordinate points in the three-dimensional point cloud in a one-to-one mode.
6. The method of claim 5, wherein prior to constructing the three-dimensional point cloud for the region of interest based on the depth image of the region of interest, further comprising:
acquiring attitude information of the aircraft;
the method for constructing the three-dimensional point cloud of the region to be detected based on the depth image of the region to be detected comprises the following steps:
projecting the depth image of the region to be measured to a three-dimensional space to form an initial three-dimensional point cloud;
judging whether the attitude information of the aircraft has an included angle or not;
and if the included angle exists in the attitude information, reversely rotating the coordinate of each three-dimensional coordinate point in the initial three-dimensional point cloud by the angle same as the included angle to obtain the three-dimensional point cloud of the area to be detected.
7. The method of any of claims 1 to 6, wherein after determining the minimum distance between the area under test and the aircraft based on the viewing area, further comprising:
judging whether the minimum distance deviates from a preset range;
if the minimum distance does not deviate from the preset range, indicating the aircraft to keep the distance between the aircraft and the area to be measured unchanged; or
And if the minimum distance deviates from the preset range, indicating the aircraft to adjust the distance between the aircraft and the area to be measured.
8. A ranging method for terrain following, comprising:
selecting a visible area from a three-dimensional point cloud constructed based on a terrain to be detected according to appearance information of an aircraft, wherein the terrain to be detected is a terrain of an aircraft approach area, the three-dimensional point cloud constructed based on the terrain to be detected is a set of a plurality of three-dimensional coordinate points in a three-dimensional space constructed based on the terrain to be detected, each three-dimensional coordinate point in the plurality of three-dimensional coordinate points corresponds to each sub-area in the terrain to be detected in a one-to-one mode, and a coordinate of each three-dimensional coordinate point is used for indicating a distance between each corresponding sub-area and the aircraft;
determining the minimum distance between the terrain to be measured and the aircraft according to the visual area;
and determining the flight track of the aircraft for terrain following according to the minimum distance.
9. A distance measurement method applied to obstacle avoidance is characterized by comprising the following steps:
selecting a visual area according to appearance information of an aircraft in three-dimensional point cloud constructed based on an area to be detected, wherein the area to be detected is an aircraft route area, the three-dimensional point cloud constructed based on the area to be detected is a set of a plurality of three-dimensional coordinate points in a three-dimensional space constructed based on the area to be detected, each three-dimensional coordinate point in the plurality of three-dimensional coordinate points corresponds to each sub-area in the area to be detected in a one-to-one mode, and the coordinate of each three-dimensional coordinate point is used for indicating the distance between each corresponding sub-area and the aircraft;
determining the minimum distance between the area to be detected and the aircraft according to the visual area, wherein the area to be detected comprises obstacles in the area to be detected;
and determining the flight track of the aircraft during obstacle avoidance according to the minimum distance.
10. A distance measuring device is characterized by comprising binocular camera equipment and
a selecting unit, configured to select a visible area according to profile information of an aircraft from a three-dimensional point cloud constructed based on an area to be detected, wherein the area to be detected is the aircraft route area, the three-dimensional point cloud constructed based on the area to be detected is a set of a plurality of three-dimensional coordinate points in a three-dimensional space constructed based on the area to be detected, each of the plurality of three-dimensional coordinate points corresponds one-to-one to a sub-area of the area to be detected, and the coordinates of each three-dimensional coordinate point are used for indicating the distance between the corresponding sub-area and the aircraft;
and the determining unit is used for determining the minimum distance between the area to be measured and the aircraft according to the visual area, and the minimum distance is used for determining the flight track of the aircraft.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810665952.3A CN110645960A (en) | 2018-06-26 | 2018-06-26 | Distance measurement method, terrain following distance measurement method, obstacle avoidance distance measurement method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110645960A true CN110645960A (en) | 2020-01-03 |
Family
ID=69008666
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810665952.3A Pending CN110645960A (en) | 2018-06-26 | 2018-06-26 | Distance measurement method, terrain following distance measurement method, obstacle avoidance distance measurement method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110645960A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101627280A (en) * | 2006-11-21 | 2010-01-13 | 曼蒂斯影像有限公司 | 3d geometric modeling and 3d video content creation |
CN101464148A (en) * | 2007-12-21 | 2009-06-24 | 财团法人工业技术研究院 | Three-dimensional image detecting, compiling and reconstructing system |
US20170140539A1 (en) * | 2015-11-16 | 2017-05-18 | Abb Technology Ag | Three-dimensional visual servoing for robot positioning |
CN106774410A (en) * | 2016-12-30 | 2017-05-31 | 易瓦特科技股份公司 | Unmanned plane automatic detecting method and apparatus |
Non-Patent Citations (1)
Title |
---|
Wang Lin et al.: "Cooperative ground moving target tracking by multiple UAVs in complex environments", Control Theory & Applications *
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112362026A (en) * | 2020-11-11 | 2021-02-12 | 深圳星迹国际餐饮管理有限公司 | Method and system for measuring flying height of unmanned aerial vehicle and flying dish transmission method and system |
CN112362026B (en) * | 2020-11-11 | 2023-10-27 | 深圳星迹国际餐饮管理有限公司 | Method and system for measuring flying height of unmanned aerial vehicle, and flying dish conveying method and system |
CN112816967A (en) * | 2021-02-03 | 2021-05-18 | 成都康烨科技有限公司 | Image distance measuring method, device, distance measuring equipment and readable storage medium |
WO2022252036A1 (en) * | 2021-05-31 | 2022-12-08 | 深圳市大疆创新科技有限公司 | Method and apparatus for acquiring obstacle information, movable platform and storage medium |
CN114742885A (en) * | 2022-06-13 | 2022-07-12 | 山东省科学院海洋仪器仪表研究所 | Target consistency judgment method in binocular vision system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220124303A1 (en) | Methods and systems for selective sensor fusion | |
EP3903164B1 (en) | Collision avoidance system, depth imaging system, vehicle, map generator, amd methods thereof | |
US11032527B2 (en) | Unmanned aerial vehicle surface projection | |
US9898821B2 (en) | Determination of object data by template-based UAV control | |
EP3264364B1 (en) | Method and apparatus for obtaining range image with uav, and uav | |
US9481982B1 (en) | Method and control system for surveying and mapping a terrain while operating a bulldozer | |
CN106774431B (en) | Method and device for planning air route of surveying and mapping unmanned aerial vehicle | |
AU2015234395B2 (en) | Real-time range map generation | |
EP3081902B1 (en) | Method and apparatus for correcting aircraft state in real time | |
CN107656545A (en) | A kind of automatic obstacle avoiding searched and rescued towards unmanned plane field and air navigation aid | |
CN110645960A (en) | Distance measurement method, terrain following distance measurement method, obstacle avoidance distance measurement method and device | |
US20200191556A1 (en) | Distance mesurement method by an unmanned aerial vehicle (uav) and uav | |
AU2012202966B2 (en) | Method for pilot assistance for the landing of and aircraft in restricted visibility | |
CN105424006A (en) | Unmanned aerial vehicle hovering precision measurement method based on binocular vision | |
KR20200031165A (en) | Navigation chart configuration method, obstacle avoidance method and device, terminal, drone | |
JP7011908B2 (en) | Optical information processing equipment, optical information processing method and optical information processing program | |
CN105045276A (en) | Method and apparatus for controlling flight of unmanned plane | |
US20210263533A1 (en) | Mobile object and method for controlling mobile object | |
CN112612291A (en) | Air route planning method and device for unmanned aerial vehicle for oil field surveying and mapping | |
Moore et al. | UAV altitude and attitude stabilisation using a coaxial stereo vision system | |
KR101764222B1 (en) | System and method for high precise positioning | |
CN107323677B (en) | Unmanned aerial vehicle auxiliary landing method, device, equipment and storage medium | |
CN102706360B (en) | Method utilizing optical flow sensors and rate gyroscope to estimate state of air vehicle | |
US10424105B2 (en) | Efficient airborne oblique image collection | |
Hintze | Autonomous landing of a rotary unmanned aerial vehicle in a non-cooperative environment using machine vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20200103 |