WO2022174712A1 - Quadrotor unmanned aerial vehicle - Google Patents

Quadrotor unmanned aerial vehicle

Info

Publication number
WO2022174712A1
WO2022174712A1 (PCT/CN2022/072784, CN2022072784W)
Authority
WO
WIPO (PCT)
Prior art keywords
binocular
sets
sensors
fisheye
drone
Prior art date
Application number
PCT/CN2022/072784
Other languages
English (en)
French (fr)
Inventor
郑欣
Original Assignee
深圳市道通智能航空技术股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术股份有限公司 filed Critical 深圳市道通智能航空技术股份有限公司
Publication of WO2022174712A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 10/00: Type of UAV
    • B64U 10/10: Rotorcrafts
    • B64U 10/13: Flying platforms
    • B64U 10/14: Flying platforms with four distinct rotor axes, e.g. quadcopters

Definitions

  • the present application relates to the field of unmanned aerial vehicles, and in particular, to a quadrotor unmanned aerial vehicle.
  • UAVs are usually equipped with sensors for environmental perception to perform obstacle avoidance or braking.
  • However, the existing UAV perception system has the following problems. First, limited by size or cost, only some directions are equipped with binocular sensors, and such a design leaves perception blind spots.
  • Second, omnidirectional-perception UAVs equipped with six sets of binocular sensors require long binocular calibration time and high production cost, and are often bulky and cannot be folded. Foldable UAVs, limited by their structure, have short baselines for some binocular sensors and complex structures.
  • Third, the binoculars are easily occluded by the arms, the viewing angle is narrow, and perception blind spots remain; moreover, occlusion of the binocular field of view by the arm or fuselage structure affects the stability of the binocular matching algorithm.
  • Fourth, existing methods that achieve all-round environmental perception and eliminate blind spots by moving or rotating the sensor do not provide omnidirectional, all-time perception, and are inadequate in complex environments or during high-speed motion.
  • Fifth, when existing trinocular fisheye omnidirectional schemes achieve all-time blind-spot-free coverage, the trinocular mounting prevents the use of a folding structure, or, if a folding structure is used, very strict requirements are imposed on structural stability and the calibration algorithm.
  • Sixth, fisheye images corrected by traditional planar perspective rectification have large edge errors, leading to inaccurate stereo matching.
  • Seventh, fisheye images are heavily distorted, and existing depth computation based on epipolar curves, polar coordinates, or latitude-longitude coordinates is computationally heavy, with poor real-time performance.
  • In order to solve at least one of the above technical problems, the present application provides the following solutions.
  • In a first aspect, an embodiment of the present application provides a quadrotor UAV, the UAV includes: two sets of binocular sensors and two sets of fisheye binocular sensors;
  • Two sets of binocular sensors are respectively arranged on the fuselage of the UAV in a diagonal manner;
  • Two sets of fisheye binocular sensors are respectively arranged on the fuselage of the drone in a diagonal manner;
  • the sum of the field angles of the two sets of binocular sensors and the two sets of fisheye binocular sensors in any direction is greater than or equal to 360 degrees.
  • the field of view of the fisheye binocular sensor is greater than 180 degrees.
  • two sets of binocular sensors and two sets of fisheye binocular sensors are used to collect images of the environment around the UAV.
  • the above-mentioned UAV also includes a processor
  • the processor is used for processing the environmental images collected by the two sets of binocular sensors and the two sets of fisheye binocular sensors to construct a three-dimensional map.
  • the processor is configured to process the environmental image collected by the fisheye binocular sensor by using a block correction method to obtain a block correction image of each fisheye binocular sensor.
  • the processor is further configured to calculate the corresponding block correction images in the two groups of fisheye binocular sensors through a binocular matching method, and generate a disparity map.
  • the processor is further configured to remove the occluder in the parallax map by masking.
  • the processor is further configured to generate a point cloud according to the disparity map from which the occluders are removed, and construct a three-dimensional map according to the point cloud.
  • An embodiment of the present application provides a quadrotor drone, which includes two sets of binocular sensors and two sets of fisheye binocular sensors, wherein the two sets of binocular sensors are respectively disposed on the drone's body in a diagonal manner.
  • the two sets of fisheye binocular sensors are respectively arranged on the fuselage of the drone in a diagonal manner, and the sum of the field angles of the two sets of binocular sensors and the two sets of fisheye binocular sensors in any direction is greater than or equal to 360 degrees.
  • Such an arrangement can be flexibly applied to the UAV structure with folding arms. Compared with the non-folding omnidirectional sensing quadrotor system, the volume of the UAV can be greatly reduced.
  • only four sets of binocular sensors are used to ensure full coverage of the viewing angle and realize omnidirectional perception. Compared with the implementation in the prior art, the number of lenses used is less, which can save costs.
  • FIG. 1 is a front view of a sensor provided on an unmanned aerial vehicle according to an embodiment of the present application
  • FIG. 2 is a left view of a sensor provided on an unmanned aerial vehicle provided by an embodiment of the present application
  • FIG. 3 is a top view of a sensor provided on an unmanned aerial vehicle according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a blind spot in a left view of a drone provided with a fisheye binocular sensor provided in an embodiment of the present application;
  • FIG. 5 is a schematic diagram of a blind area in a top view of a drone provided with a fisheye binocular sensor according to an embodiment of the present application;
  • FIG. 6 is a schematic diagram of a block correction provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a binocular structure formed by a block rectified image provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a dual-purpose pinhole camera provided by an embodiment of the present application.
  • words such as “optionally” or “exemplarily” are used to represent an example, instance, or illustration. Any embodiment or design described in the embodiments of the present application as “optionally” or “exemplarily” should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as “optionally” or “exemplarily” is intended to present the related concepts in a concrete manner.
  • the embodiment of the present application provides a quadrotor unmanned aerial vehicle
  • the unmanned aerial vehicle may include two sets of binocular sensors and two sets of fish-eye binocular sensors.
  • the two sets of binocular sensors are respectively arranged on the fuselage of the drone in a diagonal manner.
  • the two sets of fisheye binocular sensors are also arranged on the fuselage of the drone in a diagonal manner.
  • FIG. 1, FIG. 2, and FIG. 3 are, respectively, a front view, a left view, and a top view of the arrangement of the two sets of binocular sensors and the two sets of fisheye binocular sensors.
  • Figures 1 to 3 show the following arrangement: the two sets of fisheye binocular sensors are arranged above and below the fuselage of the drone, and the two sets of wide-angle binocular sensors are arranged at the front and the rear of the fuselage, respectively.
  • the binocular sensors here can include any type of lens, including of course fisheye lenses; that is, four sets of fisheye binocular sensors can be arranged on the drone body.
  • the two sets of binocular sensors and the two sets of fish-eye binocular sensors are required to ensure that the sum of their field of view angles is greater than or equal to 360 degrees in any direction.
  • the two groups of fisheye binocular sensors are required to have a field of view angle greater than 180 degrees, as shown in FIG. 1 , that is, it needs to ensure that a+b>360°.
  • Because the binocular disparity of the fisheye binocular sensor is too small in a certain region along the baseline direction, a certain blind area exists.
  • the size of the blind area depends on the imaging quality of the lens and the resolution of the sensor, generally not exceeding 90 degrees.
  • This blind area appears as the dotted shaded portion in the left view in FIG. 4, i.e., m+n<90°, and as the dotted shaded portion in the top view in FIG. 5, i.e., k<90°. Therefore, two sets of binocular lenses are added to the drone body to compensate for this blind area, ensuring that the sum of the field angles of the four sets of sensors in any direction is greater than or equal to 360 degrees.
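The 360-degree coverage condition above (the sum of the four sensors' fields of view in any direction must reach 360 degrees) can be sketched as a simple angular-coverage check. The sensor headings and field-of-view values below are illustrative assumptions, not figures taken from the patent:

```python
# Sketch: verify that several sensor cones jointly cover 360 degrees in a plane.

def intervals_cover_circle(intervals):
    """intervals: list of (center_deg, fov_deg) cones; return True if their
    union covers the full 0-360 degree circle."""
    # Convert each cone to [start, end) arcs, splitting arcs that wrap past 360.
    arcs = []
    for center, fov in intervals:
        start = (center - fov / 2) % 360
        end = start + fov
        if end <= 360:
            arcs.append((start, end))
        else:
            arcs.append((start, 360.0))
            arcs.append((0.0, end - 360))
    arcs.sort()
    covered = 0.0  # furthest angle covered so far, scanning up from 0
    for start, end in arcs:
        if start > covered:      # a gap opens before this arc begins
            return False
        covered = max(covered, end)
    return covered >= 360.0

# Two fisheye pairs (>180 degrees each) plus two ordinary pairs filling the
# up-to-90-degree baseline-direction blind zones described above.
sensors = [(0, 185), (180, 185), (90, 100), (270, 100)]
print(intervals_cover_circle(sensors))  # True: the union spans 360 degrees
```

Dropping the two compensating pairs and narrowing one fisheye below 180 degrees makes the check fail, mirroring the blind-spot argument in the text.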
  • the quadrotor UAV provided by the embodiment of the present application includes two sets of binocular sensors and two sets of fisheye binocular sensors, wherein the two sets of binocular sensors are respectively arranged on the fuselage of the drone in a diagonal manner, and the two sets of fisheye binocular sensors are respectively arranged on the fuselage of the UAV in a diagonal manner.
  • the sum of the field angles of the two sets of binocular sensors and the two sets of fisheye binocular sensors in any direction is greater than or equal to 360 degrees.
  • Such an arrangement can be flexibly applied to the UAV structure with folding arms. Compared with the non-folding omnidirectional sensing quadrotor system, the volume of the UAV can be greatly reduced.
  • only four sets of binocular sensors are used to ensure full coverage of the viewing angle and realize omnidirectional perception. Compared with the implementation in the prior art, the number of lenses used is less, which can save costs.
  • the above-mentioned UAV may further include a processor
  • the processor is used to process the environmental images collected by the above two sets of binocular sensors and two sets of fisheye binocular sensors to construct a three-dimensional map.
  • the processor may use a block correction method to process the environment image collected by the fisheye binocular sensor to obtain a block correction image of each fisheye binocular sensor.
  • Block correction replaces the single imaging plane with multiple virtual imaging planes inside the sphere, where each virtual imaging plane constitutes a pinhole camera model; the size, number, and orientation of the imaging planes can be adjusted according to the actual situation. For example, 2, 3, or 5 virtual imaging planes can be used. FIG. 6 shows an example with three mutually perpendicular virtual planes.
  • u_v is the image coordinate in the block-corrected image;
  • K_v is the intrinsic matrix of the pinhole camera model corresponding to the virtual plane, which can be determined by the size and position of the virtual imaging plane;
  • P_v^c is the transformation matrix from the virtual imaging plane coordinate system to the physical imaging plane coordinate system;
  • K_c is the projection matrix from the three-dimensional fisheye space to the imaging plane, i.e., the fisheye projection model. This parameter can be determined by a fisheye calibration model, for example UCM, EUCM, or KB. Based on the above formula (1), the original fisheye image can be converted into block-corrected images.
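Formula (1) itself is embedded as an image, but the mapping it describes (a virtual pinhole plane, a rotation into the physical camera frame, then the fisheye projection) can be sketched as below. The rotation standing in for P_v^c, the intrinsic values, and the use of a simple equidistant fisheye model in place of the UCM/EUCM/KB models named above are all illustrative assumptions:

```python
import math

# Sketch: map a pixel u_v on one virtual (pinhole) imaging plane to a pixel on
# the original fisheye image, so the fisheye image can be resampled blockwise.

def virtual_to_fisheye(u, v, K_v, R, fish_cx, fish_cy, fish_f):
    # Back-project the virtual-plane pixel into a 3-D ray (pinhole model).
    fx, fy, cx, cy = K_v
    ray = [(u - cx) / fx, (v - cy) / fy, 1.0]
    # Rotate the ray from the virtual-plane frame to the physical camera frame
    # (this rotation stands in for the P_v^c transform of formula (1)).
    X = [sum(R[i][j] * ray[j] for j in range(3)) for i in range(3)]
    # Equidistant fisheye projection, r = f * theta (a simple stand-in for K_c).
    theta = math.atan2(math.hypot(X[0], X[1]), X[2])  # angle off the optical axis
    phi = math.atan2(X[1], X[0])                      # azimuth in the image plane
    r = fish_f * theta
    return fish_cx + r * math.cos(phi), fish_cy + r * math.sin(phi)

# The centre pixel of a forward-facing virtual plane lands at the fisheye centre.
K_v = (300.0, 300.0, 320.0, 240.0)
R_id = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(virtual_to_fisheye(320.0, 240.0, K_v, R_id, 640.0, 640.0, 400.0))
# (640.0, 640.0)
```

Running this mapping for every pixel of each virtual plane yields the block-corrected images described above.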
  • the above-mentioned processor is further configured to compute, by a binocular matching method, the corresponding block-corrected images of the two sets of fisheye binocular sensors and generate a disparity map.
  • After each fisheye image is divided into three regions and block-corrected, each block of the left camera can form a binocular structure with the corresponding block of the right camera; that is, for each group of fisheye binoculars, a_l and a_r, b_l and b_r, and c_l and c_r in the figure form binocular pairs, respectively.
  • Since the images have been rectified and undistorted through the pinhole model, the disparity map can be generated by performing binocular matching on the three groups of binocular fisheye pairs using methods such as BM or SGBM. Moreover, the computation is compatible with most existing binocular computing modules.
  • the two sets of fisheye binoculars can be equivalent to six sets of conventional pinhole camera binoculars, as shown in FIG. 8 .
  • This processing method can ensure the edge quality of the fisheye image to the greatest extent, which not only ensures the field of view of the fisheye binocular captured image, but also can use conventional binocular matching to generate a disparity map to reduce the amount of calculation.
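The matching step can be illustrated with a toy sum-of-absolute-differences (SAD) matcher on one rectified scanline. A real system would use the BM or SGBM modules mentioned above; the tiny window, search range, and synthetic scanlines here are assumptions for illustration only:

```python
# Minimal sketch of binocular matching on rectified images: for each pixel of
# the left scanline, find the horizontal shift (disparity) minimising the sum
# of absolute differences over a small window.

def sad_disparity(left, right, window=1, max_disp=4):
    """left, right: equal-length lists of intensities from one rectified row.
    Returns a disparity value per pixel (0 where no better match is found)."""
    n = len(left)
    disp = [0] * n
    for x in range(n):
        best_cost, best_d = float("inf"), 0
        for d in range(min(max_disp, x) + 1):  # candidate shift to the left
            cost = 0
            for w in range(-window, window + 1):
                xl, xr = x + w, x - d + w
                if 0 <= xl < n and 0 <= xr < n:
                    cost += abs(left[xl] - right[xr])
                else:
                    cost += 255  # penalise windows leaving the image
            if cost < best_cost:
                best_cost, best_d = cost, d
        disp[x] = best_d
    return disp

# A bright feature at x=5 in the left row appears at x=3 in the right row,
# so the matcher should report disparity 2 at that feature.
left_row  = [10, 10, 10, 10, 10, 200, 10, 10, 10, 10]
right_row = [10, 10, 10, 200, 10, 10, 10, 10, 10, 10]
print(sad_disparity(left_row, right_row)[5])  # 2
```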
  • the above-mentioned processor is further configured to remove occluders from the disparity map by mask covering.
  • The occluder may be, for example, the aircraft's landing gear.
  • As shown in FIG. 1, because the field of view of the fisheye binocular sensors is greater than 180 degrees, the images they collect may contain occluders such as the landing gear, which can be removed from the image by mask covering.
  • the mask here is a black-and-white binary image: black areas are masked out and white areas are kept.
  • The masking procedure follows existing practice and is not repeated here.
  • Because the binocular algorithm matches neighboring pixels and the fisheye binocular sensors have installation tolerances, the black mask area should be slightly larger than the actual occluded area. The union of the left and right masks of the fisheye binocular sensor can therefore be used as the final mask, stored as a fixed configuration, loaded when the UAV system starts, and applied to the disparity map output by the binocular matching.
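The mask-union step can be sketched as follows. The nested lists standing in for binary mask images, and the tiny example masks, are illustrative assumptions; real code would operate on image buffers:

```python
# Sketch of the occluder-mask step: the black (masked) areas of the left and
# right masks of one fisheye pair are combined by union, and the final mask
# invalidates the corresponding disparity values.

def union_mask(mask_left, mask_right):
    """Masks are 2-D lists with 0 = black (masked) and 255 = white (kept).
    A pixel is kept only if both masks keep it, i.e. the black areas union."""
    return [[255 if l == 255 and r == 255 else 0
             for l, r in zip(row_l, row_r)]
            for row_l, row_r in zip(mask_left, mask_right)]

def apply_mask(disparity, mask, invalid=0):
    """Replace disparity values under black mask pixels with an invalid marker."""
    return [[d if m == 255 else invalid
             for d, m in zip(row_d, row_m)]
            for row_d, row_m in zip(disparity, mask)]

left_mask  = [[255, 255, 0], [255, 255, 255]]   # landing gear seen bottom-left
right_mask = [[255, 255, 255], [0, 255, 255]]   # slightly shifted in right view
final_mask = union_mask(left_mask, right_mask)
print(final_mask)                     # [[255, 255, 0], [0, 255, 255]]
disp = [[7, 8, 9], [6, 5, 4]]
print(apply_mask(disp, final_mask))   # [[7, 8, 0], [0, 5, 4]]
```

As the text notes, the final mask would be computed once, stored as configuration, and applied to every disparity map at runtime.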
  • the above-mentioned processor may also be configured to generate a point cloud according to the disparity map from which the occluders are removed, and construct a three-dimensional map according to the point cloud.
  • a point cloud can be generated as follows:
  • baseline is the length of the binocular baseline
  • disparity is the disparity value
  • cx, cy, fx, and fy are the binocular calibration parameters
  • px and py are the coordinates in the disparity map.
  • By generating an omnidirectional point cloud in this way, a 3D map can be constructed.
  • UAVs can perform tasks such as obstacle avoidance braking and path planning based on the constructed 3D map.
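The point-cloud formulas themselves are embedded as images, so the sketch below uses the conventional pinhole stereo relations (depth z = fx * baseline / disparity, then back-projection through the intrinsics). This standard form is an assumption consistent with the symbols listed above (baseline, disparity, cx, cy, fx, fy, px, py), not the patent's exact equations:

```python
# Sketch of point-cloud generation from a disparity map using the conventional
# pinhole stereo relations.

def disparity_to_points(disp, baseline, fx, fy, cx, cy):
    """disp: 2-D list of disparity values; returns a list of (x, y, z) points,
    skipping invalid (zero) disparities such as masked-out pixels."""
    points = []
    for py, row in enumerate(disp):
        for px, d in enumerate(row):
            if d <= 0:          # masked or unmatched pixel: no depth
                continue
            z = fx * baseline / d
            x = (px - cx) * z / fx
            y = (py - cy) * z / fy
            points.append((x, y, z))
    return points

# One valid pixel with disparity 10, baseline 0.1 m, fx = 400 -> depth 4 m.
disp = [[0, 10]]
pts = disparity_to_points(disp, baseline=0.1, fx=400.0, fy=400.0, cx=1.0, cy=0.0)
print(pts)  # [(0.0, 0.0, 4.0)]
```

Merging the point clouds of all sensor pairs into a common frame yields the omnidirectional point cloud from which the 3D map is built.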
  • From the above description of the embodiments, those skilled in the art can clearly understand that the present application can be implemented by means of software and the necessary general-purpose hardware, and of course also by hardware, although in many cases the former is the better implementation.
  • Based on this understanding, the technical solutions of the present application, in essence or in the parts that contribute to the prior art, can be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as a computer floppy disk, read-only memory (ROM), random access memory (RAM), flash memory (FLASH), hard disk, or optical disc, and includes several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to implement the methods or functions described in the various embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Image Processing (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

A quadrotor unmanned aerial vehicle includes two sets of binocular sensors and two sets of fisheye binocular sensors. The two sets of binocular sensors are respectively arranged diagonally on the fuselage of the UAV, and the two sets of fisheye binocular sensors are respectively arranged diagonally on the fuselage. The sum of the fields of view of the two sets of binocular sensors and the two sets of fisheye binocular sensors in any direction is greater than or equal to 360 degrees. Such an arrangement can be flexibly applied to UAV structures with folding arms; compared with a non-folding omnidirectional-perception quadrotor system, the volume of the UAV can be greatly reduced. Only four sets of binocular sensors are needed to guarantee full coverage of the viewing angle and achieve omnidirectional perception; compared with prior-art implementations, fewer lenses are used, which saves cost.

Description

Quadrotor unmanned aerial vehicle
This application claims priority to the Chinese patent application filed with the China Patent Office on February 22, 2021, with application No. 202120393695X and entitled "一种四旋翼无人机" (Quadrotor unmanned aerial vehicle), the entire contents of which are incorporated herein by reference.
Technical field
The present application relates to the field of unmanned aerial vehicles (UAVs), and in particular to a quadrotor UAV.
Background
In the existing UAV field, UAVs are usually equipped with sensors for environmental perception in order to perform obstacle avoidance or braking. However, existing UAV perception systems have the following problems. First, limited by size or cost, binocular sensors are fitted in only some directions, and such a design leaves perception blind spots. Second, omnidirectional-perception UAVs equipped with six sets of binocular sensors require long binocular calibration time and high production cost, and are often bulky and cannot be folded; foldable UAVs, constrained by their structure, have short baselines for some binocular sensors and complex structures. Third, the binoculars are easily occluded by the arms, their viewing angle is narrow, perception blind spots exist, and occlusion of the binocular field of view by the arm or fuselage structure affects the stability of the binocular matching algorithm. Fourth, existing methods that achieve all-round environmental perception and eliminate blind spots by moving or rotating the sensor are not omnidirectional, all-time perception, and are inadequate in complex environments or during high-speed motion. Fifth, when existing trinocular fisheye omnidirectional perception schemes achieve all-time blind-spot-free coverage, the trinocular mounting prevents the use of a folding structure, or, if a folding structure is used, very strict requirements are imposed on structural stability and the calibration algorithm. Sixth, fisheye images corrected by traditional planar perspective rectification have large edge errors, leading to inaccurate stereo matching. Seventh, fisheye images are heavily distorted, and existing depth computation based on epipolar curves, polar coordinates, or latitude-longitude coordinates is computationally heavy, with poor real-time performance.
Content of the utility model
In order to solve at least one of the above technical problems, the present application provides the following solutions.
In a first aspect, an embodiment of the present application provides a quadrotor UAV, which includes: two sets of binocular sensors and two sets of fisheye binocular sensors;
the two sets of binocular sensors are respectively arranged diagonally on the fuselage of the UAV;
the two sets of fisheye binocular sensors are respectively arranged diagonally on the fuselage of the UAV;
the sum of the fields of view of the two sets of binocular sensors and the two sets of fisheye binocular sensors in any direction is greater than or equal to 360 degrees.
Optionally, the field of view of the fisheye binocular sensors is greater than 180 degrees.
Optionally, the two sets of binocular sensors and the two sets of fisheye binocular sensors are used to collect images of the environment around the UAV.
Optionally, the UAV further includes a processor;
the processor is configured to process the environment images collected by the two sets of binocular sensors and the two sets of fisheye binocular sensors to construct a three-dimensional map.
Optionally, the processor is configured to process the environment images collected by the fisheye binocular sensors using a block correction method to obtain block-corrected images for each fisheye binocular sensor.
Optionally, the processor is further configured to compute, by a binocular matching method, the corresponding block-corrected images of the two sets of fisheye binocular sensors to generate a disparity map.
Optionally, the processor is further configured to remove occluders from the disparity map by mask covering.
Optionally, the processor is further configured to generate a point cloud from the disparity map with occluders removed, and to construct a three-dimensional map from the point cloud.
An embodiment of the present application provides a quadrotor UAV that includes two sets of binocular sensors and two sets of fisheye binocular sensors, in which the two sets of binocular sensors are respectively arranged diagonally on the fuselage of the UAV, the two sets of fisheye binocular sensors are respectively arranged diagonally on the fuselage of the UAV, and the sum of the fields of view of the two sets of binocular sensors and the two sets of fisheye binocular sensors in any direction is greater than or equal to 360 degrees. Such an arrangement can be flexibly applied to UAV structures with folding arms, and compared with a non-folding omnidirectional-perception quadrotor system, the volume of the UAV can be greatly reduced. Moreover, in the embodiment of the present application, only four sets of binocular sensors are needed to guarantee full coverage of the viewing angle and achieve omnidirectional perception; compared with implementations in the prior art, fewer lenses are used, which saves cost.
Brief description of the drawings
FIG. 1 is a front view of sensors arranged on a UAV according to an embodiment of the present application;
FIG. 2 is a left view of sensors arranged on a UAV according to an embodiment of the present application;
FIG. 3 is a top view of sensors arranged on a UAV according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a blind area in a left view of a UAV provided with fisheye binocular sensors according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a blind area in a top view of a UAV provided with fisheye binocular sensors according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the principle of block correction according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a binocular structure formed by block-corrected images according to an embodiment of the present application;
FIG. 8 is a schematic diagram of pinhole-camera binocular pairs according to an embodiment of the present application.
Detailed description
The present application is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present application and do not limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present application rather than the entire structure.
In addition, in the embodiments of the present application, words such as "optionally" or "exemplarily" are used to represent an example, instance, or illustration. Any embodiment or design described as "optionally" or "exemplarily" in the embodiments of the present application should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of such words is intended to present related concepts in a concrete manner.
An embodiment of the present application provides a quadrotor UAV, which may include two sets of binocular sensors and two sets of fisheye binocular sensors. The two sets of binocular sensors are respectively arranged diagonally on the fuselage of the UAV; likewise, the two sets of fisheye binocular sensors are also arranged diagonally on the fuselage. FIG. 1, FIG. 2, and FIG. 3 are, respectively, a front view, a left view, and a top view of the arrangement of the two sets of binocular sensors and the two sets of fisheye binocular sensors.
In the arrangement shown in FIGS. 1 to 3, the two sets of fisheye binocular sensors are arranged above and below the fuselage of the UAV, and the two sets of wide-angle binocular sensors are arranged at the front and the rear of the fuselage, respectively.
Of course, the arrangement shown in the above figures is only illustrative, and the embodiments of the present application impose no strict limitation on it; the four sets of sensors can also be arranged at other positions, such as the front and rear or the left and right sides of the UAV, as long as each two sets of sensors are arranged diagonally on the fuselage.
It should be noted that the binocular sensors here can include any type of lens, including of course fisheye lenses; that is, four sets of fisheye binocular sensors may be arranged on the UAV body.
Further, the two sets of binocular sensors and the two sets of fisheye binocular sensors must be arranged so that the sum of their fields of view in any direction is greater than or equal to 360 degrees.
As shown in FIG. 2, the condition c+d+e+f≥360° needs to be satisfied, and as shown in FIG. 3, the condition g+h+i+j≥360° needs to be satisfied.
In the embodiment of the present application, the two sets of fisheye binocular sensors are required to have fields of view greater than 180 degrees; as shown in FIG. 1, it must be guaranteed that a+b>360°.
Because the binocular disparity of a fisheye binocular sensor is too small in a certain region along the baseline direction, a blind area exists. The size of this blind area depends on the imaging quality of the lens and the resolution of the sensor, and generally does not exceed 90 degrees: see the dotted shaded portion in the left view in FIG. 4, i.e., m+n<90°, and the dotted shaded portion in the top view in FIG. 5, i.e., k<90°. Therefore, two sets of binocular lenses are added to the UAV body to compensate for this blind area, ensuring that the sum of the fields of view of the four sets of sensors in any direction is greater than or equal to 360 degrees.
It should be noted that, when four sets of fisheye binocular sensors are arranged on the UAV body, only two of the sets are required to have fields of view greater than 180 degrees; no requirement is imposed on the other two sets. In other words, whichever type of sensor is selected, it suffices that two sets of fisheye binocular sensors have fields of view greater than 180 degrees.
The quadrotor UAV provided by the embodiment of the present application includes two sets of binocular sensors and two sets of fisheye binocular sensors, in which the two sets of binocular sensors are respectively arranged diagonally on the fuselage of the UAV, the two sets of fisheye binocular sensors are respectively arranged diagonally on the fuselage of the UAV, and the sum of the fields of view of the two sets of binocular sensors and the two sets of fisheye binocular sensors in any direction is greater than or equal to 360 degrees. Such an arrangement can be flexibly applied to UAV structures with folding arms, and compared with a non-folding omnidirectional-perception quadrotor system, the volume of the UAV can be greatly reduced. Moreover, in the embodiment of the present application, only four sets of binocular sensors are needed to guarantee full coverage of the viewing angle and achieve omnidirectional perception; compared with implementations in the prior art, fewer lenses are used, which saves cost.
In one example, the UAV may further include a processor;
the processor is configured to process the environment images collected by the two sets of binocular sensors and the two sets of fisheye binocular sensors to construct a three-dimensional map.
Exemplarily, the processor may process the environment images collected by the fisheye binocular sensors using a block correction method to obtain block-corrected images for each fisheye binocular sensor. Block correction replaces the imaging plane with multiple virtual imaging planes inside the sphere, each constituting a pinhole camera model; the size, number, and orientation of the imaging planes can be adjusted according to the actual situation. For example, 2, 3, or 5 virtual imaging planes may be used. FIG. 6 shows an example with three mutually perpendicular virtual planes.
For each block-corrected image, the corresponding projection equation is:
Figure PCTCN2022072784-appb-000001
where u_v is the image coordinate in the block-corrected image, and K_v is the intrinsic matrix of the pinhole camera model corresponding to the virtual plane, which can be determined by the size and position of the virtual imaging plane. P_v^c is the transformation matrix from the virtual imaging plane coordinate system to the physical imaging plane coordinate system, and K_c is the projection matrix from the three-dimensional fisheye space to the imaging plane, i.e., the fisheye projection model; this parameter can be determined by a fisheye calibration model, for example UCM, EUCM, or KB. Based on the above formula (1), the original fisheye image can be converted into block-corrected images.
Further, the processor is also configured to compute, by a binocular matching method, the corresponding block-corrected images of the two sets of fisheye binocular sensors to generate a disparity map. As shown in FIG. 7, after each fisheye image is divided into three regions and block-corrected, each block of the left camera can form a binocular structure with the corresponding block of the right camera; that is, for each group of fisheye binoculars, a_l and a_r, b_l and b_r, and c_l and c_r in the figure form binocular pairs, respectively. Since the images have been rectified and undistorted through the pinhole model, performing binocular matching on these three groups of binocular fisheye pairs using methods such as BM or SGBM generates the disparity map. Moreover, this computation is compatible with most existing binocular computing modules.
Through the above image correction, the two sets of fisheye binoculars can be treated as equivalent to six sets of conventional pinhole-camera binoculars, as shown in FIG. 8. This processing preserves the edge quality of the fisheye image to the greatest extent: it maintains the field of view of the images collected by the fisheye binoculars while allowing conventional binocular matching to generate the disparity map, reducing the amount of computation.
In one example, the processor is also configured to remove occluders from the disparity map by mask covering. The occluder may be, for example, the aircraft's landing gear. As shown in FIG. 1, because the field of view of the fisheye binocular sensors arranged on the UAV body is greater than 180°, the images they collect may contain occluders such as the landing gear, which can be removed by mask covering. The mask is a binary black-and-white image in which black areas are masked out and white areas are kept; the masking procedure follows existing practice and is not repeated here.
Because the binocular algorithm matches neighboring pixels and the fisheye binocular sensors have installation tolerances, the black mask area should be slightly larger than the actual occluded area. Therefore, the union of the left and right masks of a fisheye binocular sensor can be used as the final mask, which is stored as a fixed configuration, loaded when the UAV system starts, and applied to the disparity map output by the binocular matching.
In one example, the processor may also be configured to generate a point cloud from the disparity map with occluders removed, and to construct a three-dimensional map from the point cloud.
For example, the point cloud can be generated as follows:
Figure PCTCN2022072784-appb-000002
Figure PCTCN2022072784-appb-000003
Figure PCTCN2022072784-appb-000004
where baseline is the length of the binocular baseline, disparity is the disparity value, cx, cy, fx, and fy are the binocular calibration parameters, and px and py are coordinates in the disparity map.
By generating an omnidirectional point cloud in this way, a three-dimensional map can be constructed. The UAV can then perform tasks such as obstacle-avoidance braking and path planning based on the constructed three-dimensional map.
From the above description of the embodiments, those skilled in the art can clearly understand that the present application can be implemented by means of software plus the necessary general-purpose hardware, and of course also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solutions of the present application, in essence or in the parts that contribute to the prior art, can be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as a computer floppy disk, read-only memory (ROM), random access memory (RAM), flash memory (FLASH), hard disk, or optical disc, and includes several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to implement the methods or functions described in the various embodiments of the present application.
Note that the above are only preferred embodiments of the present application and the technical principles applied. Those skilled in the art will understand that the present application is not limited to the specific embodiments described here; various obvious changes, readjustments, and substitutions can be made without departing from the protection scope of the present application. Therefore, although the present application has been described in some detail through the above embodiments, it is not limited to them; it may include more equivalent embodiments without departing from the concept of the present application, and its scope is determined by the appended claims.

Claims (8)

  1. A quadrotor unmanned aerial vehicle, characterized by comprising: two sets of binocular sensors and two sets of fisheye binocular sensors;
    the two sets of binocular sensors are respectively arranged diagonally on the fuselage of the unmanned aerial vehicle;
    the two sets of fisheye binocular sensors are respectively arranged diagonally on the fuselage of the unmanned aerial vehicle;
    the sum of the fields of view of the two sets of binocular sensors and the two sets of fisheye binocular sensors in any direction is greater than or equal to 360 degrees.
  2. The unmanned aerial vehicle according to claim 1, characterized in that the field of view of the fisheye binocular sensors is greater than 180 degrees.
  3. The unmanned aerial vehicle according to claim 1 or 2, characterized in that the two sets of binocular sensors and the two sets of fisheye binocular sensors are used to collect images of the environment around the unmanned aerial vehicle.
  4. The unmanned aerial vehicle according to claim 1, characterized in that the unmanned aerial vehicle further comprises a processor;
    the processor is configured to process the environment images collected by the two sets of binocular sensors and the two sets of fisheye binocular sensors to construct a three-dimensional map.
  5. The unmanned aerial vehicle according to claim 4, characterized in that the processor is configured to process the environment images collected by the fisheye binocular sensors using a block correction method to obtain block-corrected images for each fisheye binocular sensor.
  6. The unmanned aerial vehicle according to claim 5, characterized in that the processor is further configured to compute, by a binocular matching method, the corresponding block-corrected images of the two sets of fisheye binocular sensors to generate a disparity map.
  7. The unmanned aerial vehicle according to claim 6, characterized in that the processor is further configured to remove occluders from the disparity map by mask covering.
  8. The unmanned aerial vehicle according to claim 7, characterized in that the processor is further configured to generate a point cloud from the disparity map with occluders removed, and to construct a three-dimensional map from the point cloud.
PCT/CN2022/072784 2021-02-22 2022-01-19 Quadrotor unmanned aerial vehicle WO2022174712A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202120393695.X 2021-02-22
CN202120393695.XU CN215972078U (zh) 2021-02-22 2021-02-22 Quadrotor unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
WO2022174712A1 true WO2022174712A1 (zh) 2022-08-25

Family

ID=80506676

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/072784 WO2022174712A1 (zh) 2021-02-22 2022-01-19 Quadrotor unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN215972078U (zh)
WO (1) WO2022174712A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140231578A1 (en) * 2012-06-19 2014-08-21 Bae Systems Information And Electronic Systems Integration Inc. Stabilized uav platform with fused ir and visible imagery
JP2015204633A (ja) * 2014-04-16 2015-11-16 パロット Rotary-wing drone provided with a video camera delivering stabilized image sequences
CN105314122A (zh) * 2015-12-01 2016-02-10 浙江宇视科技有限公司 Unmanned aerial vehicle for emergency command and road-occupation evidence collection
CN206012982U (zh) * 2016-08-17 2017-03-15 吉林威和航空科技有限公司 Small electric fixed-wing unmanned aerial vehicle for oblique photography
CN110775288A (zh) * 2019-11-26 2020-02-11 哈尔滨工业大学(深圳) Bionics-based flying mechanical neck-eye system and control method
CN210277081U (zh) * 2019-07-01 2020-04-10 湖南海森格诺信息技术有限公司 Floor-sweeping robot
CN112052788A (zh) * 2020-09-03 2020-12-08 深圳市道通智能航空技术有限公司 Environment perception method and apparatus based on binocular vision, and unmanned aerial vehicle


Also Published As

Publication number Publication date
CN215972078U (zh) 2022-03-08

Similar Documents

Publication Publication Date Title
WO2020135446A1 (zh) Target positioning method and apparatus, and unmanned aerial vehicle
KR101666959B1 (ko) Image processing apparatus having an automatic correction function for images acquired from a camera, and method therefor
CN108416812B (zh) Calibration method for a single-camera mirror binocular vision system
CN107705252B (zh) Method and system for stitching, unfolding, and correcting binocular fisheye images
WO2022048541A1 (zh) Environment perception method and apparatus based on binocular vision, and unmanned aerial vehicle
CN108492333B (zh) Spacecraft attitude estimation method based on image information of the satellite-rocket docking ring
CN109115184B (zh) Cooperative measurement method and system based on non-cooperative targets
CN105208247A (zh) Quaternion-based panoramic image stabilization method
WO2019047847A1 (zh) Six-degree-of-freedom three-dimensional reconstruction method and system for virtual reality, and portable terminal
WO2021104308A1 (zh) Panoramic depth measurement method, four-eye fisheye camera, and binocular fisheye camera
CN105825470A (zh) Fisheye image correction method based on point-cloud imagery
CN113192179A (zh) Three-dimensional reconstruction method based on binocular stereo vision
CN109769110B (zh) Method and apparatus for generating a dynamic 3D asteroid map, and portable terminal
CN103295231A (zh) Geometric correction method for vertically mapped images of a fisheye lens in fisheye image stitching
CN104680505A (zh) Panoramic-view algorithm with fisheye lens correction
CN106530239A (zh) Low-altitude moving-target tracking method for a small unmanned rotorcraft based on a large-field-of-view bionic fisheye
WO2020114433A1 (zh) Depth perception method and apparatus, and depth perception device
EP4073756A1 (en) A method for measuring the topography of an environment
CN102780834A (zh) Semi-cylindrical panoramic unfolding method for ring-eye lens images
CN105989354A (zh) Positioning method and system
WO2022174712A1 (zh) Quadrotor unmanned aerial vehicle
CN112802109A (zh) Method for generating an automobile bird's-eye panoramic view
Lin et al. Real-time low-cost omni-directional stereo vision via bi-polar spherical cameras
TWM594322U (zh) Camera configuration system for omnidirectional stereo vision
CN113706391B (zh) Real-time stitching method, system, device, and storage medium for UAV aerial images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22755485

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22755485

Country of ref document: EP

Kind code of ref document: A1