WO2022218081A1 - Binocular camera and robot - Google Patents

Binocular camera and robot

Info

Publication number
WO2022218081A1
Authority
WO
WIPO (PCT)
Prior art keywords
dot matrix
binocular camera
receiving module
matrix projection
module
Prior art date
Application number
PCT/CN2022/080691
Other languages
English (en)
French (fr)
Inventor
陈展耀
周宗华
罗德国
戴书麟
刘风雷
Original Assignee
东莞埃科思科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 东莞埃科思科技有限公司
Publication of WO2022218081A1

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/296 Synchronisation thereof; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof

Definitions

  • the present application relates to the technical field of machine vision, and in particular, to a binocular camera and a robot.
  • Binocular stereo vision is an important form of machine vision: based on the parallax principle, it uses imaging equipment to acquire two images of a measured object from different positions and obtains the three-dimensional geometric information of the object by computing the positional deviation between corresponding points of the two images.
  • in the prior art, binocular stereo vision mostly adopts an active binocular structured-light scheme for three-dimensional spatial reconstruction.
  • in existing active binocular structured-light cameras, the field of view is constrained by the projection field of the dot matrix projector, so the viewing angle is small; a small viewing angle inevitably means a small operable space, making large-range detection of three-dimensional obstacles impossible.
  • this impairs functions such as robot obstacle avoidance, simultaneous localization and mapping (SLAM) and navigation.
  • the purpose of the present application is to provide a binocular camera and a robot, which can increase the field of view, thereby improving the ability of three-dimensional reconstruction in space.
  • An embodiment of the present application provides a binocular camera, which includes a base plate and a fixing frame disposed on the base plate. Two dot matrix projection modules are arranged on the fixing frame at intervals; the outgoing light paths of the two dot matrix projection modules form a preset angle, and the perpendicular distances from the two dot matrix projection modules to the base plate are equal. The binocular camera further includes a first light receiving module and a second light receiving module arranged on the base plate; the first light receiving module and the second light receiving module are configured to respectively collect the reflected light information of the two dot matrix projection modules.
  • the fixing frame has an isosceles triangle structure, and the two dot matrix projection modules are respectively located on the two opposite legs of the isosceles triangle.
  • the binocular camera further includes a closed casing and a transparent cover plate arranged on one side of the casing; the base plate, the fixing frame, the dot matrix projection modules, the first light receiving module and the second light receiving module are all located inside the closed casing, and the base plate is arranged parallel to the transparent cover plate.
  • the distance between the two dot matrix projection modules is d = 2h·sin(2β − θ)·cos(θ − β)/cos β, where d is the distance between the two dot matrix projection modules, 2β is the field of view of each dot matrix projection module, 2θ is the field of view of the binocular camera, and h is the desired minimum application distance of the binocular camera.
  • the dot matrix projection module includes at least one dot matrix projector; when a dot matrix projection module includes two or more dot matrix projectors, the two or more dot matrix projectors of that module lie on the same straight line, and the two straight lines respectively formed by the dot matrix projectors of the two dot matrix projection modules are parallel to each other.
  • the dot matrix projector includes a light source, and a collimating lens and a diffractive optical element located on the outgoing light path of the light source.
  • the fixing frame includes positioning seats arranged at intervals; the two dot matrix projection modules are respectively located on the positioning surfaces of the positioning seats, and the two positioning surfaces respectively coincide with the two legs of the isosceles triangle.
  • the first light receiving module and the second light receiving module are respectively located on opposite sides of the two dot matrix projection modules.
  • the fixing frame is made of thermally conductive material.
  • Embodiments of the present application further provide a robot, including the binocular camera described in any one of the above.
  • in the binocular camera provided by the embodiments of the present application, the base plate and the fixing frame arranged on the base plate provide stable support for the dot matrix projection modules, the first light receiving module and the second light receiving module, thereby ensuring the stability of the relative positions among the dot matrix projection modules, the first light receiving module and the second light receiving module.
  • by arranging the outgoing light paths of the two dot matrix projection modules on the fixing frame at a preset angle, the field of view of the binocular camera is increased compared with using a single projector; this enables the first light receiving module and the second light receiving module to receive speckle pattern information over a wider range, expands the depth reconstruction range of the binocular camera, and further improves the spatial three-dimensional reconstruction capability.
  • FIG. 1 is one of the schematic structural diagrams of a binocular camera provided by an embodiment of the present application.
  • FIG. 2 is the second schematic structural diagram of a binocular camera provided by an embodiment of the present application.
  • FIG. 3 is a positional relationship diagram between a dot matrix projection module and a transparent cover plate provided by an embodiment of the present application;
  • FIG. 4 is a schematic structural diagram of the connection between a fixing frame and a dot matrix projector provided by an embodiment of the present application;
  • FIG. 5 is a schematic structural diagram of the connection between the positioning base and the dot matrix projection module provided by the embodiment of the present application.
  • Reference numerals: 100 - binocular camera; 110 - substrate (base plate); 120 - fixing frame; 122 - positioning seat; 1222 - positioning surface; 130 - dot matrix projection module; 132 - dot matrix projector; 140 - first light receiving module; 150 - second light receiving module; 160 - closed casing; 170 - transparent cover plate.
  • the terms "disposed" and "connected" should be understood in a broad sense: for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a direct connection or an indirect connection through an intermediate medium, or an internal communication between two elements.
  • the depth field of view of active binocular structured light depends on the field of view of the camera.
  • limited by micro-nano optics technology, and under the premise of guaranteed optical performance, the existing maximum field of view is about 60°×80°, which can only be used in some specific scenarios such as access control and door locks.
  • in robot intelligent-navigation scenarios, the required depth-reconstruction field of view can reach 120°×80°; existing cameras obviously cannot achieve such a large field of view, which limits the three-dimensional reconstruction capability in practical applications.
  • the embodiments of the present application propose the following solutions to increase the field of view, thereby improving the spatial 3D reconstruction capability.
  • an embodiment of the present application provides a binocular camera 100 , which includes a substrate 110 and a fixing frame 120 disposed on the substrate 110 .
  • the fixed frame 120 is provided with dot matrix projection modules 130 at intervals, for example, two dot matrix projection modules 130 .
  • the outgoing light paths of the two dot matrix projection modules 130 are at a predetermined angle, and the vertical distances between the two dot matrix projection modules 130 and the substrate 110 are equal.
  • the binocular camera 100 further includes a first light receiving module 140 and a second light receiving module 150 disposed on the substrate 110 .
  • the first light receiving module 140 and the second light receiving module 150 are configured to collect the light reflection information of the two dot matrix projection modules 130 respectively.
  • the parameters of the two dot matrix projection modules 130 arranged at intervals on the fixing frame 120 may be the same, and the vertical distances between the two dot matrix projection modules 130 and the substrate 110 may be the same.
  • the two dot matrix projection modules 130 are located at the same installation height, so that in use the speckle patterns projected by the two dot matrix projection modules 130 are of equal size at the same distance, which helps ensure the consistency of the projected patterns and reduces the computational difficulty.
  • dot matrix projection modules 130 with different parameter information can also be set as required to meet diverse demands.
  • the binocular camera 100 of the present application is implemented based on the optical triangulation measurement principle of active binocular structured light three-dimensional vision.
  • the two dot matrix projection modules 130 project structured light of a certain pattern onto the object surface, forming on the object surface a three-dimensional light-stripe image modulated by the shape of the measured object's surface.
  • the three-dimensional image is collected by the first light receiving module 140 and the second light receiving module 150 to obtain a two-dimensional distortion image of the light stripe.
  • the degree of distortion of the light stripes depends on the relative positions between the dot matrix projection modules 130 and the first light receiving module 140 and the second light receiving module 150, respectively, and on the surface profile (height) of the object.
  • when these relative positions are fixed, the three-dimensional contour of the object surface can be reproduced from the coordinates of the distorted two-dimensional light-stripe image, thereby achieving three-dimensional reconstruction of the space.
  • the first light-receiving module 140 and the second light-receiving module 150 may use a receiving camera, and the photosensitive chip is a complementary metal-oxide-semiconductor (Complementary Metal-Oxide-Semiconductor, CMOS) or a charge-coupled device (Charge Coupled Device, CCD) to collect the speckle pattern of the space to be measured.
  • in the binocular camera 100 provided by the embodiments of the present application, the base plate 110 and the fixing frame 120 disposed on the base plate 110 provide stable support for the dot matrix projection modules 130, the first light receiving module 140 and the second light receiving module 150, thereby ensuring the stability of the relative positions among the dot matrix projection modules 130, the first light receiving module 140 and the second light receiving module 150.
  • when the field of view of the binocular camera 100 is increased, the first light receiving module 140 and the second light receiving module 150 can receive speckle pattern information over a wider range, which expands the depth reconstruction range of the binocular camera 100 and thereby improves the spatial three-dimensional reconstruction capability.
  • the fixing frame 120 has an isosceles triangle structure, and the two dot matrix projection modules 130 are respectively located on the two opposite legs of the isosceles triangle.
  • the fixing frame 120 has an isosceles triangle structure, that is, the fixing frame 120 adopts an isosceles triangle bracket, so that the structure of the fixing frame 120 is more stable and reliable.
  • by locating the two dot matrix projection modules 130 on the two legs of the isosceles triangle, each dot matrix projection module 130 can be stably connected to the fixing frame 120, so that when installed and fixed, the outgoing light path of each module 130 is perpendicular to the corresponding leg of the isosceles triangle.
  • in this way, the base angle of the isosceles triangle determines the size of the preset angle between the outgoing light paths of the two dot matrix projection modules 130; when assembling binocular cameras 100 of different models, the required outgoing-light-path angle can be obtained simply by replacing the fixing frame 120.
  • the binocular camera 100 provided in this embodiment of the present application may further include a closed casing 160 and a transparent cover plate 170 disposed on one side of the casing.
  • the substrate 110 , the fixing frame 120 , the dot matrix projection module 130 , the first light receiving module 140 and the second light receiving module 150 are all located in the closed casing 160 , and the substrate 110 and the transparent cover 170 are arranged in parallel.
  • this arrangement allows the transparent cover 170 and the closed casing 160 to protect components such as the dot matrix projection modules 130, the first light receiving module 140 and the second light receiving module 150, thereby ensuring the stability of the binocular camera 100 during use, for example its sealing, preventing the entry of dust or water vapor and avoiding interference from the external environment. It should be noted that the embodiments of the present application do not specifically limit the form of the closed casing 160.
  • the closed casing 160 may be cylindrical, frustum-shaped or of another shape, as long as it does not affect the field of view of the binocular camera 100 and does not block the line of sight of the first light receiving module 140 and the second light receiving module 150.
  • the distance between the two dot matrix projection modules 130 can be determined by the following formula (a worked numeric example is given at the end of this list):
  • d = 2h·sin(2β − θ)·cos(θ − β) / cos β
  • where d is the distance between the two dot matrix projection modules 130, namely BE; 2β is the field of view of each dot matrix projection module 130; 2θ is the field of view of the binocular camera 100; and h is the desired minimum application distance of the binocular camera 100.
  • FIG. 3 is a simplified geometric model of the dot matrix projection module 130 and the fixing frame 120 in FIG. 2 .
  • the two dot matrix projection modules 130 are points B and E in FIG. 3 respectively.
  • the plane containing the straight line GN is the plane of the transparent cover plate 170, and the fixing frame 120 is the isosceles triangle ΔJCI.
  • ⁇ GBF and ⁇ NEF are the field of view angles of the two dot matrix projectors 132 respectively, which are set as 2 ⁇ ;
  • the straight line BP and the straight line EM are the angle bisectors of ∠GBF and ∠NEF, respectively; when the outgoing light paths of the dot matrix projection modules 130 are perpendicular to the legs of the isosceles triangle, BP and EM are perpendicular to CJ and IJ, respectively.
  • extending the straight lines GB and NE, they intersect at point A; from the geometric relationships, the points F, J and A lie on the same straight line, and ∠GAN is the final stitched field of view, set to 2θ. Point D is the intersection of the straight line FA with the straight line BE.
  • the edge rays BF and EF of the two dot matrix projectors 132 intersect at point F, which means that the minimum application distance of the product is FJ, set to h; otherwise there is a region without speckles (the region enclosed by FBJE in the figure), making depth reconstruction impossible.
  • given the field of view 2β of a single dot matrix projector 132, the final stitched field of view 2θ and the minimum application distance h, the isosceles triangle angle α (∠JCI in FIG. 3) and the distance BE between the two dot matrix projection modules 130 can be obtained.
  • the isosceles-triangle fixing frame 120 can then be designed according to the above formula, and the relative distance between the left and right dot matrix projection modules 130 can be determined.
  • the expected minimum application distance h can be determined according to the installation and application environment of the product.
  • each dot matrix projection module 130 may include at least one dot matrix projector 132 .
  • when a dot matrix projection module 130 includes two or more dot matrix projectors 132, the two or more dot matrix projectors 132 of that module lie on the same straight line, and optionally the two straight lines respectively formed by the dot matrix projectors 132 of the two dot matrix projection modules 130 are parallel to each other.
  • for example, the straight line formed by the two dot matrix projectors in the left dot matrix projection module and the straight line formed by the two dot matrix projectors in the right dot matrix projection module are parallel to each other.
  • each dot matrix projection module 130 may include only one dot matrix projector 132, or, depending on the complexity of the scene, each dot matrix projection module 130 may have two or more dot matrix projectors 132.
  • using more dot matrix projectors 132 helps increase the density of the speckle pattern per unit area, thereby improving the three-dimensional reconstruction capability. It can be understood that the region projected by a dot matrix projector 132 is generally rectangular.
  • when the two or more dot matrix projectors 132 in each dot matrix projection module 130 lie on the same straight line and the two straight lines formed by the dot matrix projectors 132 of the two dot matrix projection modules 130 are parallel to each other, the projection regions of the two dot matrix projection modules 130 are better distributed, which improves the utilization of the projected beams and avoids regions without beam projection at the light exit of the binocular camera 100.
  • the dot matrix projector 132 includes a light source, as well as a collimating lens and a diffractive optical element located on the outgoing light path of the light source.
  • the light source can be any one of a light emitting diode (light emitting diode, LED), a semiconductor laser (Laser diode, LD), and a vertical cavity surface emitting laser (Vertical Cavity Surface Emitting Laser, VCSEL).
  • the light beam emitted by the light source is collimated by the collimating lens, so that the beam is emitted in parallel, and then undergoes the shaping and diffraction effect of the diffractive optical element to form a specific speckle pattern.
  • the fixing frame 120 may further include positioning seats 122 arranged at intervals.
  • the two dot matrix projection modules 130 are respectively located on the positioning surfaces 1222 of the positioning seats 122, and the two positioning surfaces 1222 respectively coincide with the two legs of the isosceles triangle; this can be regarded as cutting away the less necessary parts of the isosceles triangle and keeping only its two corners, so as to reduce cost.
  • the positioning seat 122 may also take the form of a right trapezoid, so that the mounting position of the dot matrix projection module 130 can be raised according to actual needs.
  • the first light receiving module 140 and the second light receiving module 150 may be located on opposite sides of the two dot matrix projection modules 130 respectively. In this way, it is convenient for the first light receiving module 140 and the second light receiving module 150 to collect information respectively, and to measure the depth information by integrating information of different dimensions.
  • the fixing frame 120 is made of thermally conductive material.
  • the fixing frame 120 can be made of copper or aluminum, or of a thermally conductive silicone or ceramic, and can be chosen flexibly according to the actual use environment.
  • the embodiment of the present application also discloses a robot, including the binocular camera 100 in the foregoing embodiment.
  • the robot includes the same structure and beneficial effects as the binocular camera 100 in the previous embodiment.
  • the structure and beneficial effects of the binocular camera 100 have been described in detail in the foregoing embodiments, and will not be repeated here.
  • the present disclosure provides a binocular camera and a robot, which can increase the field of view, facilitate receiving a wider range of pattern information, expand the depth reconstruction range of the binocular camera, and further improve the spatial three-dimensional reconstruction capability.
  • the binocular camera and the robot enable the detection of large-scale three-dimensional obstacles, which is beneficial to realize functions such as obstacle avoidance, real-time positioning, map construction or navigation of the robot.
  • the binocular camera and the robot of the present application are reproducible and can be applied in various industrial applications.
  • the binocular camera and robot of the present application can be used in the field of machine vision technology.
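As a worked numeric reading of the spacing formula and base-angle relation referenced in the list above, take the assumed example values 2β = 80° for a single dot matrix projection module and 2θ = 120° for the stitched binocular camera (these numbers are illustrative assumptions, not design values stated in the application):

```latex
% Assumed example values (not from the application): 2\beta = 80^\circ, 2\theta = 120^\circ
\alpha = \theta - \beta = 60^\circ - 40^\circ = 20^\circ,
\qquad
d = \frac{2h\,\sin(2\beta - \theta)\,\cos(\theta - \beta)}{\cos\beta}
  = \frac{2h\,\sin 20^\circ\,\cos 20^\circ}{\cos 40^\circ}
  \approx 0.84\,h .
```

With a minimum application distance of, say, h = 100 mm, the two dot matrix projection modules would then sit roughly 84 mm apart on a fixing frame whose base angle is about 20°.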

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present application discloses a binocular camera and a robot, relating to the technical field of machine vision. The binocular camera includes a base plate and a fixing frame arranged on the base plate. Two dot matrix projection modules are arranged on the fixing frame at intervals; the outgoing light paths of the two dot matrix projection modules form a preset angle, and the perpendicular distances from the two dot matrix projection modules to the base plate are equal. The binocular camera further includes a first light receiving module and a second light receiving module arranged on the base plate, the first light receiving module and the second light receiving module being configured to respectively collect the reflected light information of the two dot matrix projection modules. The field of view can thereby be increased, and the spatial three-dimensional reconstruction capability improved.

Description

Binocular camera and robot
Cross-Reference to Related Applications
This application claims priority to the Chinese patent application No. 2021104014817, entitled "Binocular camera and robot" and filed with the Chinese Patent Office on April 14, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of machine vision, and in particular to a binocular camera and a robot.
Background
Binocular stereo vision is an important form of machine vision. Based on the parallax principle, it uses imaging equipment to acquire two images of a measured object from different positions and obtains the three-dimensional geometric information of the object by computing the positional deviation between corresponding points of the images.
In the prior art, binocular stereo vision mostly adopts an active binocular structured-light scheme for three-dimensional spatial reconstruction. In existing active binocular structured-light cameras, the field of view is constrained by the projection field of the dot matrix projector, so the viewing angle is small; a small viewing angle inevitably means a small operable space, making large-range detection of three-dimensional obstacles impossible and impairing functions such as robot obstacle avoidance, simultaneous localization and mapping (SLAM) and navigation.
Summary
The purpose of the present application is to provide a binocular camera and a robot that can increase the field of view and thereby improve the spatial three-dimensional reconstruction capability.
The embodiments of the present application are implemented as follows.
An embodiment of the present application provides a binocular camera, including a base plate and a fixing frame arranged on the base plate. Two dot matrix projection modules are arranged on the fixing frame at intervals; the outgoing light paths of the two dot matrix projection modules form a preset angle, and the perpendicular distances from the two dot matrix projection modules to the base plate are equal. The binocular camera further includes a first light receiving module and a second light receiving module arranged on the base plate, the first light receiving module and the second light receiving module being configured to respectively collect the reflected light information of the two dot matrix projection modules.
Optionally, the fixing frame has an isosceles triangle structure, and the two dot matrix projection modules are respectively located on the two opposite legs of the isosceles triangle.
Optionally, the binocular camera further includes a closed casing and a transparent cover plate arranged on one side of the casing; the base plate, the fixing frame, the dot matrix projection modules, the first light receiving module and the second light receiving module are all located inside the closed casing, and the base plate is arranged parallel to the transparent cover plate.
Optionally, the distance between the two dot matrix projection modules is
d = 2h·sin(2β − θ)·cos(θ − β) / cos β,
where d is the distance between the two dot matrix projection modules, 2β is the field of view of each dot matrix projection module, 2θ is the field of view of the binocular camera, and h is the desired minimum application distance of the binocular camera.
Optionally, each dot matrix projection module includes at least one dot matrix projector; when a dot matrix projection module includes two or more dot matrix projectors, the two or more dot matrix projectors of that module lie on the same straight line, and the two straight lines respectively formed by the dot matrix projectors of the two dot matrix projection modules are parallel to each other.
Optionally, the dot matrix projector includes a light source, and a collimating lens and a diffractive optical element located on the outgoing light path of the light source.
Optionally, the fixing frame includes positioning seats arranged at intervals; the two dot matrix projection modules are respectively located on the positioning surfaces of the positioning seats, and the two positioning surfaces respectively coincide with the two legs of the isosceles triangle.
Optionally, the first light receiving module and the second light receiving module are respectively located on opposite sides of the two dot matrix projection modules.
Optionally, the fixing frame is made of a thermally conductive material.
An embodiment of the present application further provides a robot, including the binocular camera described in any one of the above.
The beneficial effects of the embodiments of the present application include, for example, the following.
In the binocular camera provided by the embodiments of the present application, the base plate and the fixing frame arranged on the base plate provide stable support for the dot matrix projection modules, the first light receiving module and the second light receiving module, ensuring the stability of the relative positions among the dot matrix projection modules, the first light receiving module and the second light receiving module. By arranging the outgoing light paths of the two dot matrix projection modules on the fixing frame at a preset angle, the field of view of the binocular camera is increased compared with using a single projector. With a larger field of view, the first light receiving module and the second light receiving module can receive speckle pattern information over a wider range, which expands the depth reconstruction range of the binocular camera and thus improves the spatial three-dimensional reconstruction capability.
Brief Description of the Drawings
To describe the technical solutions of the embodiments of the present application more clearly, the drawings required in the embodiments are briefly introduced below. It should be understood that the following drawings show only some embodiments of the present application and therefore should not be regarded as limiting the scope; a person of ordinary skill in the art may derive other related drawings from these drawings without creative effort.
FIG. 1 is a first schematic structural diagram of a binocular camera provided by an embodiment of the present application;
FIG. 2 is a second schematic structural diagram of a binocular camera provided by an embodiment of the present application;
FIG. 3 is a diagram of the positional relationship between a dot matrix projection module and a transparent cover plate provided by an embodiment of the present application;
FIG. 4 is a schematic structural diagram of the connection between a fixing frame and dot matrix projectors provided by an embodiment of the present application;
FIG. 5 is a schematic structural diagram of the connection between a positioning seat and a dot matrix projection module provided by an embodiment of the present application.
Reference numerals: 100 - binocular camera; 110 - base plate; 120 - fixing frame; 122 - positioning seat; 1222 - positioning surface; 130 - dot matrix projection module; 132 - dot matrix projector; 140 - first light receiving module; 150 - second light receiving module; 160 - closed casing; 170 - transparent cover plate.
Detailed Description of the Embodiments
To make the objectives, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings of the embodiments. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. The components of the embodiments of the present application, as generally described and shown in the drawings herein, may be arranged and designed in a variety of different configurations.
Therefore, the following detailed description of the embodiments of the present application provided in the drawings is not intended to limit the scope of the claimed application, but merely represents selected embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it does not need to be further defined or explained in subsequent drawings. In addition, the terms "first", "second" and the like are only used to distinguish the description and shall not be understood as indicating or implying relative importance.
In the description of the present application, it should also be noted that, unless otherwise expressly specified and limited, the terms "disposed" and "connected" should be understood in a broad sense; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a direct connection or an indirect connection through an intermediate medium, or an internal communication between two elements. For a person of ordinary skill in the art, the specific meanings of the above terms in the present application can be understood according to the specific circumstances.
With the improvement of living standards, indoor robots based on intelligent navigation schemes have gradually entered people's lives, and the 3D perception system, which enables functions such as SLAM and obstacle avoidance, is their core part. At present, binocular stereo vision mostly adopts an active binocular structured-light scheme for three-dimensional spatial reconstruction. In actual use, however, the results are unsatisfactory, mainly because of poor obstacle-avoidance capability: the main sensors are mostly located at the top of the robot, the viewing angle is small, and a small visual range inevitably means a small operable space, making large-range detection of three-dimensional obstacles impossible.
The depth field of view of active binocular structured light depends on the field of view of the camera. Limited by micro-nano optics technology, and under the premise of guaranteed optical performance, the existing maximum field of view is about 60°×80°, which is only suitable for some specific scenarios such as access control and door locks. In robot intelligent-navigation scenarios, however, the required depth-reconstruction field of view can reach 120°×80°; existing cameras obviously cannot reach such a large field of view, which limits the three-dimensional reconstruction capability in practical applications. On this basis, the embodiments of the present application propose the following solutions to increase the field of view and thereby improve the spatial three-dimensional reconstruction capability.
Referring to FIG. 1, an embodiment of the present application provides a binocular camera 100, including a base plate 110 and a fixing frame 120 arranged on the base plate 110. Dot matrix projection modules 130, for example two dot matrix projection modules 130, are arranged on the fixing frame 120 at intervals. The outgoing light paths of the two dot matrix projection modules 130 form a preset angle, and the perpendicular distances from the two dot matrix projection modules 130 to the base plate 110 are equal. The binocular camera 100 further includes a first light receiving module 140 and a second light receiving module 150 arranged on the base plate 110. The first light receiving module 140 and the second light receiving module 150 are configured to respectively collect the reflected light information of the two dot matrix projection modules 130.
In the embodiments of the present application, the parameters of the two dot matrix projection modules 130 arranged at intervals on the fixing frame 120 may be the same, and the perpendicular distances from the two dot matrix projection modules 130 to the base plate 110 may be equal. In this way, the two dot matrix projection modules 130 are located at the same installation height, so that in use the speckle patterns projected by the two modules are of equal size at the same distance, which helps ensure the consistency of the patterns projected by the two dot matrix projection modules 130 and reduces the computational difficulty. In practical applications, dot matrix projection modules 130 with different parameters can also be used as required, to meet diverse needs.
The binocular camera 100 of the present application is implemented on the basis of the optical triangulation measurement principle of active binocular structured-light three-dimensional vision. In use, the two dot matrix projection modules 130 project structured light of a certain pattern onto the object surface, forming on the object surface a three-dimensional light-stripe image modulated by the shape of the measured object's surface. The three-dimensional image is collected by the first light receiving module 140 and the second light receiving module 150 to obtain a two-dimensional distorted light-stripe image. The degree of distortion of the light stripes depends on the relative positions between the dot matrix projection modules 130 and the first light receiving module 140 and the second light receiving module 150, respectively, and on the surface profile (height) of the object. When the relative positions among the dot matrix projection modules 130, the first light receiving module 140 and the second light receiving module 150 are fixed, the three-dimensional contour of the object surface can be reproduced from the coordinates of the distorted two-dimensional light-stripe image, thereby achieving three-dimensional reconstruction of the space.
It should be noted that when active binocular structured light is used for spatial three-dimensional reconstruction, the reconstruction is constrained by the projection range of the speckle pattern of the binocular camera 100, that is, its field of view: depth information cannot be computed in regions where no speckle pattern is projected. In the embodiments of the present application, setting the outgoing light paths of the two dot matrix projection modules 130 at a preset angle increases the field of view of the binocular camera 100; the disparity offset of corresponding points is then computed from the reflected light information (image information) collected by the first light receiving module 140 and the second light receiving module 150, and finally depth calculation and depth compensation are performed to generate high-resolution, high-precision image depth information. The first light receiving module 140 and the second light receiving module 150 may each use a receiving camera whose photosensitive chip is a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD), so as to collect the speckle pattern of the space to be measured.
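The application describes computing the disparity offset of corresponding points from the two collected images and then performing depth calculation and compensation, but it does not spell out the depth formula itself. The following minimal sketch only illustrates the standard rectified-stereo triangulation relation Z = f·B/d that such a disparity-based pipeline typically relies on; the function name and all numeric values are illustrative assumptions, not parameters taken from the application.

```python
def depth_from_disparity(disparity_px: float, focal_length_px: float, baseline_m: float) -> float:
    """Standard rectified-stereo relation Z = f * B / d for one matched (corresponding) point.

    disparity_px    : horizontal offset of the same speckle feature between the two images, in pixels
    focal_length_px : focal length of the rectified receiving cameras, in pixels
    baseline_m      : distance between the first and second light receiving modules, in metres
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px


# Illustrative numbers only: a speckle feature matched with a 24 px disparity,
# an 800 px focal length and a 50 mm baseline gives a depth of about 1.67 m.
print(depth_from_disparity(24.0, 800.0, 0.05))
```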
In the binocular camera 100 provided by the embodiments of the present application, the base plate 110 and the fixing frame 120 arranged on the base plate 110 provide stable support for the dot matrix projection modules 130, the first light receiving module 140 and the second light receiving module 150, thereby ensuring the stability of the relative positions among the dot matrix projection modules 130, the first light receiving module 140 and the second light receiving module 150. By arranging the outgoing light paths of the two dot matrix projection modules 130 on the fixing frame 120 at a preset angle, the field of view of the binocular camera 100 is increased compared with using a single projector. With a larger field of view, the first light receiving module 140 and the second light receiving module 150 can receive speckle pattern information over a wider range, which expands the depth reconstruction range of the binocular camera 100 and thus improves the spatial three-dimensional reconstruction capability.
As shown in FIG. 1, the fixing frame 120 has an isosceles triangle structure, and the two dot matrix projection modules 130 are respectively located on the two opposite legs of the isosceles triangle.
With the fixing frame 120 having an isosceles triangle structure, that is, using an isosceles-triangle bracket, the structure of the fixing frame 120 is more stable and reliable. In addition, by locating the two dot matrix projection modules 130 respectively on the two opposite legs of the isosceles triangle, each dot matrix projection module 130 can be stably connected to the fixing frame 120, so that when installed and fixed, the outgoing light path of each dot matrix projection module 130 is perpendicular to the corresponding leg of the isosceles triangle. In this way, the base angle of the isosceles triangle determines the size of the preset angle between the outgoing light paths of the two dot matrix projection modules 130. When assembling binocular cameras 100 of different models, the required outgoing-light-path angle can be obtained simply by replacing the fixing frame 120, which helps simplify the assembly structure, reduce the difficulty of operation and improve assembly efficiency.
As shown in FIG. 2, the binocular camera 100 provided by the embodiments of the present application may further include a closed casing 160 and a transparent cover plate 170 arranged on one side of the casing. The base plate 110, the fixing frame 120, the dot matrix projection modules 130, the first light receiving module 140 and the second light receiving module 150 are all located inside the closed casing 160, and the base plate 110 is arranged parallel to the transparent cover plate 170.
In the embodiments of the present application, this arrangement allows the transparent cover plate 170 and the closed casing 160 to protect components such as the dot matrix projection modules 130, the first light receiving module 140 and the second light receiving module 150, thereby ensuring the stability of the binocular camera 100 in use, for example its sealing, preventing the entry of dust or water vapor and avoiding interference from the external environment. It should be noted that the embodiments of the present application do not specifically limit the form of the closed casing 160; for example, the closed casing 160 may be cylindrical, frustum-shaped or of another shape, as long as it does not affect the field of view of the binocular camera 100 and does not block the line of sight of the first light receiving module 140 and the second light receiving module 150.
As shown in FIG. 2 and FIG. 3, in the embodiments of the present application, the distance between the two dot matrix projection modules 130 can be determined by the following formula:
d = 2h·sin(2β − θ)·cos(θ − β) / cos β
where d is the distance between the two dot matrix projection modules 130, namely BE; 2β is the field of view of each dot matrix projection module 130; 2θ is the field of view of the binocular camera 100; and h is the desired minimum application distance of the binocular camera 100.
FIG. 3 is a simplified geometric model of the dot matrix projection modules 130 and the fixing frame 120 in FIG. 2. Assume that the two dot matrix projection modules 130 are points B and E in FIG. 3, respectively. The plane containing the straight line GN is the plane of the transparent cover plate 170, and the fixing frame 120 is the isosceles triangle ΔJCI. ∠GBF and ∠NEF are the fields of view of the two dot matrix projectors 132, both set to 2β; the straight lines BP and EM are the angle bisectors of ∠GBF and ∠NEF, respectively, and when the outgoing light paths of the dot matrix projection modules 130 are perpendicular to the legs of the isosceles triangle, BP and EM are perpendicular to CJ and IJ. Extending the straight lines GB and NE, they intersect at point A; from the geometric relationships, the points F, J and A lie on the same straight line, and ∠GAN is the final stitched field of view, set to 2θ. Point D is the intersection of the straight line FA with the straight line BE. The edge rays BF and EF of the two dot matrix projectors 132 intersect at point F, which means that the minimum application distance of the product is FJ, set to h; otherwise there is a region without speckles (the region enclosed by FBJE in the figure), making depth reconstruction impossible.
Given the field of view 2β of a single dot matrix projector 132, the final stitched field of view 2θ and the minimum application distance h, the isosceles triangle angle α (∠JCI in FIG. 3) and the distance BE between the two dot matrix projection modules 130 can be obtained.
From the geometric relations of the triangles, we have:
∠PBJ=∠PBF+∠FBJ     (1)
∠BJF=∠BDJ+∠DBJ      (2)
∠ABE+∠EBJ+∠PBJ+∠GBP=180°    (3)
∠JCI=∠JBE      (4)
∠FBJ+∠BJF+∠BFJ=180°      (5)
It is known that ∠ABE + θ = 90°,
and also that ∠PBJ = 90°, ∠BDJ = 90°, ∠PBF = ∠GBP = β, and ∠JCI = ∠JBD = ∠JBE = α; substituting these into the above equations gives:
∠BFJ = 2β − θ, ∠FBJ = 90° − β, ∠JCI = ∠JBE = α = θ − β
In ΔBFJ, by the law of sines:
FJ / sin∠FBJ = BJ / sin∠BFJ      (6)
From formula (6):
BJ = FJ·sin∠BFJ / sin∠FBJ = h·sin(2β − θ) / cos β      (7)
In ΔBDJ, there is the identity:
BD = BJ·cos∠JBD, i.e. d/2 = BJ·cos(θ − β)      (8)
From formulas (7) and (8), the distance d between the two dot matrix projection modules 130 is:
d = 2h·sin(2β − θ)·cos(θ − β) / cos β
From the above formulas, once the field of view 2β of a single dot matrix projection module 130, the desired stitched field of view 2θ and the desired minimum application distance h are determined, the isosceles-triangle fixing frame 120 can be designed according to the above formula and the relative distance between the left and right dot matrix projection modules 130 can be determined. The desired minimum application distance h can be determined according to the installation and application environment of the product.
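The derivation above fixes the geometry of the fixing frame 120 once 2β, 2θ and h are chosen. As a sketch, the short Python helper below simply evaluates the two reconstructed relations, the base angle α = θ − β and the spacing d = 2h·sin(2β − θ)·cos(θ − β)/cos β; the function name and the example numbers (an 80° module stitched into a 120° camera at a 100 mm minimum application distance) are assumptions for illustration only, not values stated in the application.

```python
import math


def projector_layout(fov_module_deg: float, fov_camera_deg: float, min_distance: float):
    """Evaluate the reconstructed layout relations for the two dot matrix projection modules.

    fov_module_deg : full field of view 2*beta of a single dot matrix projection module, in degrees
    fov_camera_deg : desired stitched field of view 2*theta of the binocular camera, in degrees
    min_distance   : desired minimum application distance h (the returned spacing uses the same unit)
    """
    beta = math.radians(fov_module_deg) / 2.0
    theta = math.radians(fov_camera_deg) / 2.0
    # Stitching only makes sense when theta > beta (the modules are really tilted apart)
    # and 2*beta > theta (the inner edge rays BF and EF still converge at a finite point F).
    if not (beta < theta < 2.0 * beta):
        raise ValueError("stitching requires beta < theta < 2*beta")
    base_angle = theta - beta                                  # alpha, base angle of the isosceles frame
    spacing = (2.0 * min_distance * math.sin(2.0 * beta - theta)
               * math.cos(base_angle) / math.cos(beta))        # d = BE
    return math.degrees(base_angle), spacing


# Illustrative numbers: 80 deg modules stitched into a 120 deg camera with h = 100 mm
# -> a base angle of about 20 deg and a module spacing of about 84 mm.
print(projector_layout(80.0, 120.0, 100.0))
```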
As shown in FIG. 4, in the embodiments of the present application, each dot matrix projection module 130 may include at least one dot matrix projector 132. Optionally, when a dot matrix projection module 130 includes two or more dot matrix projectors 132, the two or more dot matrix projectors 132 of that module lie on the same straight line, and optionally the two straight lines respectively formed by the dot matrix projectors 132 of the two dot matrix projection modules 130 are parallel to each other; for example, as shown in FIG. 4, the straight line formed by the two dot matrix projectors in the left dot matrix projection module and the straight line formed by the two dot matrix projectors in the right dot matrix projection module are parallel to each other.
In the embodiments of the present application, each dot matrix projection module 130 may include only one dot matrix projector 132, or, depending on the complexity of the scene, each dot matrix projection module 130 may have two or more dot matrix projectors 132; this helps increase the density of the speckle pattern per unit area and thus improves the three-dimensional reconstruction capability. It can be understood that the region projected by a dot matrix projector 132 is generally rectangular; when the two or more dot matrix projectors 132 in each dot matrix projection module 130 lie on the same straight line and the two straight lines respectively formed by the dot matrix projectors 132 of the two dot matrix projection modules 130 are parallel to each other, the projection regions of the two dot matrix projection modules 130 can be better distributed, which improves the utilization of the projected beams and avoids regions without beam projection at the light exit of the binocular camera 100.
In an optional embodiment of the present application, the dot matrix projector 132 includes a light source, and a collimating lens and a diffractive optical element located on the outgoing light path of the light source.
Optionally, the light source may be any one of a light emitting diode (LED), a semiconductor laser diode (LD) and a vertical-cavity surface-emitting laser (VCSEL). The light beam emitted by the light source is collimated by the collimating lens so that it exits in parallel, and then undergoes the shaping and diffraction of the diffractive optical element to form a specific speckle pattern.
As shown in FIG. 5, in the embodiments of the present application, the fixing frame 120 may further include positioning seats 122 arranged at intervals. The two dot matrix projection modules 130 are respectively located on the positioning surfaces 1222 of the positioning seats 122, and the two positioning surfaces 1222 respectively coincide with the two legs of the isosceles triangle.
This form can be regarded as cutting away the less necessary parts of the isosceles triangle and keeping only its two corners, so as to reduce cost. It can be understood that the positioning seat 122 may also take the form of a right trapezoid, so that the mounting position of the dot matrix projection module 130 can be raised according to actual needs.
Returning to FIG. 1, in the embodiments of the present application, the first light receiving module 140 and the second light receiving module 150 may be respectively located on opposite sides of the two dot matrix projection modules 130. In this way, the first light receiving module 140 and the second light receiving module 150 can collect information separately, and the depth information is measured by integrating information of different dimensions.
Optionally, the fixing frame 120 is made of a thermally conductive material. For example, the fixing frame 120 may be made of copper or aluminum, or of a thermally conductive silicone or ceramic, which can be chosen flexibly according to the actual use environment.
An embodiment of the present application further discloses a robot, including the binocular camera 100 of the foregoing embodiments. The robot has the same structure and beneficial effects as the binocular camera 100 in the foregoing embodiments; since the structure and beneficial effects of the binocular camera 100 have been described in detail above, they are not repeated here.
The above are only preferred embodiments of the present application and are not intended to limit the present application. For those skilled in the art, the present application may have various modifications and variations. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present application shall be included within the protection scope of the present application.
Industrial Applicability
In summary, the present disclosure provides a binocular camera and a robot, which can increase the field of view, facilitate receiving pattern information over a wider range, expand the depth reconstruction range of the binocular camera, and thereby improve the spatial three-dimensional reconstruction capability. The binocular camera and the robot enable large-range detection of three-dimensional obstacles, which facilitates functions such as robot obstacle avoidance, simultaneous localization and mapping, and navigation.
In addition, it can be understood that the binocular camera and the robot of the present application are reproducible and can be used in a variety of industrial applications. For example, the binocular camera and the robot of the present application can be used in the field of machine vision technology.

Claims (15)

  1. A binocular camera, characterized by comprising a base plate and a fixing frame arranged on the base plate, wherein two dot matrix projection modules are arranged on the fixing frame at intervals, the outgoing light paths of the two dot matrix projection modules form a preset angle, and the perpendicular distances from the two dot matrix projection modules to the base plate are equal; the binocular camera further comprises a first light receiving module and a second light receiving module arranged on the base plate, the first light receiving module and the second light receiving module being configured to respectively collect the reflected light information of the two dot matrix projection modules.
  2. The binocular camera according to claim 1, characterized in that the fixing frame has an isosceles triangle structure, and the two dot matrix projection modules are respectively located on the two opposite legs of the isosceles triangle.
  3. The binocular camera according to any one of claims 1-2, characterized in that the fixing frame is configured to be replaceable, so as to achieve different outgoing-light-path angles of the dot matrix projection modules.
  4. The binocular camera according to any one of claims 1-3, characterized in that the binocular camera further comprises a closed casing and a transparent cover plate arranged on one side of the casing, wherein the base plate, the fixing frame, the dot matrix projection modules, the first light receiving module and the second light receiving module are all located inside the closed casing, and the base plate is arranged parallel to the transparent cover plate.
  5. The binocular camera according to claim 4, characterized in that the closed casing is cylindrical, frustum-shaped, or of any other shape that does not affect the field of view of the binocular camera and does not block the line of sight of the first light receiving module and the second light receiving module.
  6. The binocular camera according to any one of claims 1-5, characterized in that the distance between the two dot matrix projection modules is
    d = 2h·sin(2β − θ)·cos(θ − β) / cos β,
    where d is the distance between the two dot matrix projection modules, 2β is the field of view of each dot matrix projection module, 2θ is the field of view of the binocular camera, and h is the desired minimum application distance of the binocular camera.
  7. The binocular camera according to any one of claims 1-6, characterized in that the dot matrix projection module comprises at least one dot matrix projector.
  8. The binocular camera according to claim 7, characterized in that the dot matrix projection module comprises two or more dot matrix projectors, the dot matrix projectors in each dot matrix projection module lie on the same straight line, and the two straight lines respectively formed by the dot matrix projectors of the two dot matrix projection modules are parallel to each other.
  9. The binocular camera according to any one of claims 1-8, characterized in that the region projected by the dot matrix projector is rectangular.
  10. The binocular camera according to any one of claims 1-9, characterized in that the dot matrix projector comprises a light source, and a collimating lens and a diffractive optical element located on the outgoing light path of the light source.
  11. The binocular camera according to any one of claims 1-10, characterized in that the fixing frame comprises two positioning seats arranged at intervals, the two dot matrix projection modules are respectively located on the positioning surfaces of the two positioning seats, and the two positioning surfaces respectively coincide with the two legs of an isosceles triangle.
  12. The binocular camera according to claim 11, characterized in that the positioning seat takes the form of a triangle or a right trapezoid.
  13. The binocular camera according to any one of claims 1-6, characterized in that the two dot matrix projection modules comprise a first dot matrix projection module and a second dot matrix projection module, wherein the first light receiving module is located on the side of the first dot matrix projection module away from the second dot matrix projection module, and the second light receiving module is located on the side of the second dot matrix projection module away from the first dot matrix projection module.
  14. The binocular camera according to any one of claims 1-6, characterized in that the fixing frame is made of a thermally conductive material.
  15. A robot, characterized by comprising the binocular camera according to any one of claims 1-14.
PCT/CN2022/080691 2021-04-14 2022-03-14 Binocular camera and robot WO2022218081A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110401481.7A CN112995486A (zh) 2021-04-14 2021-04-14 Binocular camera and robot
CN202110401481.7 2021-04-14

Publications (1)

Publication Number Publication Date
WO2022218081A1 true WO2022218081A1 (zh) 2022-10-20

Family

ID=76339731

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/080691 WO2022218081A1 (zh) 2021-04-14 2022-03-14 一种双目摄像头及机器人

Country Status (2)

Country Link
CN (1) CN112995486A (zh)
WO (1) WO2022218081A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112995486A (zh) * 2021-04-14 2021-06-18 东莞埃科思科技有限公司 一种双目摄像头及机器人
CN115102036B (zh) * 2022-08-24 2022-11-22 立臻精密智造(昆山)有限公司 点阵激光发射结构、点阵激光系统及深度计算方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004028874A (ja) * 2002-06-27 2004-01-29 Matsushita Electric Ind Co Ltd レンジファインダ装置、物体検出装置および物体検出方法
CN103054522A (zh) * 2012-12-31 2013-04-24 河海大学 基于视觉测量的清洁机器人系统及其测控方法
CN111121722A (zh) * 2019-12-13 2020-05-08 南京理工大学 结合激光点阵和偏振视觉的双目三维成像方法
CN112445004A (zh) * 2019-08-14 2021-03-05 南昌欧菲生物识别技术有限公司 光发射模组和电子设备
CN112489193A (zh) * 2020-11-24 2021-03-12 江苏科技大学 一种基于结构光的三维重建方法
CN112995486A (zh) * 2021-04-14 2021-06-18 东莞埃科思科技有限公司 一种双目摄像头及机器人

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104506842A (zh) * 2015-01-15 2015-04-08 京东方科技集团股份有限公司 三维摄像头模组、终端设备以及测距方法
CN105627933B (zh) * 2016-02-23 2018-12-21 京东方科技集团股份有限公司 测距模组、三维扫描系统以及测距方法
CN215120941U (zh) * 2021-04-14 2021-12-10 东莞埃科思科技有限公司 一种双目摄像头及机器人


Also Published As

Publication number Publication date
CN112995486A (zh) 2021-06-18


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22787311

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22787311

Country of ref document: EP

Kind code of ref document: A1