CN112995486A - Binocular camera and robot - Google Patents

Binocular camera and robot

Info

Publication number
CN112995486A
Authority
CN
China
Prior art keywords
dot matrix
binocular camera
light receiving
receiving module
matrix projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110401481.7A
Other languages
Chinese (zh)
Inventor
陈展耀
周宗华
罗德国
戴书麟
刘风雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongguan Ekos Technology Co Ltd
Original Assignee
Dongguan Ekos Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongguan Ekos Technology Co Ltd filed Critical Dongguan Ekos Technology Co Ltd
Priority to CN202110401481.7A priority Critical patent/CN112995486A/en
Publication of CN112995486A publication Critical patent/CN112995486A/en
Priority to PCT/CN2022/080691 priority patent/WO2022218081A1/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/296 Synchronisation thereof; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a binocular camera and a robot, and relates to the technical field of machine vision. The binocular camera comprises a substrate and a fixing frame arranged on the substrate; two dot matrix projection modules are arranged on the fixing frame at intervals, the exit light paths of the two dot matrix projection modules form a preset included angle, and the perpendicular distances from the two dot matrix projection modules to the substrate are equal. The binocular camera further comprises a first light receiving module and a second light receiving module arranged on the substrate, which are used for respectively collecting the reflected-light information of the two dot matrix projection modules. The field of view can thereby be enlarged, which in turn improves the spatial three-dimensional reconstruction capability.

Description

Binocular camera and robot
Technical Field
The application relates to the technical field of machine vision, in particular to a binocular camera and a robot.
Background
Binocular stereo vision is an important form of machine vision. Based on the parallax principle, an imaging device captures two images of the object to be measured from different positions, and the three-dimensional geometric information of the object is obtained by calculating the positional deviation between corresponding points of the two images.
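As a brief illustration outside the patent text, for a rectified binocular pair the parallax principle reduces to the similar-triangle relation Z = f·b/d (depth equals focal length times baseline divided by disparity). The following minimal sketch uses assumed example values for focal length, baseline and disparity:

```python
# Illustrative sketch only (not part of the patent): depth from disparity for a
# rectified binocular pair, Z = f * b / d.
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth in meters for a point observed with the given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px


if __name__ == "__main__":
    # Assumed example values: 600 px focal length, 50 mm baseline, 12 px disparity.
    print(depth_from_disparity(12.0, 600.0, 0.050))  # 2.5 m
```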
In the prior art, binocular stereo vision mostly adopts an active binocular structured-light scheme for spatial three-dimensional reconstruction. In existing active binocular structured-light cameras, the field of view is limited by the projection angle of the dot matrix projector, so the viewing angle is small. The operable space is therefore inevitably small, large-range detection of three-dimensional obstacles cannot be achieved, and functions of the robot such as obstacle avoidance, simultaneous localization and mapping (SLAM) and navigation are affected.
Disclosure of Invention
An object of the application is to provide a binocular camera and a robot that can increase the field of view and thereby improve the spatial three-dimensional reconstruction capability.
The embodiment of the application is realized as follows:
In one aspect, a binocular camera is provided, comprising a substrate and a fixing frame arranged on the substrate; two dot matrix projection modules are arranged on the fixing frame at intervals, the exit light paths of the two dot matrix projection modules form a preset included angle, and the perpendicular distances from the two dot matrix projection modules to the substrate are equal. The binocular camera further comprises a first light receiving module and a second light receiving module arranged on the substrate, which are used for respectively collecting the reflected-light information of the two dot matrix projection modules.
Optionally, the fixing frame is in an isosceles triangle structure, and the two dot matrix projection modules are respectively located on two opposite waists of the isosceles triangle.
Optionally, the binocular camera further comprises a closed housing and a transparent cover plate arranged on one side of the housing; the substrate, the fixing frame, the dot matrix projection modules, the first light receiving module and the second light receiving module are all located inside the closed housing, and the substrate is arranged parallel to the transparent cover plate.
Optionally, the distance between the two dot matrix projection modules is
d = 2h·sin(2β - θ)·cos(θ - β)/cos β,
where d is the distance between the two dot matrix projection modules, 2β is the field angle of the dot matrix projection modules, 2θ is the field angle of the binocular camera, and h is the expected minimum application distance of the binocular camera.
Optionally, the dot matrix projection module comprises at least one dot matrix projector; when the number of the dot matrix projectors is larger than or equal to two, the dot matrix projectors of each dot matrix projection module are on the same straight line, and the two straight lines are parallel to each other.
Optionally, the dot matrix projector comprises a light source, and a collimating lens and a diffractive optical element located on an exit light path of the light source.
Optionally, the fixing frame includes positioning seats arranged at intervals, the two dot matrix projection modules are respectively located on positioning surfaces of the positioning seats, and the two positioning surfaces respectively coincide with the two waists of the isosceles triangle.
Optionally, the first light receiving module and the second light receiving module are respectively located at two opposite sides of the two dot matrix projection modules.
Optionally, the fixing frame is made of a heat conducting material.
In another aspect of the embodiments of the present application, there is provided a robot including the binocular camera according to any one of the above.
The beneficial effects of the embodiment of the application include:
In the binocular camera provided by the embodiments of the application, the substrate and the fixing frame arranged on the substrate provide stable support for the dot matrix projection modules, the first light receiving module and the second light receiving module, so that the relative positions among the dot matrix projection modules, the first light receiving module and the second light receiving module remain stable. The exit light paths of the two dot matrix projection modules arranged on the fixing frame form a preset included angle, which, compared with a single projector, helps increase the field of view of the binocular camera. With the field of view of the binocular camera enlarged, the first light receiving module and the second light receiving module can receive speckle-pattern information over a wider range, the depth reconstruction range of the binocular camera is extended, and the spatial three-dimensional reconstruction capability is improved accordingly.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is one of schematic structural diagrams of a binocular camera provided in an embodiment of the present application;
fig. 2 is a second schematic structural diagram of a binocular camera provided in the embodiment of the present application;
fig. 3 is a diagram illustrating a positional relationship between the dot matrix projection module and the transparent cover plate according to an embodiment of the disclosure;
FIG. 4 is a schematic structural diagram of a connection between a fixing frame and a dot matrix projector according to an embodiment of the present disclosure;
fig. 5 is a schematic structural view illustrating a connection between the positioning seat and the dot matrix projection module according to an embodiment of the present application.
Reference numerals: 100-binocular camera; 120-fixing frame; 122-positioning seat; 1222-positioning surface; 130-dot matrix projection module; 132-dot matrix projector; 140-first light receiving module; 150-second light receiving module; 160-closed housing; 170-transparent cover plate.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Furthermore, the terms "first," "second," and the like are used merely to distinguish one description from another, and are not to be construed as indicating or implying relative importance.
In the description of the present application, it is also to be noted that, unless otherwise explicitly specified or limited, the terms "disposed" and "connected" are to be interpreted broadly, e.g., as being either fixedly connected, detachably connected, or integrally connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meaning of the above terms in the present application can be understood in a specific case by those of ordinary skill in the art.
With the improvement of people's living standards, indoor robots based on intelligent navigation schemes are gradually entering daily life, and the 3D sensing system is the core component with which an indoor robot realizes functions such as SLAM and obstacle avoidance. At present, binocular stereo vision mostly adopts an active binocular structured-light scheme for spatial three-dimensional reconstruction. In practical use, however, the results are often unsatisfactory. The main problem is poor obstacle-avoidance capability: the main sensors are located at the top of the robot, the viewing angle is small and the visual range is limited, so the operable space is small and large-range detection of three-dimensional obstacles cannot be achieved.
The depth field angle of active binocular structured light depends on the field angle of the camera. Limited by micro-nano optical technology, the existing field angle is at most about 60° × 80° on the premise of guaranteeing optical performance, which is sufficient for certain specific scenarios such as access control and door locks. In intelligent robot navigation, however, the required depth-reconstruction field angle can reach 120° × 80°, which existing cameras clearly cannot provide, so the three-dimensional reconstruction capability in practical applications is limited. Based on this, the embodiments of the present application propose the following scheme to increase the field of view and thereby improve the spatial three-dimensional reconstruction capability.
Referring to fig. 1, the present embodiment provides a binocular camera 100, including a substrate and a fixing frame 120 disposed on the substrate. Two dot matrix projection modules 130 are disposed on the fixing frame 120 at intervals, the exit light paths of the two dot matrix projection modules 130 form a preset included angle, and the perpendicular distances from the two dot matrix projection modules 130 to the substrate are equal. The binocular camera 100 further includes a first light receiving module 140 and a second light receiving module 150 disposed on the substrate, which are used for respectively collecting the reflected-light information of the two dot matrix projection modules 130.
Specifically, the two dot matrix projection modules 130 spaced apart on the fixing frame 120 have the same parameters, and their perpendicular distances to the substrate are equal. The two dot matrix projection modules 130 are therefore located at the same installation height, so that in use the speckle patterns they project have equal position and size at the same distance. This helps guarantee the consistency of the patterns projected by the two dot matrix projection modules 130 and thus reduces the computational difficulty. In practical applications, dot matrix projection modules 130 with different parameters can also be configured as required to meet diversified needs.
The binocular camera 100 of the present application is based on the optical triangulation principle of active binocular structured-light three-dimensional vision. In use, the two dot matrix projection modules 130 project structured light of a certain pattern onto the surface of an object, forming on that surface a three-dimensional image of light stripes modulated by the surface shape of the object to be measured. The three-dimensional image is captured by the first light receiving module 140 and the second light receiving module 150, yielding a two-dimensional distorted image of the light stripes. The degree of distortion of the stripes depends on the relative positions between the dot matrix projection modules 130 and the first and second light receiving modules 140, 150, and on the surface profile (height) of the object. When these relative positions are fixed, the three-dimensional profile of the object surface can be recovered from the coordinates of the distorted two-dimensional stripe image, thereby achieving spatial three-dimensional reconstruction.
It should be noted that, when active binocular structured light is used for spatial three-dimensional reconstruction, depth cannot be calculated in areas onto which no speckle pattern is projected, so the projection range of the speckle pattern determines the field angle of the binocular camera 100. In the embodiment of the present application, the exit light paths of the two dot matrix projection modules 130 form a preset included angle, so that the field angle of the binocular camera 100 is increased. The parallax offset of corresponding points is calculated from the reflected-light information (image information) collected by the first light receiving module 140 and the second light receiving module 150, and depth calculation and depth compensation are then performed to generate high-resolution, high-precision image depth information. The first light receiving module 140 and the second light receiving module 150 may be receiving cameras whose photosensitive chips are Complementary Metal-Oxide-Semiconductor (CMOS) or Charge Coupled Device (CCD) sensors, used to collect the speckle pattern of the space to be measured.
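By way of illustration only, the sketch below shows one common way of turning the two collected speckle images into a depth map; the patent does not prescribe a particular matching algorithm. It assumes the two receiving cameras have been calibrated and the images rectified, uses OpenCV semi-global block matching, and the parameter values are assumed examples:

```python
# Illustrative sketch only: computing depth from two rectified speckle images
# with OpenCV semi-global block matching. The patent does not mandate this
# algorithm; the parameters below are assumed example values.
import cv2
import numpy as np


def depth_map(left_ir: np.ndarray, right_ir: np.ndarray,
              focal_px: float, baseline_m: float) -> np.ndarray:
    """Return a depth map in meters from two rectified single-channel images."""
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,   # disparity search range, must be a multiple of 16
        blockSize=7,          # matching window; the speckle texture aids matching
        P1=8 * 7 * 7,
        P2=32 * 7 * 7,
        uniquenessRatio=10,
    )
    # StereoSGBM returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left_ir, right_ir).astype(np.float32) / 16.0
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]  # Z = f * b / d
    return depth
```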
In the binocular camera 100 provided by the embodiments of the present application, the substrate and the fixing frame 120 disposed on the substrate provide stable support for the dot matrix projection modules 130, the first light receiving module 140 and the second light receiving module 150, ensuring the stability of the relative positions among them. The exit light paths of the two dot matrix projection modules 130 disposed on the fixing frame 120 form a preset included angle, which, compared with using a single projector, helps increase the field angle of the binocular camera 100. With the field angle of the binocular camera 100 enlarged, the first light receiving module 140 and the second light receiving module 150 can receive speckle-pattern information over a wider range, the depth reconstruction range of the binocular camera 100 is extended, and the spatial three-dimensional reconstruction capability is improved.
As shown in fig. 1, the fixing frame 120 has an isosceles-triangle structure, and the two dot matrix projection modules 130 are respectively located on the two opposite waists of the isosceles triangle.
Specifically, the fixing frame 120 has an isosceles-triangle structure, that is, the fixing frame 120 adopts an isosceles-triangle bracket, which makes the structure of the fixing frame 120 more stable and reliable. In addition, the two dot matrix projection modules 130 are respectively located on the two opposite waists of the isosceles triangle and are thus stably connected with the fixing frame 120. When fixedly installed, the exit light path of each dot matrix projection module 130 is perpendicular to the corresponding waist of the isosceles triangle, so the base angle of the isosceles triangle determines the size of the preset included angle between the exit light paths of the two dot matrix projection modules 130. When binocular cameras 100 of different models are assembled, the required exit-light-path angle can be adjusted simply by replacing the fixing frame 120, which helps simplify the assembly structure, reduce the difficulty of operation and improve assembly efficiency.
As shown in fig. 2, the binocular camera 100 further includes a closed housing 160 and a transparent cover plate 170 disposed on one side of the housing, the substrate, the fixing frame 120, the dot matrix projection module 130, the first light receiving module 140 and the second light receiving module 150 are all disposed in the closed housing 160, and the substrate and the transparent cover plate 170 are disposed in parallel.
Specifically, this arrangement uses the transparent cover plate 170 and the closed housing 160 to protect components such as the dot matrix projection modules 130, the first light receiving module 140 and the second light receiving module 150, ensuring stability when the binocular camera 100 is in use, for example by sealing against dust or water vapor and thereby avoiding interference from the external environment. It should be noted that the embodiment of the present application does not specifically limit the shape of the closed housing 160; for example, the closed housing 160 may be a cylinder, a truncated cone or another shape, as long as the field angle of the binocular camera 100 is not affected and the lines of sight of the first light receiving module 140 and the second light receiving module 150 are not blocked.
As shown in fig. 2 and 3, the distance between the two dot matrix projection modules 130 is
d = 2h·sin(2β - θ)·cos(θ - β)/cos β,
where d is the distance between the two dot matrix projection modules 130, 2β is the field angle of a single dot matrix projection module 130, 2θ is the field angle of the binocular camera 100, and h is the expected minimum application distance of the binocular camera 100.
Specifically, fig. 3 is a simplified geometric model of the dot matrix projection modules 130 and the fixing frame 120 in fig. 2, in which the two dot matrix projection modules 130 are located at points B and E. The plane of the straight line GN is the plane of the transparent cover plate 170, and the fixing frame 120 is the isosceles triangle ΔJCI. ∠GBF and ∠NEF are the field angles of the two dot matrix projectors 132 and are both set to 2β; the straight lines BP and EM are the angle bisectors of ∠GBF and ∠NEF respectively, and since the exit light path of each dot matrix projection module 130 is perpendicular to the corresponding waist of the isosceles triangle, BP and EM are perpendicular to CJ and IJ. Extending the straight lines GB and NE, they intersect at point A, and from the geometric relationship the points F, J and A lie on the same straight line; ∠GAN is the field angle to be obtained by stitching and is set to 2θ. The inner edge rays BF and EF of the two dot matrix projectors 132 intersect at point F, which means that the minimum application distance of the product is FJ, set to h; at shorter distances there is a region without speckles (e.g. the region enclosed by FBJE shown in the figure), in which depth reconstruction fails.
On the premise that the field angle 2β of a single dot matrix projector 132, the finally stitched field angle 2θ and the minimum application distance h are known, the base angle α of the isosceles triangle (∠JCI in fig. 3) and the distance BE between the two dot matrix projection modules 130 can be obtained.
From the geometric relationships of the triangle:
∠PBJ = ∠PBF + ∠FBJ (1)
∠BJF = ∠BDJ + ∠DBJ (2)
∠ABE + ∠EBJ + ∠PBJ + ∠GBP = 180° (3)
∠JCI = ∠JBE (4)
∠FBJ + ∠BJF + ∠BFJ = 180° (5)
Substituting ∠PBJ = 90° into the above equations gives:
∠BFJ = 2β - θ, ∠FBJ = 90° - β, ∠JCI = α = θ - β
in Δ BFJ, there are, by the triangular sine theorem:
Figure BDA0003020495700000091
as can be seen from equation (6):
Figure BDA0003020495700000101
in Δ BDJ, there is a constant equation:
Figure BDA0003020495700000102
as can be seen from the equations (7) and (8), the distance d between the two-dot matrix projection modules 130 is:
Figure BDA0003020495700000103
as can be seen from the above formula, as long as the viewing angle 2 β of a single dot matrix projection module 130, the desired splicing viewing angle 2 θ and the desired minimum application distance h are determined, the relative distance between the isosceles triangular fixing frame 120 and the left and right dot matrix projection modules 130 can be designed according to the above formula. Wherein the desired minimum application distance h may be determined according to the installation application environment of the product.
As shown in fig. 1 and 4, the dot matrix projection module 130 includes at least one dot matrix projector 132; when the number of the dot matrix projectors 132 is greater than or equal to two, the dot matrix projectors 132 of each dot matrix projection module 130 are on the same straight line, and the two straight lines are parallel to each other.
Specifically, each dot matrix projection module 130 may include only one dot matrix projector 132, or may be configured with two or more dot matrix projectors 132 according to the complexity of the object, which helps increase the density of the speckle pattern per unit area and further improves the three-dimensional reconstruction capability. It can be understood that the area projected by a dot matrix projector 132 is generally rectangular; when the dot matrix projectors 132 of each dot matrix projection module 130 lie on the same straight line and the two straight lines are parallel to each other, the projection areas of the two dot matrix projection modules 130 can be better distributed, improving the utilization of the projected beams and avoiding regions within the field of view of the binocular camera 100 that receive no projected beam.
In an alternative embodiment of the present application, the dot matrix projector 132 includes a light source, and a collimating lens and diffractive optical element positioned in the path of the light source's exit light.
Specifically, the light source may be any one of a Light Emitting Diode (LED), a semiconductor Laser (LD), and a Vertical Cavity Surface Emitting Laser (VCSEL). The light beam emitted from the light source is collimated by the collimating lens to be emitted in parallel, and then is shaped and diffracted by the diffractive optical element to form a specific speckle pattern.
As shown in fig. 5, the fixing frame 120 includes positioning seats 122 arranged at intervals, the two dot matrix projection modules 130 are respectively located on positioning surfaces 1222 of the positioning seats 122, and the two positioning surfaces 1222 respectively coincide with the two waists of the isosceles triangle.
Specifically, in this form the unnecessary portion of the isosceles triangle is cut away and only the two corner portions carrying the positioning surfaces are retained, which reduces cost. It can be understood that the positioning seats 122 may also take the form of a right trapezoid, so that the mounting height of the dot matrix projection modules 130 can be raised or otherwise adjusted according to actual needs.
As shown in fig. 1, the first light receiving module 140 and the second light receiving module 150 are respectively located on two opposite sides of the two dot matrix projection modules 130. In this way, the first light receiving module 140 and the second light receiving module 150 collect information separately, and the depth information can be obtained by integrating information of different dimensions.
Optionally, the fixing frame 120 is made of a thermally conductive material. For example, the fixing frame 120 may be made of copper or aluminum, or of thermally conductive silicone or ceramic, and can be selected flexibly according to the actual use environment.
The embodiment of the application also discloses a robot, which comprises the binocular camera 100 in the embodiment. The robot includes the same structure and advantageous effects as the binocular camera 100 in the foregoing embodiment. The structure and the advantageous effects of the binocular camera 100 have been described in detail in the foregoing embodiments, and are not described in detail herein.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A binocular camera is characterized by comprising a substrate and a fixing frame arranged on the substrate, wherein dot matrix projection modules are arranged on the fixing frame at intervals, emergent light paths of the two dot matrix projection modules form a preset included angle, and the vertical distances between the two dot matrix projection modules and the substrate are equal; the binocular camera further comprises a first light receiving module and a second light receiving module which are arranged on the substrate, and the first light receiving module and the second light receiving module are used for respectively collecting light reflection information of the two dot matrix projection modules.
2. The binocular camera according to claim 1, wherein the fixing frame is of an isosceles-triangle structure, and the two dot matrix projection modules are respectively located on two opposite waists of the isosceles triangle.
3. The binocular camera according to claim 2, further comprising a closed housing and a transparent cover plate disposed on one side of the housing, wherein the base plate, the fixing frame, the dot matrix projection module, the first light receiving module and the second light receiving module are all located in the closed housing, and the base plate and the transparent cover plate are disposed in parallel.
4. The binocular camera of claim 3, wherein the distance between the two dot matrix projection modules is
d = 2h·sin(2β - θ)·cos(θ - β)/cos β,
wherein d is the distance between the two dot matrix projection modules, 2β is the field angle of the dot matrix projection modules, 2θ is the field angle of the binocular camera, and h is the expected minimum application distance of the binocular camera.
5. The binocular camera of any one of claims 1-4, wherein the dot matrix projection module comprises at least one dot matrix projector; when the number of the dot matrix projectors is larger than or equal to two, the dot matrix projectors of each dot matrix projection module are on the same straight line, and the two straight lines are parallel to each other.
6. The binocular camera of claim 5, wherein the dot matrix projector includes a light source, and a collimating lens and diffractive optical element positioned in an exit light path of the light source.
7. The binocular camera according to claim 1, wherein the fixing frame includes positioning seats arranged at intervals, the two dot matrix projection modules are respectively located on positioning surfaces of the positioning seats, and the two positioning surfaces are respectively overlapped with two waists of the isosceles triangle.
8. The binocular camera according to any one of claims 1 to 4, wherein the first light receiving module and the second light receiving module are respectively located at opposite sides of the two dot matrix projection modules.
9. The binocular camera of any one of claims 1-4, wherein the mount is made of a thermally conductive material.
10. A robot comprising the binocular camera of any one of claims 1 to 9.
CN202110401481.7A 2021-04-14 2021-04-14 Binocular camera and robot Pending CN112995486A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110401481.7A CN112995486A (en) 2021-04-14 2021-04-14 Binocular camera and robot
PCT/CN2022/080691 WO2022218081A1 (en) 2021-04-14 2022-03-14 Binocular camera and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110401481.7A CN112995486A (en) 2021-04-14 2021-04-14 Binocular camera and robot

Publications (1)

Publication Number Publication Date
CN112995486A true CN112995486A (en) 2021-06-18

Family

ID=76339731

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110401481.7A Pending CN112995486A (en) 2021-04-14 2021-04-14 Binocular camera and robot

Country Status (2)

Country Link
CN (1) CN112995486A (en)
WO (1) WO2022218081A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115102036A (en) * 2022-08-24 2022-09-23 立臻精密智造(昆山)有限公司 Lattice laser emission structure, lattice laser system and depth calculation method
WO2022218081A1 (en) * 2021-04-14 2022-10-20 东莞埃科思科技有限公司 Binocular camera and robot

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004028874A (en) * 2002-06-27 2004-01-29 Matsushita Electric Ind Co Ltd Range finder device, and device and method for detecting object
CN103054522B (en) * 2012-12-31 2015-07-29 河海大学 A kind of cleaning robot system and investigating method thereof
CN112445004A (en) * 2019-08-14 2021-03-05 南昌欧菲生物识别技术有限公司 Light emitting module and electronic device
CN111121722A (en) * 2019-12-13 2020-05-08 南京理工大学 Binocular three-dimensional imaging method combining laser dot matrix and polarization vision
CN112489193A (en) * 2020-11-24 2021-03-12 江苏科技大学 Three-dimensional reconstruction method based on structured light
CN112995486A (en) * 2021-04-14 2021-06-18 东莞埃科思科技有限公司 Binocular camera and robot

Also Published As

Publication number Publication date
WO2022218081A1 (en) 2022-10-20

Similar Documents

Publication Publication Date Title
US10088296B2 (en) Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
WO2022218081A1 (en) Binocular camera and robot
CN106127745B (en) The combined calibrating method and device of structure light 3 D vision system and line-scan digital camera
US20160073104A1 (en) Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
US8525983B2 (en) Device and method for measuring six degrees of freedom
TWI615299B (en) Vehicle monitoring system and method of vehicle monitoring
US20200192206A1 (en) Structured light projector, three-dimensional camera module and terminal device
US20140307100A1 (en) Orthographic image capture system
WO2021238214A1 (en) Three-dimensional measurement system and method, and computer device
US20190246091A1 (en) Systems and methods for enhanced depth sensor devices
JP2020515120A (en) Image correction method and apparatus, storage medium, and projection apparatus
US20100245824A1 (en) Method for orienting a parallax barrier screen on a display screen
JP6308637B1 (en) 3D measurement method and apparatus using feature quantity
US11435448B2 (en) Systems and methods for optical demodulation in a depth-sensing device
JP2017527812A (en) Method for optical measurement of three-dimensional coordinates and calibration of a three-dimensional measuring device
CN106254738A (en) Dual image acquisition system and image-pickup method
JP2017528714A (en) Method for optical measurement of three-dimensional coordinates and control of a three-dimensional measuring device
JP2021517635A (en) Road surface monitoring systems and methods using infrared rays, automobiles
EP2793042B1 (en) Positioning device comprising a light beam
CN215120941U (en) Binocular camera and robot
CN109883393B (en) Method for predicting front gradient of mobile robot based on binocular stereo vision
CN207301612U (en) Integrated big visual angle 3D vision systems
CN216646799U (en) Projection module, imaging device and optical equipment
WO2016039955A1 (en) A portable device for optically measuring three- dimensional coordinates
CN213091888U (en) Depth measurement system and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination