CN216535127U - Depth camera and sweeping robot - Google Patents

Depth camera and sweeping robot

Info

Publication number
CN216535127U
CN216535127U CN202122498496.8U
Authority
CN
China
Prior art keywords
light
area
structured light
lattice
laser
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202122498496.8U
Other languages
Chinese (zh)
Inventor
黄龙祥
杨煦
沈燕
陈松坤
汪博
朱力
吕方璐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Guangjian Technology Co Ltd
Original Assignee
Shenzhen Guangjian Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Guangjian Technology Co Ltd filed Critical Shenzhen Guangjian Technology Co Ltd
Priority to CN202122498496.8U
Application granted
Publication of CN216535127U
Active legal-status Current
Anticipated expiration legal-status

Abstract

The utility model provides a depth camera and a sweeping robot. The depth camera comprises a light projector, a light receiver and a driving circuit, with the light projector arranged at the front end of the light receiver. The light projector is used for projecting lattice structured light and linear array structured light to a target scene, the power density of each beam in the lattice structured light being greater than that of each beam in the linear array structured light. The light receiver is used for receiving the lattice structured light and the linear array structured light reflected by any object in the target scene, generating first depth information from the lattice structured light and second depth information from the linear array structured light. The driving circuit is used for controlling the light projector and the light receiver to be switched on or off simultaneously. The utility model reduces the complexity of the product, lowers its manufacturing cost and facilitates its popularization and application.

Description

Depth camera and sweeping robot
Technical Field
The utility model relates to intelligent equipment, in particular to a depth camera and a sweeping robot.
Background
A sweeping robot is a type of smart household appliance that can automatically finish cleaning the floor of a room with a certain degree of artificial intelligence. It generally works by brushing and vacuuming, first drawing debris on the floor into its dust box, thereby completing the floor-cleaning task.
In the prior art, a sweeping robot generally performs path planning and mapping with an LDS (laser distance sensor) lidar arranged on its top, and performs obstacle avoidance with a camera arranged at its front end. Path planning and mapping by LDS, however, has at least two disadvantages: first, the lidar must rotate constantly and is therefore prone to damage; second, highly reflective objects such as floor-to-ceiling windows, full-length mirrors and vases cannot be detected. In addition, two separate sets of devices are needed for the path-planning and obstacle-avoidance functions, which increases the complexity of the product, raises its manufacturing cost, and is not conducive to its popularization and application.
SUMMARY OF THE UTILITY MODEL
In view of the defects in the prior art, the object of the utility model is to provide a depth camera and a sweeping robot.
The depth camera provided by the utility model comprises a light projector, a light receiver and a driving circuit;
the light projector is used for projecting lattice structured light and linear array structured light to a target scene, and the power density of each light beam in the lattice structured light is greater than that of each light beam in the linear array structured light;
the optical receiver is used for receiving the lattice structure light and the linear array structure light reflected by any object in the target scene, generating first depth information according to the lattice structure light, and generating second depth information according to the linear array structure light;
and the driving circuit is used for controlling the light projector and the light receiver to be simultaneously switched on or switched off.
Preferably, the lattice-structured light forms a lattice pattern, and the line-structured light forms a line pattern;
the lattice pattern is located in an upper side region of the linear array pattern.
Preferably, the lattice-structured light forms a lattice pattern, and the line-structured light forms a line pattern;
the dot matrix pattern is located in a middle area in a height direction of the linear array pattern.
Preferably, the light projector comprises a first laser module and a first projection lens;
the first laser module comprises a point laser array group and a line laser array group, wherein the point laser array group is used for projecting dot matrix structured light, and the line laser array group is used for projecting linear array structured light;
the first projection lens is arranged on the light emitting side of the laser module and comprises a first area and a second area, the lattice structured light is received and projected through the first area, and the linear array structured light is received and projected through the second area.
Preferably, the light projector comprises a second laser module, a beam splitting device and a second projection lens;
the second laser module is used for projecting laser beams;
the beam splitting device comprises a first beam splitting area and a second beam splitting area, the first beam splitting area is used for splitting the laser beam into a plurality of laser beams forming the lattice structured light, and the second beam splitting area is used for splitting the laser beam into a plurality of laser beams forming the linear array structured light;
the second projection lens is arranged on the light emitting side of the beam splitting device and comprises a first area and a second area, the lattice structured light is received and projected through the first area, and the linear array structured light is received and projected through the second area.
Preferably, the optical receiver is configured to generate first depth information according to the transmission time or the phase difference of the lattice structured light, and generate second depth information according to a spot image formed by the linear array structured light.
Preferably, the light projector comprises a first laser module and a first projection lens;
the first laser module comprises a point laser array group and a line laser array group, wherein the point laser array group is used for projecting dot matrix structured light, and the line laser array group is used for projecting linear array structured light;
the first projection lens is arranged on the light emitting side of the laser module and comprises a first area, a second area and a third area, wherein the first area is arranged between the second area and the third area, lattice structured light is received and projected through the first area, and linear array structured light is received and projected through the second area and the third area.
Preferably, the light projector comprises a second laser module, a beam splitting device and a second projection lens;
the second laser module is used for projecting laser beams;
the beam splitting device comprises a first beam splitting area and a second beam splitting area, the first beam splitting area is used for splitting the laser beam into a plurality of laser beams forming the lattice structured light, and the second beam splitting area is used for splitting the laser beam into a plurality of laser beams forming the linear array structured light;
the second projection lens is arranged on the light emitting side of the beam splitting device and comprises a first area, a second area and a third area, wherein the first area is arranged between the second area and the third area, lattice structured light is received and projected through the first area, and linear array structured light is received and projected through the second area and the third area.
Preferably, the line laser array group includes a first line laser array group and a second line laser array group;
the first line laser array group is used for projecting first line array structured light; the second line laser array group is used for projecting second line array structure light;
the first projection lens is arranged on the light emitting side of the first laser module and comprises a first area, a second area and a third area, the first area is arranged between the second area and the third area, the lattice structured light is received and projected through the first area, the first linear array structured light is received and projected through the second area, and the second linear array structured light is received and projected through the third area.
The sweeping robot provided by the utility model comprises a robot body, a depth camera and a controller module; the depth camera is arranged on the side surface of the robot body;
the depth camera comprises a light projector, a driving circuit and a light receiver;
the light projector is used for projecting lattice structured light and linear array structured light to a target scene, and the power density of each light beam in the lattice structured light is greater than that of each light beam in the linear array structured light;
the optical receiver is used for receiving the lattice structure light and the linear array structure light reflected by any object in the target scene, generating first depth information according to the lattice structure light, and generating second depth information according to the linear array structure light;
the driving circuit is used for controlling the light projector and the light receiver to be simultaneously switched on or switched off;
and the controller module is used for carrying out instant positioning and map construction according to the first depth information and generating obstacle avoidance information according to the second depth information.
Compared with the prior art, the utility model has the following beneficial effects:
the depth camera can be applied to a sweeping robot. The light projector of the depth camera projects dot-matrix (lattice) structured light and linear array structured light to a target scene; the light receiver receives the lattice structured light and the linear array structured light reflected by any object in the target scene, generates first depth information from the higher-power-density lattice structured light, and generates second depth information from the lower-power-density linear array structured light. The controller module can therefore use the first depth information, produced by the lattice structured light that reaches farther, for instant positioning and map construction, and generate obstacle avoidance information from the second depth information produced by the short-range linear array structured light. The sweeping robot thus realizes instant positioning, map construction and obstacle avoidance with a single depth camera module, which reduces the complexity of the product, lowers its manufacturing cost and facilitates its popularization and application;
the utility model performs obstacle avoidance with the second depth information generated by the linear array structured light; because the linear array structured light extends over a long lateral range, it readily detects elongated objects such as table legs and electric wires, achieving a better obstacle avoidance effect.
Drawings
In order to more clearly illustrate the embodiments of the utility model or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only embodiments of the utility model, and that those skilled in the art can derive other drawings from them without creative effort. Other features, objects and advantages of the utility model will become more apparent upon reading the detailed description of the non-limiting embodiments with reference to the following drawings:
fig. 1 is a schematic view of the working principle of the sweeping robot in the embodiment of the utility model;
FIG. 2 is a schematic view of a light field view of a depth camera in an embodiment of the utility model;
FIG. 3 is a schematic view of another light field view of a depth camera in an embodiment of the utility model;
FIG. 4 is a schematic diagram of a depth camera according to an embodiment of the present invention; and
FIG. 5 is a schematic diagram of another structure of a depth camera according to an embodiment of the utility model.
In the figure: 100 is a robot body; 200 is an object; 1 is a light projector; 2 is an optical receiver; 201 is a first region; 202 is a second region; 203 is a third region; 3 is a driving circuit; 101 is an edge-emitting laser; 102 is a collimating lens; 103 is a beam splitting device; 104 is a projection lens; 105 is a diffractive device; 106 is a laser array.
Detailed Description
The utility model will be described in detail below with reference to specific examples. The following examples will assist those skilled in the art in further understanding the utility model, but they do not limit the utility model in any way. It should be noted that persons skilled in the art can make variations and modifications without departing from the concept of the utility model, all of which fall within the scope of protection of the utility model.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the utility model described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solution of the utility model, and how it solves the technical problems above, will be described in detail below with specific embodiments. The following specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some of them. Embodiments of the utility model are described below with reference to the accompanying drawings.
Fig. 1 is a schematic view of a working principle of a sweeping robot in an embodiment of the present invention, and as shown in fig. 1, the sweeping robot provided by the present invention includes a robot body 100, a depth camera, and a controller module; the depth camera is disposed on a side surface of the robot body 100;
the depth camera comprises a light projector 1, a light receiver 2 and a driving circuit;
the light projector 1 is configured to project lattice structured light and linear array structured light to a target scene, where a power density of each light beam in the lattice structured light is greater than a power density of each light beam in the linear array structured light;
the optical receiver 2 is configured to receive the lattice structured light and the linear array structured light reflected by any object 200 in the target scene, generate first depth information according to the lattice structured light, and generate second depth information according to the linear array structured light;
the driving circuit is used for controlling the light projector and the light receiver to be simultaneously switched on or switched off;
the controller module is used for performing instant positioning and map building (SLAM) according to the first depth information and generating obstacle avoidance information according to the second depth information.
In the embodiment of the utility model, each beam in the lattice structured light has a higher power density and a longer projection distance, so the distribution of objects far from the sweeping robot can be obtained, which is convenient for instant positioning and map construction. Each beam in the linear array structured light has a lower power density, a higher beam density and a shorter projection distance, so the distribution of objects 200 near the sweeping robot can be obtained; because of the higher beam density, the surface profile of an object 200 can also be obtained, which facilitates the obstacle avoidance operation of the sweeping robot.
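As an illustration of how the two depth streams described above might be consumed, the sketch below routes the lattice-derived depth to a SLAM callback and the line-array-derived depth to an obstacle-avoidance callback. The class and function names (ControllerModule, slam_update, plan_avoidance, DepthFrame) are hypothetical and not taken from the patent; this is a minimal sketch, not the patented implementation.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class DepthFrame:
    """One synchronized capture from the depth camera."""
    first_depth: List[Point3D]   # sparse, long-range points from the lattice (dot-matrix) light
    second_depth: List[Point3D]  # dense, short-range points from the linear array light

class ControllerModule:
    """Routes the two depth streams: lattice-derived depth feeds SLAM,
    line-array-derived depth feeds obstacle avoidance."""

    def __init__(self,
                 slam_update: Callable[[List[Point3D]], None],
                 plan_avoidance: Callable[[List[Point3D]], None]):
        self._slam_update = slam_update        # instant positioning and map construction
        self._plan_avoidance = plan_avoidance  # near-field obstacle avoidance

    def on_frame(self, frame: DepthFrame) -> None:
        self._slam_update(frame.first_depth)
        self._plan_avoidance(frame.second_depth)

# Usage with stand-in callbacks:
ctrl = ControllerModule(
    slam_update=lambda pts: print(f"SLAM received {len(pts)} far-field points"),
    plan_avoidance=lambda pts: print(f"avoidance received {len(pts)} near-field points"),
)
ctrl.on_frame(DepthFrame(first_depth=[(2.0, 0.1, 0.3)],
                         second_depth=[(0.4, 0.0, 0.05)] * 8))
```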
The line type of the line structured light includes, but is not limited to, any one or any plurality of straight lines, curved lines, line segments, and dotted lines, and the number of lines may be 1 or more.
The linear array structured light comprises a plurality of linear light beams;
and the plurality of linear light beams are distributed at an incline, and the projections of two adjacent linear light beams along the width direction of the field angle have an overlapping area.
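The overlap condition described above can be pictured as an interval test: project each tilted line beam onto the width axis of the field of view and check that adjacent intervals intersect. A minimal sketch, assuming each beam is described by two angular endpoints (horizontal, vertical) in degrees; all names and numbers are illustrative:

```python
from typing import Tuple

AngularEndpoint = Tuple[float, float]  # (horizontal_deg, vertical_deg)
LineBeam = Tuple[AngularEndpoint, AngularEndpoint]

def width_interval(beam: LineBeam) -> Tuple[float, float]:
    """Horizontal-angle interval covered by one tilted line beam."""
    (h0, _), (h1, _) = beam
    return min(h0, h1), max(h0, h1)

def projections_overlap(beam_a: LineBeam, beam_b: LineBeam) -> bool:
    """True if the two beams' projections onto the width direction share an interval."""
    a_lo, a_hi = width_interval(beam_a)
    b_lo, b_hi = width_interval(beam_b)
    return a_lo <= b_hi and b_lo <= a_hi

# Two tilted beams whose horizontal coverage overlaps between -5 and 0 degrees:
left_beam: LineBeam = ((-30.0, -10.0), (0.0, 10.0))
right_beam: LineBeam = ((-5.0, -10.0), (25.0, 10.0))
print(projections_overlap(left_beam, right_beam))  # True
```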
Fig. 4 is a schematic diagram of a depth camera according to an embodiment of the present invention, and as shown in fig. 4, the light projector 1 includes a first laser module and a first projection lens 104;
the first laser module comprises a point laser array group and a line laser array group, wherein the point laser array group is used for projecting dot matrix structured light, and the line laser array group is used for projecting linear array structured light;
the first projection lens 104 is disposed on the light emitting side of the first laser module and comprises a first region 201 and a second region 202, both of which are transparent regions; the lattice structured light is received and projected through the first region 201, and the linear array structured light is received and projected through the second region 202.
In one embodiment of the utility model, the lattice structured light forms a lattice pattern, and the line structured light forms a line pattern;
the dot pattern is located at an upper region of the line pattern, as shown in fig. 2.
In an embodiment of the utility model, the light projector 1 comprises a first laser module and a first projection lens 104;
the first laser module comprises a point laser array group and a line laser array group, wherein the point laser array group is used for projecting dot matrix structured light, and the line laser array group is used for projecting linear array structured light;
the first projection lens 104 is disposed on the light emitting side of the first laser module and comprises a first region 201, a second region 202 and a third region 203, the first region 201 being disposed between the second region 202 and the third region 203; the first region 201, the second region 202 and the third region 203 are all transparent regions; the lattice structured light is received and projected through the first region 201, and the linear array structured light is received and projected through the second region 202 and the third region 203.
The lattice structured light forms a lattice pattern, and the linear array structured light forms a linear array pattern; the dot pattern is located in the middle area of two adjacent line patterns, as shown in fig. 3.
In an embodiment of the utility model, the lattice structured light contains from two to several thousand beams, for example 2 to 1,000 beams; the linear array structured light contains from one to several beams, for example 1 to 10 beams.
The first laser module may adopt a laser array 106 formed by a plurality of vertical-cavity surface-emitting lasers (VCSELs) or a plurality of edge-emitting lasers (EELs). After passing through the collimating lens 102, the multiple laser beams become highly parallel collimated beams, realizing the projection of the lattice structured light.
Fig. 5 is another schematic structural diagram of a depth camera in an embodiment of the present invention, and as shown in fig. 5, the light projector 1 includes a second laser module, a beam splitter, and a second projection lens 104;
the second laser module is used for projecting laser beams;
the beam splitting device 103 comprises a first beam splitting area and a second beam splitting area, wherein the first beam splitting area is used for splitting the laser beam into one group of multi-beam laser forming lattice structured light, and the second beam splitting area is used for splitting the laser beam into another group of multi-beam laser forming linear array structured light;
the second projection lens 104 is disposed on the light exit side of the beam splitter 103, and includes a first region 201 and a second region 202, where the first region 201 and the second region 202 are transparent regions, and receive and project lattice structured light through the first region 201, and receive and project linear structured light through the second region 202.
In one embodiment of the utility model, the lattice structured light forms a lattice pattern, and the line structured light forms a line pattern;
the dot pattern is located at an upper region of the line pattern, as shown in fig. 2.
In an embodiment of the utility model, the light projector 1 comprises a second laser module, a beam splitting device and a second projection lens 104;
the second laser module is used for projecting laser beams;
the beam splitting device 103 comprises a first beam splitting area and a second beam splitting area, wherein the first beam splitting area is used for splitting the laser beam into one group of multi-beam laser forming lattice structured light, and the second beam splitting area is used for splitting the laser beam into another group of multi-beam laser forming linear array structured light;
the second projection lens 104 is disposed on the light-emitting side of the beam splitter 103, and includes a first region 201, a second region 202, and a third region 203, where the first region 201 is disposed between the second region 202 and the third region 203, the first region 201, the second region 202, and the third region 203 are transparent regions, and receive and project lattice structured light through the first region 201, and receive and project linear structured light through the second region 202 and the third region 203.
The lattice structured light forms a lattice pattern, and the linear array structured light forms a linear array pattern; the dot pattern is located in the middle area in the height direction of the line pattern, as shown in fig. 3.
The beam splitting device 103 multiplies the collimated laser beam into a larger number of beams. The beam splitting device 103 may employ a diffractive optical element (DOE) such as a diffraction grating, a waveguide device, a coded-structure photomask, a spatial light modulator (SLM), or the like.
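For the grating-style option, the fan-out directions follow the standard grating equation sin(theta_m) = m * lambda / Lambda, where Lambda is the grating period. The patent does not specify the DOE parameters, so the wavelength and period below are assumed purely for illustration:

```python
import math

def diffraction_orders(wavelength_nm: float, period_um: float, max_order: int) -> dict:
    """Propagating diffraction orders of a periodic splitter: sin(theta_m) = m * lambda / period."""
    wavelength_um = wavelength_nm / 1000.0
    angles = {}
    for m in range(-max_order, max_order + 1):
        s = m * wavelength_um / period_um
        if abs(s) <= 1.0:  # only real (propagating) orders exist
            angles[m] = math.degrees(math.asin(s))
    return angles

# Assumed example: 940 nm laser, 4 um grating period, orders up to +/-3
for order, angle in sorted(diffraction_orders(940.0, 4.0, 3).items()):
    print(f"order {order:+d}: {angle:6.2f} deg")
```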
In an embodiment of the present invention, the optical receiver 2 is configured to generate first depth information according to the transmission time or the phase difference of the lattice-structured light, and generate second depth information according to the transmission time or the phase difference of the linear-array-structured light.
The driving circuit 3 is used for controlling the light projector 1 and the light receiver 2 to be turned on or off simultaneously. The driving circuit 3 may be a separate dedicated circuit, such as a dedicated SoC chip, an FPGA chip or an ASIC chip, or it may include a general-purpose processor; for example, when the depth camera is integrated into an intelligent terminal such as a sweeping robot, the processor in the terminal may serve as at least part of the processing circuit.
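A software analogue of this synchronization is sketched below. The real driving circuit is hardware (SoC/FPGA/ASIC); the set_projector_enable/set_receiver_enable callables stand in for whatever enable lines such hardware would drive and are assumptions, not part of the patent:

```python
import time
from typing import Callable

class DrivingCircuitModel:
    """Software analogue of the driving circuit: it switches the light projector
    and the light receiver on or off in the same call so that projection and
    exposure stay aligned."""

    def __init__(self,
                 set_projector_enable: Callable[[bool], None],
                 set_receiver_enable: Callable[[bool], None]):
        self._set_projector = set_projector_enable  # would drive a real enable line
        self._set_receiver = set_receiver_enable    # would drive a real enable line

    def set_enabled(self, on: bool) -> None:
        # Both devices are switched in the same call, so they are effectively simultaneous.
        self._set_projector(on)
        self._set_receiver(on)

    def pulse(self, exposure_s: float) -> None:
        """One synchronized projection/exposure window."""
        self.set_enabled(True)
        time.sleep(exposure_s)
        self.set_enabled(False)

# Usage with stand-in enable functions:
drv = DrivingCircuitModel(lambda on: print(f"projector {'on' if on else 'off'}"),
                          lambda on: print(f"receiver  {'on' if on else 'off'}"))
drv.pulse(0.001)
```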
The field angle of the depth camera is preferably between 100 ° and 110 °.
The optical receiver 2 comprises an optical imaging lens, a light detector array and a driving circuit 3; the light detector array comprises a plurality of light detectors distributed in an array;
the optical imaging lens is used for receiving the lattice structure light and the linear array structure light reflected by any object in a target scene and projecting the lattice structure light and the linear array structure light to the optical detector;
the optical detector is used for receiving the lattice structure light and the linear array structure light;
the driving circuit 3 is configured to measure a propagation time or a phase difference of the lattice structured light to generate first depth data of the surface of the target object, and generate second depth data of the surface of the target object according to a light spot image formed by the linear array structured light.
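The depth relations behind these two measurements are the standard direct/indirect time-of-flight and triangulation formulas rather than anything the patent spells out; the sketch below applies them with assumed example numbers:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def depth_from_time_of_flight(round_trip_s: float) -> float:
    """Direct ToF: the pulse travels out and back, so depth = c * t / 2."""
    return C * round_trip_s / 2.0

def depth_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF: depth = c * phase / (4 * pi * f_mod), within one ambiguity range."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def depth_from_spot_shift(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Structured-light triangulation for the line-array spots: Z = f * b / disparity."""
    return focal_px * baseline_m / disparity_px

# Assumed example numbers:
print(depth_from_time_of_flight(20e-9))          # ~3.00 m
print(depth_from_phase(math.pi / 2, 20e6))       # ~1.87 m
print(depth_from_spot_shift(600.0, 0.03, 45.0))  # 0.40 m
```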
In order to filter background noise, a narrow-band filter is usually installed in the optical imaging lens so that the photodetector array only passes incident collimated light beams of a preset wavelength. The preset wavelength may equal the wavelength of the incident collimated beam, or lie anywhere from 50 nanometers below it to 50 nanometers above it. The photodetector array may be arranged periodically or aperiodically. Each photodetector, together with an auxiliary circuit, can measure the time of flight of a collimated beam. Depending on the number of discrete collimated beams, the photodetector array may be a combination of multiple single-point photodetectors or a sensor chip integrating multiple photodetectors. To further improve detector sensitivity, the illumination spot formed by one discrete collimated beam on the target object may correspond to one or more photodetectors. When several photodetectors correspond to the same illumination spot, their signals can be connected by a circuit so that they combine into a photodetector with a larger detection area.
The light detector may be a CMOS light sensor, a CCD light sensor or a SPAD light sensor.
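Combining several photodetectors under one illumination spot, as described above, amounts to summing their signals into one larger effective pixel. A minimal sketch with a hypothetical 2x2 SPAD group and made-up photon counts:

```python
from typing import Dict, Iterable, Tuple

PixelIndex = Tuple[int, int]

def merge_detectors(counts: Dict[PixelIndex, int], group: Iterable[PixelIndex]) -> int:
    """Sum the signals of the photodetectors that share one illumination spot;
    the group then behaves like a single detector with a larger active area."""
    return sum(counts.get(px, 0) for px in group)

# Hypothetical 2x2 group of SPAD pixels under one dot-matrix spot:
photon_counts = {(10, 10): 120, (10, 11): 95, (11, 10): 110, (11, 11): 87}
spot_group = [(10, 10), (10, 11), (11, 10), (11, 11)]
print(merge_detectors(photon_counts, spot_group))  # 412
```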
In an embodiment of the present invention, the depth camera provided by the present invention comprises a light projector 1, a light receiver 2 and a driving circuit;
the light projector 1 is configured to project lattice structured light and linear array structured light to a target scene, where a power density of each light beam in the lattice structured light is greater than a power density of each light beam in the linear array structured light;
the optical receiver 2 is configured to receive the lattice structured light and the linear array structured light reflected by any object in the target scene, generate first depth information according to the lattice structured light, and generate second depth information according to the linear array structured light;
and the driving circuit is used for controlling the light projector and the light receiver to be simultaneously switched on or switched off.
In the embodiment of the utility model, the depth camera can be applied to a sweeping robot. The light projector of the depth camera projects dot-matrix (lattice) structured light and linear array structured light to a target scene; the light receiver receives the lattice structured light and the linear array structured light reflected by any object in the target scene, generates first depth information from the higher-power-density lattice structured light, and generates second depth information from the lower-power-density linear array structured light. The controller module can therefore use the first depth information, produced by the lattice structured light that reaches farther, for instant positioning and map construction, and generate obstacle avoidance information from the second depth information produced by the short-range linear array structured light. Instant positioning, map construction and obstacle avoidance are thus realized with a single depth camera module, which reduces the complexity of the product, lowers its manufacturing cost and facilitates its popularization and application.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the utility model. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the utility model.

Claims (10)

1. A depth camera, comprising a light projector, a light receiver and a driving circuit;
the light projector is used for projecting lattice structured light and linear array structured light to a target scene, and the power density of each light beam in the lattice structured light is greater than that of each light beam in the linear array structured light;
the optical receiver is used for receiving the lattice structure light and the linear array structure light reflected by any object in the target scene, generating first depth information according to the lattice structure light, and generating second depth information according to the linear array structure light;
and the driving circuit is used for controlling the light projector and the light receiver to be simultaneously switched on or switched off.
2. The depth camera of claim 1, wherein the lattice-structured light forms a lattice pattern and the line-structured light forms a line pattern;
the lattice pattern is located in an upper side region of the linear array pattern.
3. The depth camera of claim 1, wherein the lattice-structured light forms a lattice pattern and the line-structured light forms a line pattern;
the lattice pattern is positioned in the middle area of two adjacent linear array patterns.
4. The depth camera of claim 1, wherein the light projector comprises a first laser module and a first projection lens;
the first laser module comprises a point laser array group and a line laser array group, wherein the point laser array group is used for projecting dot matrix structured light, and the line laser array group is used for projecting linear array structured light;
the first projection lens is arranged on the light emitting side of the laser module and comprises a first area and a second area, the lattice structured light is received and projected through the first area, and the linear array structured light is received and projected through the second area.
5. The depth camera of claim 1, wherein the light projector comprises a second laser module, a beam splitting device, and a second projection lens;
the second laser module is used for projecting laser beams;
the beam splitting device comprises a first beam splitting area and a second beam splitting area, the first beam splitting area is used for splitting the laser beam into a plurality of laser beams forming the lattice structured light, and the second beam splitting area is used for splitting the laser beam into a plurality of laser beams forming the linear array structured light;
the second projection lens is arranged on the light emitting side of the beam splitting device and comprises a first area and a second area, the lattice structured light is received and projected through the first area, and the linear array structured light is received and projected through the second area.
6. The depth camera according to claim 1, wherein the optical receiver is configured to generate first depth information according to a transmission time or a phase difference of the lattice-structured light, and generate second depth information according to a spot image formed by the linear-structured light.
7. The depth camera of claim 4, wherein the light projector comprises a first laser module and a first projection lens;
the first laser module comprises a point laser array group and a line laser array group, wherein the point laser array group is used for projecting dot matrix structured light, and the line laser array group is used for projecting linear array structured light;
the first projection lens is arranged on the light emitting side of the laser module and comprises a first area, a second area and a third area, wherein the first area is arranged between the second area and the third area, lattice structured light is received and projected through the first area, and linear array structured light is received and projected through the second area and the third area.
8. The depth camera of claim 1, wherein the light projector comprises a second laser module, a beam splitting device, and a second projection lens;
the second laser module is used for projecting laser beams;
the beam splitting device comprises a first beam splitting area and a second beam splitting area, the first beam splitting area is used for splitting the laser beam into a plurality of laser beams forming the lattice structured light, and the second beam splitting area is used for splitting the laser beam into a plurality of laser beams forming the linear array structured light;
the second projection lens is arranged on the light emitting side of the beam splitting device and comprises a first area, a second area and a third area, wherein the first area is arranged between the second area and the third area, lattice structured light is received and projected through the first area, and linear array structured light is received and projected through the second area and the third area.
9. The depth camera of claim 4, wherein the line laser array group comprises a first line laser array group and a second line laser array group;
the first line laser array group is used for projecting first line array structured light; the second line laser array group is used for projecting second line array structure light;
the first projection lens is arranged on the light emitting side of the first laser module and comprises a first area, a second area and a third area, the first area is arranged between the second area and the third area, the lattice structured light is received and projected through the first area, the first linear array structured light is received and projected through the second area, and the second linear array structured light is received and projected through the third area.
10. A floor sweeping robot is characterized by comprising a robot body, a depth camera and a controller module; the depth camera is arranged on the side surface of the robot body;
the depth camera comprises a light projector, a driving circuit and a light receiver;
the light projector is used for projecting lattice structured light and linear array structured light to a target scene, and the power density of each light beam in the lattice structured light is greater than that of each light beam in the linear array structured light;
the optical receiver is used for receiving the lattice structure light and the linear array structure light reflected by any object in the target scene, generating first depth information according to the lattice structure light, and generating second depth information according to the linear array structure light;
the driving circuit is used for controlling the light projector and the light receiver to be simultaneously switched on or switched off;
and the controller module is used for carrying out instant positioning and map construction according to the first depth information and generating obstacle avoidance information according to the second depth information.
CN202122498496.8U 2021-10-18 2021-10-18 Depth camera and sweeping robot Active CN216535127U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202122498496.8U CN216535127U (en) 2021-10-18 2021-10-18 Depth camera and sweeping robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202122498496.8U CN216535127U (en) 2021-10-18 2021-10-18 Depth camera and sweeping robot

Publications (1)

Publication Number Publication Date
CN216535127U true CN216535127U (en) 2022-05-17

Family

ID=81565792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202122498496.8U Active CN216535127U (en) 2021-10-18 2021-10-18 Depth camera and sweeping robot

Country Status (1)

Country Link
CN (1) CN216535127U (en)


Legal Events

Date Code Title Description
GR01 Patent grant