CN214011503U - Light emission module, depth camera and floor sweeping robot - Google Patents


Info

Publication number: CN214011503U
Application number: CN202023317073.3U
Authority: CN (China)
Prior art keywords: vcsel, light, module, vcsel array, depth camera
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Other languages: Chinese (zh)
Inventors: 卢鹏, 王济东, 王定国
Current and original assignee: Yuyao Sunny Optical Intelligence Technology Co Ltd (the listed assignees may be inaccurate)
Application filed by Yuyao Sunny Optical Intelligence Technology Co Ltd; priority to CN202023317073.3U; application granted; publication of CN214011503U

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The application discloses a light emission module, a depth camera, and a floor sweeping robot. The light emission module comprises a VCSEL array. The VCSEL array projects an output light field towards a scene and has a plurality of VCSEL emitting units arranged in an array. The number of VCSEL emitting units arranged in the row direction of the VCSEL array is at least three times the number of VCSEL emitting units arranged in the column direction. The light emission module of the present application can thus project a light field that has a sufficiently large angle of view in the horizontal direction and an angle of view within a specified range in the vertical direction.

Description

Light emission module, depth camera and floor sweeping robot
Technical Field
The application relates to the technical field of machine vision, and more specifically to a light emission module suitable for a floor sweeping robot, a depth camera including the light emission module, and a floor sweeping robot.
Background
The time-of-flight (TOF) technique measures the three-dimensional structure or three-dimensional profile of a target scene either by measuring the time interval t between the emission of a pulse signal from a light emission module and its reception by a light receiving module (pulse ranging), or by measuring the phase difference accumulated as modulated laser light makes one round trip to the target scene (phase-difference ranging). TOF depth cameras serve as high-precision distance measuring equipment and are widely applied in somatosensory control, behavior analysis, monitoring, automatic driving, artificial intelligence, machine vision, automatic 3D modeling, and other fields. However, different application fields or scenarios place different performance requirements on a TOF depth camera, which requires designing different TOF depth cameras for the specific application field or scenario.
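The two ranging relations described above can be sketched numerically. This is an illustrative sketch, not part of the patent text; the function and symbol names (t, delta_phi, f_mod) are assumptions.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_pulse(t: float) -> float:
    """Pulse ranging: one-way distance from the measured round-trip time t (s)."""
    return C * t / 2.0

def distance_from_phase(delta_phi: float, f_mod: float) -> float:
    """Phase-difference ranging: one-way distance from the phase shift
    delta_phi (rad) of light modulated at frequency f_mod (Hz).
    The round-trip delay is delta_phi / (2 * pi * f_mod)."""
    return C * delta_phi / (4.0 * math.pi * f_mod)

# A 10 ns round trip corresponds to about 1.5 m of one-way distance.
print(round(distance_from_pulse(10e-9), 3))  # -> 1.499
```

In practice a TOF sensor averages many such measurements per pixel; the formulas only capture the basic geometry.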
With the rapid development of science and technology, floor sweeping robots are increasingly popular. Among light emission modules applied to sweeping robots, currently mature TOF depth cameras generally have a small field of view. A sweeping robot, however, generally needs a large angle of view in the horizontal direction (the H direction) in order to measure obstacle information over a wide range. At the same time, it needs a small field of view in the vertical direction (the V direction): not only does the sweeping robot have no need for a large V-direction field of view, but a large V-direction field of view can cause ground reflections that degrade the measurement accuracy of the TOF depth camera, or even prevent it from working at all.
SUMMARY OF THE UTILITY MODEL
The technical scheme provided by the utility model can solve, or at least partially solve, the above technical problem.
In one aspect, the present application provides a light emission module including a VCSEL array. A VCSEL array may be used to project an output light field towards a scene and has a plurality of VCSEL emitting units arranged in an array. In the VCSEL array, the number of VCSEL emitting units arranged in the row direction may be at least three times the number of VCSEL emitting units arranged in the column direction.
In some embodiments, the ratio of the angle of the output light field in the horizontal direction to the angle in the vertical direction may be greater than or equal to 8.
In some embodiments, the light emitting module may further include a printed circuit board. The VCSEL array can be disposed on and electrically connected to a printed circuit board.
In some embodiments, the light emitting module may further include a driver. A driver may be disposed on the printed circuit board and used to drive the VCSEL array to produce modulated light.
In some embodiments, the number of VCSEL emitting units arranged in the row direction of the VCSEL array may be four times the number of VCSEL emitting units arranged in the column direction of the VCSEL array.
In some embodiments, the number of VCSEL emitting units arranged in the row direction of the VCSEL array may be eight, and the number of VCSEL emitting units arranged in the column direction of the VCSEL array may be two.
In another aspect, the present application provides a depth camera including a light receiving module and a light emitting module as described above. The light receiving module can be used for receiving the light emitted by the VCSEL array and returned by the scene.
In some embodiments, the light receiving module may include a lens unit and an image sensor. The lens unit may include at least one lens. The image sensor may be configured to convert the received light signal into an electrical signal, and distance information of the scene from the depth camera may be obtained by processing the electrical signal.
In some implementations, the depth camera can also include a housing. The light emitting module and the light receiving module may be disposed within the housing.
In yet another aspect, the present application further provides a sweeping robot, which includes a sweeping robot body and the depth camera as described above. The depth camera may be mounted on a side of the sweeping robot body.
By properly choosing the number of VCSEL emitting units in the row direction and in the column direction of the VCSEL array, the present application enables the light emission module to project towards the scene an output light field with a sufficiently large angle of view in the horizontal direction and an angle of view within a specified range in the vertical direction; that is, the light emission module can project a strip-shaped light spot onto the scene.
In addition, a vertical-cavity surface-emitting laser (VCSEL) has, in single-point emission, a small field of view (FOV), uniform light intensity, and a small edge gradient region. The VCSEL emitting units in the VCSEL array can therefore be arranged so that the projected light field has a large FOV in the horizontal direction, a small FOV in the vertical direction, a small edge gradient region, and well-defined boundaries.
Drawings
The above and other advantages of embodiments of the present application will become apparent from the detailed description with reference to the following drawings, which are intended to illustrate and not to limit exemplary embodiments of the present application. In the drawings:
fig. 1 is a schematic view showing an angle of view of a related art light emission module in a horizontal direction;
FIG. 2 is a schematic diagram of a strip-shaped light spot formed by a light emitting module in the prior art;
figures 3A to 3D show schematic layout diagrams of a VCSEL array according to an embodiment of the present application;
fig. 4 illustrates a schematic view of an angle of view of a light emitting module according to an embodiment of the present application in a horizontal direction;
fig. 5 is a schematic diagram illustrating a stripe-shaped light spot formed by a light emitting module according to an embodiment of the present application;
fig. 6 shows a block diagram of a light emitting module according to an embodiment of the present application;
figure 7 shows a schematic block diagram of a sweeping robot configured with a TOF depth camera in accordance with embodiments of the present application; and
FIG. 8 shows a schematic diagram of a TOF depth camera according to an embodiment of the present application.
Detailed Description
For a better understanding of the present application, various aspects of the present application will be described in more detail with reference to the accompanying drawings. It should be understood that the detailed description is merely illustrative of exemplary embodiments of the present application and does not limit the scope of the present application in any way. In addition, descriptions of features well known in the art may be omitted for the sake of clarity and conciseness.
Like reference numerals refer to like elements throughout the drawings and detailed description. The figures may not be drawn to scale and the relative sizes, proportions and depictions of the elements in the figures may be exaggerated for clarity, illustration and convenience.
It should be noted that the expressions first, second, etc. in this specification are used only to distinguish one feature from another feature, and do not indicate any limitation on the features. Thus, a first direction discussed below can be referred to as a second direction, and likewise, a second direction can also be referred to as a first direction, without departing from the teachings of the present application.
It will be understood that when an element or layer is referred to herein as being "on," "connected to" or "coupled to" another element or layer, it can be directly on, connected or coupled to the other element or layer or intervening elements or layers may be present. When an element is referred to as being "directly on," "directly connected to" or "directly coupled to" another element or layer, there are no intervening elements or layers present. Like numbers refer to like elements throughout the specification. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be further understood that the terms "comprises," "comprising," "includes," "including," "has," "having," "contains" and/or "containing," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Moreover, when a statement such as "at least one of" appears after a list of listed features, the entirety of the listed features is modified rather than modifying individual elements in the list. Furthermore, when describing embodiments of the present application, the use of "may" mean "one or more embodiments of the present application. Also, the term "exemplary" is intended to refer to an example or illustration.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As already explained in the background section, the sweeping robot generally needs to have a large field angle in the H direction in order to measure a large range of obstacle information. Meanwhile, the sweeping robot needs to have a smaller field of view in the V direction to ensure that the depth camera can obtain higher measurement accuracy.
Some existing solutions implement a strip-shaped light field by using a laser together with a light homogenizing sheet. However, the laser-plus-homogenizer solution cannot provide a large field angle in the H direction; it typically provides only a field angle θH of about 5° to about 20° in the H direction, as shown in fig. 1. Moreover, referring to fig. 2, the strip-shaped light field provided by this solution suffers from a large gradient region, poorly defined boundaries, and similar defects.
In addition, since a light homogenizing sheet is required, the light emission module must also include a support structure to hold and fix it, which inevitably increases the weight and volume of the module.
Various aspects of the present application are described in more detail below with reference to the figures.
Fig. 3A to 3D show schematic arrangement patterns of the VCSEL array 211 according to the embodiment of the present application, fig. 4 shows a schematic view of an angle of view of the light emission module 21 according to the embodiment of the present application in a horizontal direction, and fig. 5 shows a schematic view of a stripe-shaped light spot formed by the light emission module 21 according to the embodiment of the present application.
The light emitting module 21 according to the present application may include a VCSEL array 211. The VCSEL array 211 is used to project an output light field towards a scene and may have a plurality of VCSEL emitting units 2110 arranged in an array. In the VCSEL array 211, the number of VCSEL emission units 2110 arranged in the row direction may be at least three times the number of VCSEL emission units 2110 arranged in the column direction.
In some embodiments, the number of VCSEL emitting units arranged in the row direction of the VCSEL array may be four times the number of VCSEL emitting units arranged in the column direction of the VCSEL array.
As shown in fig. 3A to 3D, the VCSEL emitting unit 2110 may be formed as an array of 2 × 7, 2 × 8, 3 × 9, or 3 × 10. It should be understood that the present application is not limited thereto and should also include other arrangements as long as the number of VCSEL emission units arranged in the row direction is at least three times the number of VCSEL emission units arranged in the column direction.
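The arrangement constraint can be expressed as a simple check. The sketch below is illustrative and not from the patent; the function name and tuple layout are assumptions.

```python
def satisfies_claim(n_row_dir: int, n_col_dir: int, ratio: int = 3) -> bool:
    """Claim 1 constraint: the number of VCSEL emitting units arranged in
    the row direction is at least `ratio` times the number arranged in
    the column direction."""
    return n_row_dir >= ratio * n_col_dir

# The 2x7, 2x8, 3x9 and 3x10 arrays of figs. 3A-3D, written here as
# (column-direction count, row-direction count), all satisfy the constraint.
for n_col_dir, n_row_dir in [(2, 7), (2, 8), (3, 9), (3, 10)]:
    assert satisfies_claim(n_row_dir, n_col_dir)

# A 2x5 array would not qualify: 5 < 3 * 2.
assert not satisfies_claim(5, 2)
```

Note that the stricter embodiments (claims 5 and 6) correspond to `ratio = 4`, for example the 2x8 array.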
As shown in fig. 4, the light emission module 21 according to the present application can provide a large angle of view in the H direction. For example, the angle of view θH in the H direction may range from about 100° to about 130°. In addition, since a VCSEL in single-point emission has uniform light intensity and a small edge gradient region, the light field projected by the light emission module 21 likewise has a small edge gradient region and well-defined boundaries, as shown in fig. 5.
Fig. 6 shows a block diagram of the light emitting module 21 according to an embodiment of the present application.
The light emission module 21 according to the present application may further include a printed circuit board (PCB) 212 and a driver 213. The VCSEL array 211 may be disposed on the printed circuit board 212 and electrically connected to it. The driver 213 may also be disposed on the printed circuit board 212 and used to drive the VCSEL array 211 to generate modulated light.
In some embodiments, the light emission module 21 may further include a temperature detection circuit for detecting the temperature of the light source. The temperature detection circuit may include a thermistor. Using a thermistor to detect the light source temperature directly captures the temperature around the VCSEL array, which makes the measurement more accurate while saving devices, board area, and space within the module.
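The patent does not specify how the thermistor reading is converted to a temperature; a common approach is the beta-parameter NTC model sketched below. The function name and the nominal constants (10 kΩ at 25 °C, beta 3950) are assumptions for illustration only.

```python
import math

def thermistor_temp_c(r_measured: float,
                      r0: float = 10_000.0,  # nominal resistance at 25 degC (assumed)
                      beta: float = 3950.0,  # beta constant (assumed)
                      t0_c: float = 25.0) -> float:
    """Beta-parameter NTC model: 1/T = 1/T0 + ln(R/R0) / beta, with T in kelvin."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_measured / r0) / beta
    return 1.0 / inv_t - 273.15

# At the nominal resistance the model returns the nominal temperature, and
# resistance falls as the VCSEL array heats up (NTC behavior).
```

Such a reading would typically feed back into the driver to compensate the VCSEL wavelength and power drift over temperature.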
Fig. 7 shows a schematic block diagram of the sweeping robot 1 configured with the TOF depth camera 20 according to an embodiment of the present application, and fig. 8 shows a schematic diagram of the TOF depth camera 20 according to an embodiment of the present application.
Referring to fig. 7 and 8, the sweeping robot 1 includes a sweeping robot body 10 and a depth camera 20. The depth camera 20 is disposed at a side portion of the sweeping robot body 10.
The depth camera 20 includes a light emitting module 21 and a light receiving module 22. The light emitting module 21 may be a light emitting module including the VCSEL array 211 as described above. The light emission module 21 may be used to project an output light field 210 towards the scene 30. The light receiving module 22 may be used to receive light emitted by the VCSEL array 211 and returned by the scene 30.
In an exemplary embodiment, the ratio between the horizontal angle of view θH and the vertical angle of view θV of the light emission module 21 may be greater than or equal to 8. The output light field 210 projected by the light emission module 21 then forms a narrow strip-shaped light spot on the surface of the scene 30. For example, θH and θV may be 120° and 5°, 120° and 10°, or 120° and 15°, respectively. However, the present application is not limited thereto.
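To see why such a ratio yields a narrow strip, one can compute the patch the light field illuminates on a flat surface at a given distance. The flat-wall model and function name below are illustrative assumptions, not from the patent.

```python
import math

def spot_size_m(theta_h_deg: float, theta_v_deg: float, distance_m: float):
    """Width and height of the illuminated patch on a flat surface facing
    the module at distance_m, for full field angles theta_h and theta_v."""
    width = 2.0 * distance_m * math.tan(math.radians(theta_h_deg) / 2.0)
    height = 2.0 * distance_m * math.tan(math.radians(theta_v_deg) / 2.0)
    return width, height

# With theta_H = 120 deg and theta_V = 10 deg at 1 m, the patch is roughly
# 3.46 m wide but only about 0.17 m tall: a narrow strip-shaped spot.
w, h = spot_size_m(120.0, 10.0, 1.0)
```

Because the tangent grows faster than the angle, the width-to-height ratio of the spot actually exceeds the 8:1 angular ratio at these example values.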
The light emission module 21 of the application can thus have a large angle of view in the horizontal direction and a small angle of view in the vertical direction, meeting the application requirements of the sweeping robot.
The light receiving module 22 may include an image sensor 221 and a lens unit 222. The lens unit 222 may include at least one lens and is disposed in the photosensitive path of the image sensor 221. The image sensor 221 may be used to convert the received light signals into electrical signals; distance information between the scene 30 and the depth camera 20 may be obtained by processing the converted electrical signals. The image sensor 221 may include, for example, a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor.
The depth camera 20 may also include a housing (not shown). The light emission module 21 and the light receiving module 22 may be enclosed in the housing.
The light emission module according to the application can project towards a scene an output light field that has a sufficiently large field angle in the horizontal direction and a field angle within a specified range in the vertical direction, and the output light field has a small edge gradient region and well-defined boundaries.
The above description is only a preferred embodiment of the present application and is illustrative of the principles of the technology employed. It will be understood by those skilled in the art that the scope of the present invention is not limited to the specific combination of the above-mentioned features, but also covers other embodiments formed by any combination of the above-mentioned features or their equivalents without departing from the spirit of the present invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (10)

1. A light emission module, comprising:
a VCSEL array for projecting an output light field towards a scene and having a plurality of VCSEL emitting units arranged in an array,
characterized in that the number of VCSEL emitting units arranged in the row direction of the VCSEL array is at least three times the number of VCSEL emitting units arranged in the column direction of the VCSEL array.
2. The light emission module of claim 1, wherein the ratio of the angle of the output light field in the horizontal direction to the angle in the vertical direction is greater than or equal to 8.
3. The optical transmit module of claim 1, further comprising a printed circuit board,
the VCSEL array is disposed on and electrically connected to the printed circuit board.
4. The optical transmit module of claim 3, further comprising a driver,
the driver is arranged on the printed circuit board and used for driving the VCSEL array to generate modulated light.
5. The optical transmit module of claim 1, wherein the number of VCSEL emitting units arranged in the row direction of the VCSEL array is four times the number of VCSEL emitting units arranged in the column direction of the VCSEL array.
6. The light emitting module of claim 5, wherein the number of VCSEL emitting units arranged in the row direction of the VCSEL array is eight, and the number of VCSEL emitting units arranged in the column direction of the VCSEL array is two.
7. A depth camera, comprising:
the light emission module according to any one of claims 1 to 6; and
a light receiving module for receiving the light emitted by the VCSEL array and returned by the scene.
8. The depth camera of claim 7, wherein the light receiving module comprises:
a lens unit including at least one lens; and
an image sensor to convert the received light signal into an electrical signal and obtain distance information of the scene from the depth camera by processing the electrical signal.
9. The depth camera of claim 8, further comprising:
the shell, the light emission module with the light receiving module sets up in the shell.
10. A floor sweeping robot, characterized in that the floor sweeping robot comprises:
a sweeping robot body; and
the depth camera of any of claims 7-9, mounted on a side of the sweeping robot body.
CN202023317073.3U 2020-12-31 2020-12-31 Light emission module, depth camera and floor sweeping robot Active CN214011503U (en)

Priority Applications (1)

Application Number: CN202023317073.3U
Priority Date / Filing Date: 2020-12-31
Title: Light emission module, depth camera and floor sweeping robot


Publications (1)

Publication Number: CN214011503U
Publication Date: 2021-08-20

Family

ID=77294846


Country Status (1)

Country Link
CN (1) CN214011503U (en)


Legal Events

Date Code Title Description
GR01 Patent grant