CN115598899A - Linear light beam emitting module, depth camera and control method of depth camera - Google Patents


Info

Publication number
CN115598899A
CN115598899A (Application CN202211250430.XA)
Authority
CN
China
Prior art keywords
emitting module
light
beam emitting
line
depth camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211250430.XA
Other languages
Chinese (zh)
Inventor
郑德金
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orbbec Inc
Original Assignee
Orbbec Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Orbbec Inc filed Critical Orbbec Inc
Priority to CN202211250430.XA priority Critical patent/CN115598899A/en
Publication of CN115598899A publication Critical patent/CN115598899A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/30Collimators
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optics & Photonics (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The present application provides a line beam emitting module, including a light source array, a mask, and a collimating lens. The mask is arranged between the light source array and the collimating lens. The light source array emits a plurality of spot beams, which are modulated by the mask into linear beams; the linear beams are collimated by the collimating lens and projected onto a target area to form a linear light field. This arrangement effectively reduces the influence of the multipath effect on depth acquisition and improves the accuracy of the acquired depth information.

Description

Linear light beam emitting module, depth camera and control method of depth camera
Technical Field
The invention belongs to the technical field of depth cameras, and particularly relates to a line beam emitting module, a depth camera and a control method of the depth camera.
Background
With the development of society, people increasingly pursue an intelligent lifestyle, and smart-home devices are ubiquitous. A sweeping robot, for example, can sweep up garbage automatically and avoid obstacles.
A robot uses multiple sensors to realize navigation and obstacle avoidance. Typically, a single-line lidar measures the surrounding environment to assist navigation, while 3D sensors such as binocular sensors, structured-light sensors, and iTOF (indirect Time-of-Flight) sensors perform short-range depth measurement to assist short-range obstacle avoidance. When a sweeping robot needs multiple sensing functions such as navigation and obstacle avoidance, several sensors must therefore be installed, which increases cost.
Low-cost sensing is dominated by mechanical lidar, but mechanical lidar has low reliability, poor resistance to shock and vibration, a large volume, and a slow scanning speed. In a sweeping robot in particular, the lidar must be accommodated structurally: mounting it on top raises the machine's height, making it harder for the robot to enter and clean low-clearance spaces. And because the robot frequently collides with other objects, anti-vibration and anti-collision measures usually need to be added to the structure, increasing both structural complexity and cost.
Existing iTOF-based depth cameras mainly use a floodlight emitting module together with an area-array image sensor. This yields a relatively high effective spatial resolution, but emitting a flood beam for depth detection produces a multipath effect: at rough or concave surfaces such as wall corners, depth-measurement accuracy is strongly degraded, and the area-array image sensor consumes comparatively more power. This matters especially in a sweeping robot, which runs on a battery and has a tight power budget. Depth cameras based on active binocular structured light are relatively expensive because they need two receiving modules and one emitting module, while passive binocular techniques cannot obtain good depth information in dark environments or in low-texture scenes such as white walls and floors.
How to solve these problems effectively is the focus of current attention.
Disclosure of Invention
The invention provides a line beam emitting module, a depth camera, and a control method of the depth camera, which can at least solve the problem of low depth-information acquisition accuracy of depth cameras in the related art.
A first aspect of the embodiments of the present application provides a line beam emitting module, including a light source array, a mask, and a collimating lens. The mask is arranged between the light source array and the collimating lens. The light source array emits a plurality of spot beams, which are modulated by the mask into linear beams; the linear beams are collimated by the collimating lens and then projected onto a target area to form a linear light field.
In some embodiments, the mask includes a light-transmitting region and a non-light-transmitting region, the light-transmitting region including a plurality of line light-transmitting regions. In some embodiments, the light source array includes a plurality of column light sources, each comprising a plurality of light sources, with the column light sources corresponding one-to-one to the line light-transmitting regions.
A second aspect of the embodiments of the present application provides a depth camera, including a line beam emitting module, an iTOF image sensor, a driving unit, a circuit board, and a control and processing circuit. The line beam emitting module and the iTOF image sensor are arranged on the circuit board. The control and processing circuit controls the driving unit to drive the line beam emitting module to emit linear beams toward the target area; the iTOF image sensor collects the linear beams reflected back by a target object and generates electrical signals; and the control and processing circuit processes the electrical signals to compute the depth information of the target object.
In some embodiments, the control and processing circuit is further configured to control the driving unit to drive the line beam emitting module in a line-scanning emission mode.
In some embodiments, the depth camera further includes a surface beam emitting module for emitting a surface beam toward the target area, and a time-sharing control unit for controlling whether the line beam emitting module or the surface beam emitting module is switched on. The control and processing circuit generates an activation signal according to the working mode and synchronizes it to the iTOF image sensor and the time-sharing control unit; the time-sharing control unit switches on the line beam emitting module or the surface beam emitting module according to the activation signal; the iTOF image sensor outputs a start signal to the driving unit; and the driving unit outputs a laser driving current that drives the line beam emitting module or the surface beam emitting module to emit an optical signal. The working modes include a line-scanning detection mode and a surface-beam detection mode.
A third aspect of the embodiments of the present application provides a depth measurement method applied to the depth camera, the method including: according to the working mode, switching on the line beam emitting module to project linear beams toward a target area, or switching on the surface beam emitting module to emit a surface beam toward the target area; receiving the linear beam or surface beam reflected by the target and generating an electrical signal; and receiving and processing the electrical signal to compute the depth information of the target.
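The three steps of the method above can be sketched as one small routine. This is an illustrative control-flow sketch only; `project_line`, `project_surface`, `receive`, and `compute` are hypothetical stand-ins for the hardware calls, not names from the patent.

```python
def measure_depth(mode, project_line, project_surface, receive, compute):
    """One measurement cycle: project, receive, compute depth."""
    emit = project_line if mode == "line" else project_surface
    emit()                      # step 1: project the beam chosen by the working mode
    electrical = receive()      # step 2: reflected light converted to an electrical signal
    return compute(electrical)  # step 3: process the signal into depth information
```

In a real device each callable would talk to the driver unit, the iTOF sensor, and the control and processing circuit respectively.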
It can be seen from the above that, in the line beam emitting module and depth-camera depth measurement method provided by this application, a light source array emits a plurality of spot beams, the spot beams are modulated by a mask into linear beams, and the linear beams are collimated by a collimating lens and projected onto a target area to form a linear light field. When the receiving module collects the returned light, this arrangement effectively reduces the influence of the multipath effect on depth acquisition, improving the detection accuracy of the depth information.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from them without inventive effort.
Fig. 1 is a schematic structural diagram of a depth camera according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a line beam emitting module according to an embodiment of the present disclosure;
fig. 3 is a diagram illustrating a relationship between a linear array region and a column light source provided in an embodiment of the present application;
FIG. 4 is a diagram illustrating an effect of a linear beam of a depth camera according to an embodiment of the present disclosure;
FIG. 5 is a block diagram of a depth camera according to an embodiment of the present disclosure;
fig. 6 is another schematic structural diagram of a depth camera according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
FIG. 1 is a schematic diagram of a depth camera according to one embodiment of the invention. The depth camera 10 includes an emitter 11, a collector 12, and a control and processing circuit 13 connected to both. The emitter 11 continuously emits a time-sequence amplitude-modulated beam toward a target; at least part of the emitted beam is reflected by the target to form a reflected beam, at least part of which is received by the collector 12 and converted into an electrical signal. The control and processing circuit 13 synchronizes the trigger signals of the emitter 11 and the collector 12, receives the electrical signal, processes it to calculate the time of flight of the reflected beam relative to the emitted beam, and computes the depth information of the target from that time of flight.
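For an amplitude-modulated iTOF system like the one described, the round-trip phase shift maps directly to depth. A minimal sketch of that relation (the modulation frequency `f_mod` and function names are illustrative, not from the patent):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_depth(dphi: float, f_mod: float) -> float:
    """Depth (m) from a measured phase shift dphi (rad) at modulation frequency f_mod (Hz)."""
    return C * dphi / (4.0 * math.pi * f_mod)

def unambiguous_range(f_mod: float) -> float:
    """Maximum depth before the phase wraps past 2*pi."""
    return C / (2.0 * f_mod)
```

For example, at a 100 MHz modulation frequency the unambiguous range is about 1.5 m, which is consistent with the short-range obstacle-avoidance use case described in the background.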
The emitter 11 includes a light source 111, an emitting optical element 112, a driver 113, and the like. The light source 111 may be a single light source such as a light-emitting diode (LED), an edge-emitting laser (EEL), or a vertical-cavity surface-emitting laser (VCSEL), or a VCSEL array light-source chip formed by fabricating a plurality of VCSELs on a single semiconductor substrate. Under the control of the driver 113 (which may in turn be controlled by the control and processing circuit 13), the light source 111 can be modulated to emit a beam with a certain time-sequence amplitude; in one embodiment, the light source 111 emits, at a certain frequency, a pulse-modulated, square-wave-modulated, or sine-wave-modulated beam. The emitting optical element 112 receives the beam from the light source 111, modulates it, for example by collimation, beam expansion, or diffraction, and projects the beam 30 outward. The emitting optical element 112 may be one or more of a single- or multi-piece lens, a microlens array, a diffractive optical element (DOE), a diffuser, and the like.
The collector 12 includes an iTOF image sensor 121, a filtering unit 122, and a lens unit 123. The lens unit 123 receives at least part of the reflected beam returned by the target object and images it onto the iTOF image sensor 121; the filtering unit 122 is a narrow-band filter matched to the wavelength of the light source to suppress background-light noise in other wavelength bands. In one embodiment, the iTOF image sensor 121 includes at least one pixel, each pixel comprising two or more taps (used to store and read out, or discharge, the charge signals generated by incident photons under the control of the corresponding electrodes), for example 3 taps. Within a single frame period (or single exposure time), the taps are switched in a certain order to collect the corresponding optical signals and convert them into electrical signals, so that in each frame period the iTOF image sensor 121 outputs a raw phase map whose pixel values are the amounts of charge accumulated in the taps. This map contains the phase-difference information of the reflected beam relative to the emitted beam, from which the time of flight of the beam can be determined.
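One common way the tap charges of such a pixel can be demodulated into a phase is three-phase sampling with the taps integrating 120° apart. The patent only says "two or more taps, such as 3 taps", so this particular demodulation formula is an assumption for illustration, not the author's stated method:

```python
import math

def three_tap_phase(q0: float, q1: float, q2: float) -> float:
    """Phase (rad, in [0, 2*pi)) from charges of three taps sampled 0/120/240 degrees apart.

    Assumes q_k = A*cos(phi - 2*pi*k/3) + B, so the ambient offset B cancels.
    """
    phi = math.atan2(math.sqrt(3.0) * (q1 - q2), 2.0 * q0 - q1 - q2)
    return phi % (2.0 * math.pi)
```

The resulting phase would then be converted to depth via the time-of-flight relation given earlier; the constant background term cancels out of both the numerator and denominator, which is why tap-based demodulation tolerates ambient light.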
The control and processing circuit 13 may be a separate dedicated circuit, such as a dedicated SoC, FPGA, or ASIC chip including a CPU, memory, and bus, or it may include general-purpose processing circuitry: when the depth camera is integrated into a smart terminal such as a mobile phone, television, or computer, the terminal's processing circuitry may serve as at least part of the control and processing circuit 13.
Fig. 2 is a schematic structural diagram of the line beam emitting module according to this embodiment. The line beam emitting module 20 includes, arranged in sequence, a light source array 21, a mask 22, and a collimating lens 23. The light source array 21 includes a plurality of column light sources, each comprising a plurality of light sources. Each column light source emits a plurality of spot beams, which are modulated by the mask 22 into a linear beam; the linear beam is collimated by the collimating lens 23 and then projected into the target area. The mask 22 includes a light-transmitting region and a non-light-transmitting region, the light-transmitting region comprising a plurality of line light-transmitting regions, with the column light sources corresponding one-to-one to the line light-transmitting regions; fig. 3 shows a schematic diagram of the line light-transmitting regions 221 and the column light sources 211. The linear beam in the present application mainly refers to a projected beam whose horizontal field angle is much larger than its vertical field angle. Fig. 4a shows the effect of such a beam: the light emitted by each column light source 211 passes through a line light-transmitting region 221 to form a linear beam 41, where the horizontal field angle corresponds to the length direction of the beam and the vertical field angle to its width direction. In some other embodiments, the shape of the outgoing linear beam can be modulated by designing the shape of the line light-transmitting region 221, such as the broken-line beams 42 and 43 shown in figs. 4b and 4c; the pattern of the linear beam can be designed according to specific requirements.
Referring to figs. 2 and 3, for the line beam emitting module 20 to emit linear beams with a uniform light-intensity distribution and the desired field angle, the module's parameters must be configured to meet the product requirements. Specifically, the distance between the light source array 21 and the mask 22 is d1, and the distance between the mask 22 and the collimating lens 23 is d2, which is set equal to the focal length of the collimating lens 23. In one embodiment, as shown in fig. 3, the light source array 21 includes m column light sources, each containing n light sources with a pitch d0 between adjacent light sources (defined here as the center-to-center distance), so the length of a column light source is d = (n-1)*d0. Correspondingly, the light-transmitting region of the mask 22 comprises m line light-transmitting regions, each of width n0, with a pitch V0 between adjacent line regions; the light-transmitting area has length Hm (the length of each line light-transmitting region) and width Vm. To ensure that the laser beams emitted by each column of light sources, after being projected onto the mask 22 and passing through a line light-transmitting region, merge together with a uniform output intensity distribution, the parameters are configured as follows:
(1) The distance d1 between the light source array 21 and the mask 22 satisfies: d1 > d0 / (2*tan(theta/2)).
(2) The width n0 of each line light-transmitting region satisfies: n0 < d0 / (2*tan(theta/2)); otherwise the edges of the output light field are non-uniform. Preferably, n0 = alpha*d0/2, where alpha is a constant determined by the size of the spot emitted by the light source.
(3) The focal length of the collimating lens 23 satisfies: f0 = n0 / (2*tan(delta/2)).
(4) The pitch of adjacent line light-transmitting regions satisfies: V0 = 2*f0*tan(T0/2).
(5) The length of the light-transmitting area satisfies Hm = 2*f0*tan(H/2), and its width satisfies Vm = 2*f0*tan(V/2).
Here theta is the emission (divergence) angle of each light source; delta and T0 are, respectively, the predefined width projection angle of each line and the angular interval between adjacent linear beams of the line beam emitting module; and H and V are, respectively, the predefined horizontal and vertical field angles of the linear light field projected by the module.
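The geometric relations above can be collected into a small design helper. Because the original formulas are garbled in this copy, the exact grouping assumed here (d1 > d0/(2 tan(theta/2)), f0 = n0/(2 tan(delta/2)), V0 = 2 f0 tan(T0/2), Hm = 2 f0 tan(H/2), Vm = 2 f0 tan(V/2)) is a reading, not a certainty; all angles are in radians and all lengths share one unit.

```python
import math

def min_source_mask_gap(d0: float, theta: float) -> float:
    """Smallest d1 at which adjacent source spots of divergence theta merge on the mask."""
    return d0 / (2.0 * math.tan(theta / 2.0))

def collimator_focal_length(n0: float, delta: float) -> float:
    """f0 = n0 / (2 tan(delta/2)) for a slit width n0 and desired line-width angle delta."""
    return n0 / (2.0 * math.tan(delta / 2.0))

def mask_layout(f0: float, t0: float, h: float, v: float) -> dict:
    """Slit pitch V0 and clear-aperture size Hm x Vm for the given projection angles."""
    return {
        "V0": 2.0 * f0 * math.tan(t0 / 2.0),  # pitch between adjacent line regions
        "Hm": 2.0 * f0 * math.tan(h / 2.0),   # length of the light-transmitting area
        "Vm": 2.0 * f0 * math.tan(v / 2.0),   # width of the light-transmitting area
    }
```

A design check would first pick d1 above `min_source_mask_gap`, then derive f0 from the slit width, then size the mask from the desired field angles.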
Fig. 5 is a schematic structural diagram of the depth camera according to this embodiment. In some embodiments, it comprises the line beam emitting module 20, an acquisition module 30, a driving unit 50, a circuit board 60, and a control and processing circuit (not shown). The acquisition module 30 includes an iTOF image sensor 31, a filter 32, and an imaging lens 33. The imaging lens 33 receives at least part of the reflected beam returned by the target object and images it onto the iTOF image sensor 31; the filter 32 is a narrow-band filter matched to the wavelength of the light source to suppress background-light noise in other wavelength bands. In one embodiment, the light source array 21 and the iTOF image sensor are attached to the upper surface of the circuit board 60 by a COB (Chip on Board) process, i.e., bonded to the upper surface of the circuit board 60 with an adhesive.
The control and processing circuit controls the driving unit 50 to drive the line beam emitting module 20 to project linear beams toward the target area, and controls the iTOF image sensor 31 to collect the linear beams reflected back by the target object and generate electrical signals, from which the control and processing circuit computes the depth information of the target object. A large horizontal projection angle thus yields a large horizontal ranging range, while a small vertical projection angle reduces ground reflection and improves the detection accuracy of the depth camera. In some embodiments, the control and processing circuit also controls the driving unit 50 to drive the line beam emitting module in several different emission modes, such as a single-beam mode, a multi-line-beam mode, and a line-scanning mode. In one embodiment, the line beam emitting module is driven in line-scanning mode: each column of light sources is controlled in turn to emit a beam, so that the module projects one linear beam at a time toward the target scene; as shown in fig. 3, the columns can be fired sequentially from top to bottom or from bottom to top. Correspondingly, the control and processing circuit may divide the pixel array into working areas according to the imaging position of each linear beam on the iTOF image sensor 31, with each column light source corresponding one-to-one to a working area; when a column light source is switched on, its corresponding working area is activated. This zoned emission and zoned reception reduces the total power consumption of a single frame and improves the frame rate.
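The column-sequential scan with matched sensor working areas can be sketched as a simple loop. The callable parameters stand in for the driver and sensor-readout hardware and are illustrative names, not APIs from the patent:

```python
def line_scan(num_columns, fire_column, read_zone, top_to_bottom=True):
    """Fire each column source once and read out only its matching pixel working area.

    fire_column(i): pulse column i of the light source array.
    read_zone(i):   read the sensor working area paired with column i.
    Returns the per-zone readouts assembled into one frame.
    """
    order = range(num_columns) if top_to_bottom else range(num_columns - 1, -1, -1)
    frame = [None] * num_columns
    for col in order:
        fire_column(col)             # driver pulses one column of the array
        frame[col] = read_zone(col)  # only the matching working area integrates
    return frame
```

Because only one column and one working area are active at a time, the per-frame energy is roughly that of a single zone times the number of zones, rather than the whole array integrating for the whole frame, which is the power saving the text describes.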
Using emitted linear beams for depth measurement improves detection distance and precision. Especially in near-ground detection scenarios, the reduced vertical field angle effectively lessens the influence of ground-reflected light on detection accuracy and reduces multipath interference. Emitting several linear beams also raises the optical power density, and line-by-line scanning further reduces crosstalk between adjacent linear beams.
In some other embodiments, partition control may be performed in addition to line-by-line scanning. The partition direction may follow the horizontal or the vertical field angle: as shown by the dotted lines in fig. 4b, the projected linear beams may be divided into three regions with the same number of beams in each, or, as shown by the dotted lines in fig. 4c, into six regions with differing numbers of beams. Partitioning in this way also makes it possible to give the depth camera different detection fields: each region serves as a sub-detection field, and the detection field of view can be adjusted dynamically according to the actual application. By contrast, prior-art depth cameras that project linear beams mainly form them with a wave plate, a DOE, or a diffuser; the resulting beams have a fixed shape and an arc-shaped output light field, can only be scanned line by line, and cannot be partitioned. Forming the linear beams with a mask, as here, allows beams of arbitrary shape to be projected and partitioned detection to be performed; the shape, number, and intensity of the beams projected in each region can be designed for the specific application scenario, an asymmetric output light field can be designed, and the overall utilization efficiency of the light energy is improved.
In some embodiments, the depth camera further includes a surface beam emitting module 40 for emitting a surface beam toward the target area. Compared with depth measurement using linear beams, measurement with a surface beam offers higher resolution but slightly lower accuracy: surface-beam detection suffers more from the multipath effect, accuracy degrades markedly at rough or concave surfaces such as wall corners, and power consumption is relatively high. A depth camera that organically combines a surface beam emitting module with a line beam emitting module is therefore better suited to terminal products and can be adjusted dynamically according to the selected working mode. In one embodiment, the surface beam emitting module 40 includes a point light source, which may be a photodiode, a laser diode, or the like. In another embodiment, the surface beam emitting module 40 includes a light source array and a diffusing element, the spot beams emitted by the array being modulated by the diffusing element into a surface beam. In one embodiment, since the surface beam emitting module 40 is smaller than the line beam emitting module 20, it can be raised on a bracket so that its light-emitting surface is flush with that of the line beam emitting module.
In some embodiments, the depth camera further comprises a time-sharing control unit 70. The control and processing circuit is connected to the time-sharing control unit 70, which in turn is connected to the line beam emitting module and the surface beam emitting module and controls which of them is switched on, giving the depth camera two working modes: a line-scanning detection mode and a surface-beam detection mode. As shown in fig. 6, in one embodiment the control and processing circuit selects a working mode (line-scanning detection or surface-beam detection) and synchronously activates the collector 30 and the time-sharing control unit 70; the time-sharing control unit 70 switches the line beam emitting module or the surface beam emitting module on, and the collector 30 outputs a synchronous start signal. Specifically, the collector 30 outputs the start signal to the driving unit 50, which modulates and outputs the laser driving current; the driving currents for the line beam emitting module and the surface beam emitting module differ in magnitude, and the magnitude can be determined from the start signal. In one embodiment, the time-sharing control unit has two output ports: the first is connected to the line beam emitting module and the second to the surface beam emitting module, and the two ports are configured as complementary outputs, i.e., if the first port outputs a high level, the second outputs a low level, and vice versa.
When the control and processing circuit selects the line-scanning detection mode, it sends a first activation signal to the collector 30 and the time-sharing control unit 70; the first port of the time-sharing control unit 70 outputs a high level and the second port a low level, the collector 30 outputs a first start signal to the driving unit 50, and the driving unit 50 modulates and outputs a first driving current so that the line beam emitting module emits a first optical signal at suitable power. If the control and processing circuit selects the surface-beam detection mode, it sends a second activation signal to the collector 30 and the time-sharing control unit 70; the first port outputs a low level and the second port a high level, the collector 30 outputs a second start signal to the driving unit 50, and the driving unit 50 modulates and outputs a second driving current so that the surface beam emitting module emits a second optical signal at suitable power. When the surface beam is selected, the time-sharing control unit drives the surface beam emitting module to irradiate the target area with a surface beam for depth measurement; when the line beam emitting module is selected, the time-sharing control unit controls it to emit beams sequentially for scanning depth measurement of the spatial region.
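The complementary-port behaviour of the time-sharing unit can be modelled in a few lines. The port names, mode constants, and drive-current values below are assumptions for the sketch; the patent specifies only that the two currents differ and that the two ports always output opposite levels:

```python
from dataclasses import dataclass

LINE_SCAN_MODE, SURFACE_MODE = 1, 2

@dataclass
class TimeSharingUnit:
    line_port_high: bool = False

    def apply(self, activation_signal: int) -> dict:
        """Switch the two complementary output ports according to the activation signal."""
        self.line_port_high = (activation_signal == LINE_SCAN_MODE)
        return {
            "line_port": self.line_port_high,
            "surface_port": not self.line_port_high,  # ports are always opposite
            # hypothetical drive currents in mA, one per emitting module
            "drive_current_mA": 800 if self.line_port_high else 1500,
        }
```

Because the ports are derived from a single boolean, the two emitting modules can never be driven simultaneously, which is the safety property the complementary-output configuration provides.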
The embodiments of the present application also provide an intelligent terminal, such as a sweeping robot, which comprises a processor and the depth camera of the above embodiments. A sweeping robot typically needs both navigation and obstacle avoidance, and configuring it with the depth camera of the embodiments of the present application realizes both functions with a single device. When the surface beam emitting module operates, the depth camera provides a higher effective spatial resolution for identifying small objects, so the sweeping robot can perform short-range obstacle avoidance based on the identification information; when the line beam emitting module operates, the depth camera provides higher-precision depth information for three-dimensional reconstruction, enabling long-range navigation.
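A simple mode-selection policy for the sweeping robot, following the division of labor above, might look like this. The distance threshold and the mode names are illustrative assumptions, not taken from the patent.

```python
def choose_detection_mode(target_distance_m: float,
                          near_threshold_m: float = 0.5) -> str:
    """Pick an operating mode for the sweeping robot (illustrative sketch).

    Surface beam mode: higher effective spatial resolution, suited to
    short-range obstacle avoidance. Line scanning mode: higher-precision
    depth, suited to long-range navigation and 3D reconstruction.
    """
    if target_distance_m < near_threshold_m:
        return "surface_beam_detection"
    return "line_scanning_detection"
```

A real robot would feed this choice to the control and processing circuit, which then activates the time-sharing control unit accordingly.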
In this embodiment, the circuit board is an FPC (Flexible Printed Circuit), which can be bent to fit the structure of the sweeping robot. The user can select the target operating mode via a display screen or function keys on the device body. A wireless communication unit is arranged on the sweeping robot, and a mobile terminal exchanges data with the processor through the wireless communication unit.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules is only a division of logical functions, and an actual implementation may adopt another division; multiple modules or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or modules, and may be electrical, mechanical, or in another form.
Modules described as separate parts may or may not be physically separate, and parts shown as modules may or may not be physical modules; they may be located in one place or distributed over a plurality of network nodes. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional modules in the embodiments of the present application may be integrated into one processing module, each module may exist physically on its own, or two or more modules may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module.
In the above embodiments, each embodiment is described with its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In view of the above description of the depth camera provided by the present invention, those skilled in the art will recognize that changes may be made in the embodiments and applications of the depth camera according to the teachings of the present invention.

Claims (10)

1. A line beam emitting module, comprising: a light source array, a mask, and a collimating lens;
the mask is arranged between the light source array and the collimating lens;
the light source array is configured to emit a plurality of spot beams, which are modulated by the mask to form line beams;
the line beams are collimated by the collimating lens and then projected onto a target area to form a line-shaped light field.
2. The line beam emitting module of claim 1, wherein the mask comprises a light-transmissive region and a non-light-transmissive region, the light-transmissive region comprising a plurality of line-shaped transmissive regions.
3. The line beam emitting module of claim 2, wherein the light source array comprises a plurality of columns of light sources, each column comprising a plurality of light sources, the columns of light sources corresponding one-to-one with the line-shaped transmissive regions.
4. A depth camera, comprising: the line beam emitting module of any one of claims 1-3, an iTOF image sensor, a driving unit, a circuit board, and a control and processing circuit;
the line beam emitting module and the iTOF image sensor are mounted on the circuit board;
the control and processing circuit is configured to control the driving unit to drive the line beam emitting module to emit line beams to a target area;
the iTOF image sensor is configured to collect the line beam reflected back by a target object and generate an electric signal; and
the control and processing circuit processes the electric signal to calculate depth information of the target object.
5. The depth camera of claim 4, wherein the control and processing circuit is further configured to control the driving unit to drive the line beam emitting module in a line scanning emission mode.
6. The depth camera of claim 4, further comprising a surface beam emitting module configured to emit a surface beam toward a target area.
7. The depth camera of claim 6, further comprising a time-sharing control unit configured to control the switching on of the line beam emitting module or the surface beam emitting module.
8. The depth camera of claim 7, wherein the control and processing circuit generates, according to an operating mode, activation signals that are synchronously sent to the iTOF image sensor and the time-sharing control unit;
the time-sharing control unit switches on the line beam emitting module or the surface beam emitting module according to the activation signal;
the iTOF image sensor outputs a start signal to the driving unit; and
the driving unit outputs a laser driving current that drives the line beam emitting module or the surface beam emitting module to emit an optical signal.
9. The depth camera of claim 8, wherein the operating modes include a line scanning detection mode and a surface beam detection mode.
10. A depth measurement method for use with a depth camera, the method comprising:
according to the operating mode, switching on the line beam emitting module to project a line beam onto a target area, or switching on the surface beam emitting module to emit a surface beam toward the target area;
receiving the line beam or the surface beam reflected by a target and generating an electric signal; and
processing the electric signal to calculate depth information corresponding to the target.
CN202211250430.XA 2022-10-12 2022-10-12 Linear light beam emitting module, depth camera and control method of depth camera Pending CN115598899A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211250430.XA CN115598899A (en) 2022-10-12 2022-10-12 Linear light beam emitting module, depth camera and control method of depth camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211250430.XA CN115598899A (en) 2022-10-12 2022-10-12 Linear light beam emitting module, depth camera and control method of depth camera

Publications (1)

Publication Number Publication Date
CN115598899A true CN115598899A (en) 2023-01-13

Family

ID=84845999

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211250430.XA Pending CN115598899A (en) 2022-10-12 2022-10-12 Linear light beam emitting module, depth camera and control method of depth camera

Country Status (1)

Country Link
CN (1) CN115598899A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116532523A (en) * 2023-06-28 2023-08-04 广州本金机电设备有限公司 Intelligent control longmen system of bending
CN116532523B (en) * 2023-06-28 2023-09-19 广州本金机电设备有限公司 Intelligent control longmen system of bending

Similar Documents

Publication Publication Date Title
CN111722241B (en) Multi-line scanning distance measuring system, method and electronic equipment
CN111025317B (en) Adjustable depth measuring device and measuring method
CN111142088B (en) Light emitting unit, depth measuring device and method
EP3424279B1 (en) Curved array of light-emitting elements for sweeping out an angular range
JP2022516854A (en) Solid-state electron-scanning laser array with high-side and low-side switches for increased channels
CN109343070A (en) Time flight depth camera
CN111487639B (en) Laser ranging device and method
CN111123289B (en) Depth measuring device and measuring method
EP3424278B1 (en) Staggered array of light-emitting elements for sweeping out an angular range
EP3786707A1 (en) Projection module and terminal
CN209167538U (en) Time flight depth camera
CN111458717A (en) TOF depth measuring device and method and electronic equipment
CN212694038U (en) TOF depth measuring device and electronic equipment
CN209894976U (en) Time flight depth camera and electronic equipment
CN115598899A (en) Linear light beam emitting module, depth camera and control method of depth camera
CN111323787A (en) Detection device and method
US20230393245A1 (en) Integrated long-range narrow-fov and short-range wide-fov solid-state flash lidar system
KR102567502B1 (en) Time of flight apparatus
CN114935743B (en) Emission module, photoelectric detection device and electronic equipment
CN114935742B (en) Emission module, photoelectric detection device and electronic equipment
WO2023065589A1 (en) Ranging system and ranging method
CN210090674U (en) Distance measuring system
CN219302660U (en) Scanning laser radar
CN218852613U (en) Light emitter, iTOF module and robot of sweeping floor
CN220141574U (en) Radar assembly and robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination