CN111458717A - TOF depth measuring device and method and electronic equipment - Google Patents
- Publication number
- CN111458717A (application CN202010311680.4A)
- Authority
- CN
- China
- Prior art keywords
- light beam
- light
- target object
- light beams
- line
- Prior art date
- Legal status (listed status; not a legal conclusion)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4814—Constructional features of transmitters alone
- G01S7/4817—Constructional features relating to scanning
- G01S7/483—Details of pulse systems
- G01S7/484—Transmitters
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
The invention discloses a TOF depth measuring device comprising a transmitting module, an acquisition module, and a control and processor. The transmitting module emits a light beam and comprises a light source and a beam scanner; the beam emitted by the light source is deflected by the beam scanner so as to irradiate a given area of a target object. The acquisition module collects the beam reflected by the target object and comprises an image sensor consisting of a pixel array. The control and processor controls the beam scanner to deflect the beam onto a given area of the target object, activates the pixels of the corresponding area in the image sensor according to the deflected beam so as to respond to the optical charges accumulated from the beam reflected by the given area, calculates a phase difference based on those charges to obtain the distance of the target object, and outputs a depth image of the target object. The TOF depth measuring device can improve the image signal-to-noise ratio and resolution while reducing power consumption.
Description
Technical Field
The invention relates to the technical field of three-dimensional imaging, in particular to a TOF depth measuring device and method and electronic equipment.
Background
TOF stands for Time-of-Flight; TOF ranging achieves accurate distance measurement by measuring the round-trip flight time of a light pulse between a transmitting/receiving device and a target object. Measuring the time of flight of light directly is known as D-TOF (direct TOF), also called direct TOF ranging. Periodically modulating the emitted light signal, measuring the phase delay of the reflected signal relative to the emitted signal, and computing the time of flight from that phase delay is known as I-TOF (indirect TOF), also called indirect TOF ranging or phase TOF ranging.
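As a numerical illustration of the indirect (phase) TOF principle just described, the sketch below recovers distance from the phase delay of a periodically modulated signal. The 20 MHz modulation frequency and the pi/2 phase delay are hypothetical example values, not parameters taken from this patent:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(phase_delay_rad: float, mod_freq_hz: float) -> float:
    """Indirect TOF: recover the round-trip flight time from the phase
    delay of the reflected signal, then halve for the one-way distance."""
    time_of_flight = phase_delay_rad / (2 * math.pi * mod_freq_hz)
    return C * time_of_flight / 2

# Example: a pi/2 phase delay measured at a 20 MHz modulation frequency
d = itof_distance(math.pi / 2, 20e6)
print(round(d, 3))  # about 1.874 m
```

Note the unambiguous range at this frequency is c / (2 f) ≈ 7.5 m; phase delays wrap around beyond that.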
An existing TOF measuring device based on the I-TOF technique generally includes a transmitting module that provides flood or dot-matrix illumination to a target space and a collecting module that images the reflected light beam; the depth measuring device calculates a phase difference from the reflected light signal to obtain the distance of an object.
In the scheme disclosed in Chinese patent application No. 201911032055.X, the emission module provides flood illumination for the target space; because flood illumination covers the field of view to the maximum extent, each pixel in the collection module can acquire a relatively effective light signal and compute the corresponding depth information. However, TOF measuring devices using flood illumination are susceptible to interference from ambient light and multipath, resulting in poor measurement accuracy.
In the scheme of Chinese patent application No. 201811393403.1, the transmitting module provides dot-matrix illumination for the target space; dot-matrix illumination concentrates energy into individual points, and the sparse distribution between points effectively improves the image signal-to-noise ratio and reduces multipath effects. However, because the dot matrix cannot cover all pixels, only a portion of the pixels can measure valid depth data, which reduces image resolution.
The above background disclosure is only for the purpose of assisting understanding of the inventive concept and technical solutions of the present invention, and does not necessarily belong to the prior art of the present patent application, and should not be used for evaluating the novelty and inventive step of the present application in the case that there is no clear evidence that the above content is disclosed at the filing date of the present patent application.
Disclosure of Invention
The present invention is directed to a TOF depth measuring apparatus, a TOF depth measuring method and an electronic device, so as to solve at least one of the above-mentioned problems of the background art.
In order to achieve the above purpose, the technical solution of the embodiment of the present invention is realized as follows:
a TOF depth measurement apparatus comprising:
the emission module is used for emitting a light beam towards a given area of the target object; the emission module comprises a light source and a light beam scanner, and the light beam scanner deflects a light beam emitted by the light source to irradiate a given area of the target object;
the acquisition module is used for acquiring the light beam reflected by the target object; the acquisition module comprises an image sensor consisting of a pixel array;
and the control and processor is respectively connected with the emission module and the acquisition module and is used for controlling the light beam scanner to deflect the light beam to irradiate a given area of the target object, activating pixels of the corresponding area in the image sensor according to the light beam formed after deflection so as to respond to optical charges accumulated by the light beam reflected by the given area, calculating a phase difference based on the optical charges to obtain the distance of the target object and outputting a depth image of the target object.
In some embodiments, the emission module further comprises a lens; the light source generates a line beam through the lens, and the line beam is line-scanned by the beam scanner to irradiate the target object.
In some embodiments, the lens is a cylindrical lens, the light beam emitted from the light source passes through the cylindrical lens to form a first line beam, and the first line beam is deflected by the beam scanner for multiple times to obtain a projection pattern composed of multiple second line beams.
In some embodiments, the emission module further comprises a collimating lens and a diffractive optical element; the light beam emitted by the light source is collimated by the collimating lens and then emits a collimated light beam, the collimated light beam is diffracted by the diffractive optical element to form a first line-string light beam or a first dot-matrix light beam, and the first line-string light beam or the first dot-matrix light beam is deflected for multiple times by the light beam scanner to obtain a projection pattern formed by a plurality of second line-string light beams or second dot-matrix light beams.
In some embodiments, the second lattice of light beams is regularly arranged.
The other technical scheme of the invention is as follows:
a TOF depth measurement method comprising the steps of:
emitting a light beam towards a given area of a target object through an emission module; wherein the emission module comprises a light source and a beam scanner, and the light beam emitted by the light source is deflected by the beam scanner to irradiate a given area of the target object;
collecting the light beam reflected by the target object through a collecting module; wherein the acquisition module comprises an image sensor consisting of a pixel array;
the control and processor activates pixels of a corresponding region in the image sensor according to the deflected light beam, responds to the photo-charges accumulated by the reflected light beam, and calculates a phase difference based on the photo-charges to acquire the distance of the target object.
In some embodiments, the emission module further includes a cylindrical lens, the light beam emitted from the light source passes through the cylindrical lens to form a first linear light beam, and the first linear light beam is deflected by the light beam scanner for multiple times to obtain a projection pattern composed of multiple second linear light beams.
In some embodiments, the emission module further includes a collimating lens and a diffractive optical element; the light beam emitted from the light source is collimated by the collimating lens into a collimated light beam, the collimated light beam is diffracted by the diffractive optical element to form a first line-string light beam or a first dot-matrix light beam, and the first line-string or dot-matrix light beam is deflected by the light beam scanner multiple times to obtain a projection pattern composed of a plurality of second line-string light beams or second dot-matrix light beams.
In some embodiments, the control and processor controls the number of deflections, the deflection angles, and the deflection sequence for each direction of the beam scanner to obtain projection patterns of different densities and different field angles.
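A minimal sketch of how controlling the number and spacing of deflections trades projection density against field angle, as the embodiment above describes. The even angular spacing and the specific counts are illustrative assumptions, not values prescribed by the claims:

```python
def scan_angles(num_deflections: int, field_of_view_deg: float) -> list[float]:
    """Evenly spaced deflection angles across one scan direction: more
    deflections inside the same field of view give a denser projection
    pattern; a wider field of view at a fixed count gives a sparser one."""
    if num_deflections == 1:
        return [0.0]
    step = field_of_view_deg / (num_deflections - 1)
    half = field_of_view_deg / 2
    return [-half + i * step for i in range(num_deflections)]

dense = scan_angles(9, 40.0)   # 5 degrees between line beams
sparse = scan_angles(5, 40.0)  # 10 degrees between line beams
print(sparse)
```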
The other technical scheme of the invention is as follows:
an electronic device, comprising: the TOF depth measuring device comprises a shell, a screen and the TOF depth measuring device in the technical scheme; the emission module and the collection module of the TOF depth measuring device are arranged on the same surface of the electronic equipment and used for emitting light beams to the target object, receiving the light beams reflected back by the target object and forming electric signals.
The technical scheme of the invention has the beneficial effects that:
compared with the prior art, the TOF depth measuring device can improve the image signal-to-noise ratio and improve the image resolution and reduce the power consumption.
Drawings
To more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural view of a TOF depth measuring apparatus according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a transmitting module of the TOF depth measuring device of the embodiment of fig. 1.
Fig. 3a-3c are schematic diagrams of a projected pattern according to one embodiment of the present invention.
FIG. 4 is a schematic diagram of an image sensor pixel array in accordance with one embodiment of the present invention.
FIG. 5 is a flow chart of a TOF depth measurement method according to another embodiment of the present invention.
Fig. 6 is a diagram of an electronic device employing the TOF depth measuring device of the embodiment of fig. 1.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects to be solved by the embodiments of the present invention more clearly apparent, the present invention is further described in detail below with reference to the accompanying drawings and the embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly connected to the other element or be indirectly connected to the other element. The connection may be for fixation or for circuit connection.
It is to be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for convenience in describing the embodiments of the present invention and to simplify the description, and are not intended to indicate or imply that the referenced device or element must have a particular orientation, be constructed and operated in a particular orientation, and are not to be construed as limiting the present invention.
Furthermore, the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
Referring to fig. 1 and fig. 2, a TOF depth measuring apparatus is provided as an embodiment of the present invention, fig. 1 is a schematic structural diagram of the TOF depth measuring apparatus according to an embodiment of the present invention, and fig. 2 is a schematic diagram of a transmitting module of the TOF depth measuring apparatus. The TOF depth measuring device 10 includes a transmitting module 11, a collecting module 12, and a control and processor 13 connected to the transmitting module 11 and the collecting module 12, respectively. As shown in fig. 2, the emission module 11 is used for emitting a light beam toward a given area of a target object, and includes a light source 101, a light beam scanner 102; the light source 101 is configured to emit a light beam, and the light beam scanner 102 is configured to receive the light beam emitted by the light source, deflect the light beam, and project the light beam to the target object 20; the acquisition module comprises an image sensor 121 consisting of a pixel array for acquiring the light beam 40 reflected by the target object 20; the control and processor 13 controls the beam scanner 102 to deflect the light beam to illuminate a given area of the target object 20, and activates pixels of a corresponding area in the image sensor 121 according to the deflected light beam, to respond to the photo-charges accumulated by the light beam reflected back by the given area, to calculate a phase difference based on the photo-charges to acquire a distance of the target object 20, and to output a depth image of the target object 20.
The emitting module 11 further includes a light source driver (not shown in the figure) for driving the light source to emit light beams. The light source may be a light-emitting diode (LED), an edge-emitting laser (EEL), a vertical-cavity surface-emitting laser (VCSEL), or a light source array composed of multiple such sources; the light beams emitted by the light source may be visible, infrared, or ultraviolet light.
The light source 101 emits a light beam, which the beam scanner 102 receives and directs to the target object 20 by rotating about a single axis or multiple axes. In one embodiment, the beam scanner 102 may be a liquid crystal polarization grating (LCPG), a micro-electro-mechanical systems (MEMS) scanner, or the like. Preferably, the beam scanner 102 is a MEMS scanner: because MEMS devices combine a very high scanning frequency with a small volume, the scanner can be compact yet high-performing. In some embodiments, the MEMS scanner may scan at a frequency of 1 MHz to 20 MHz, providing sufficient spatial and temporal resolution. Through the configuration of the light source driver and the beam scanner 102, the light beam emitted by the light source 101 may be spatially and temporally modulated to produce a variety of pattern beams, such as a regular spot pattern, a line beam pattern, or a line-string beam pattern.
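To give a feel for the scanning-frequency figures above, the following sketch estimates how many line positions such a scanner could visit per depth frame, under the simplifying assumption (not stated in the patent) of one line position per scanner cycle:

```python
def lines_per_frame(scan_freq_hz: float, frame_rate_hz: float) -> int:
    """Number of line (deflection) positions the scanner can visit within
    one depth frame, assuming one position per scanner cycle."""
    return int(scan_freq_hz // frame_rate_hz)

# A 1 MHz MEMS scanner feeding a 30 fps depth camera
print(lines_per_frame(1e6, 30))
```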
The acquisition module 12 includes a TOF image sensor 121, a lens unit, and a filter (not shown in the figure); wherein the lens unit receives and images at least part of the light beam reflected by the target object 20 on at least part of the TOF image sensor, and the filter is a narrow-band filter matched with the wavelength of the light source and used for suppressing background light noise of other wave bands. The TOF image sensor may be an image sensor composed of a Charge Coupled Device (CCD), a Complementary Metal Oxide Semiconductor (CMOS), an Avalanche Diode (AD), a Single Photon Avalanche Diode (SPAD), etc., with an array size representing the resolution of the depth camera, e.g., 320x240, etc. Generally, a readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and the like is also included in connection with the image sensor 121.
In some embodiments, the TOF image sensor comprises at least one pixel, each pixel comprising two or more taps (tap) for storing and reading, or draining, the charge signals generated by incident photons under the control of the respective electrodes. For example, a pixel may comprise 2 taps that are switched sequentially within a single frame period (or a single exposure time) to collect the corresponding photons, receiving the optical signal and converting it into an electrical signal.
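The two-tap mechanism described above can be sketched numerically. The sinusoidal signal model and the half-period demodulation windows below are illustrative assumptions, not details from the patent:

```python
import math

def integrate_taps(phase_delay: float, steps: int = 10_000) -> tuple[float, float]:
    """Two-tap pixel sketch: the incident signal s(t) = 1 + cos(t - phase_delay)
    (a hypothetical sinusoidal model) is integrated into tap A while its
    demodulation window is open and into tap B for the complementary window."""
    q_a = q_b = 0.0
    for i in range(steps):
        t = 2 * math.pi * i / steps
        s = 1.0 + math.cos(t - phase_delay)
        if math.cos(t) >= 0:   # tap A window, centred on the emitted peak
            q_a += s / steps
        else:                  # tap B collects the complementary half-period
            q_b += s / steps
    return q_a, q_b

q_a, q_b = integrate_taps(0.0)
# with zero phase delay, tap A collects the larger share of the charge
```

The charge imbalance between the two taps encodes the phase delay, which is what the control and processor later demodulates.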
The control and processor 13 may be a separate dedicated circuit, such as a dedicated SOC chip, an FPGA chip, an ASIC chip, etc. including a CPU, a memory, a bus, etc., or may include a general-purpose processing circuit, such as a processing circuit in a smart terminal, such as a mobile phone, a television, a computer, etc., as at least a part of the control and processor 13 when the depth camera is integrated into the smart terminal.
The control and processor 13 is used to provide the emission signal required when the light source emits laser light, and the light source emits a light beam to the target object 20 under the control of the emission signal.
In some embodiments, the control and processor 13 provides a demodulated signal (acquisition signal) for each tap in each pixel of the TOF image sensor, the tap acquiring an electrical signal generated by the reflected light beam reflected back by the target object 20 under control of the demodulated signal. The electrical signal is related to the intensity of the reflected beam and the control and processor 13 processes the electrical signal and calculates the phase difference to obtain the distance of the target object 20.
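The patent does not fix a particular demodulation scheme; the following sketch uses the common four-phase approach to recover the phase difference from four tap charges and convert it to a distance (the 20 MHz modulation frequency and the synthetic charge model are hypothetical):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_from_four_taps(q0: float, q90: float, q180: float, q270: float) -> float:
    """Phase delay of the reflected signal, estimated from charges sampled
    at four demodulation phases (0, 90, 180, 270 degrees)."""
    return math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)

def distance_m(q0: float, q90: float, q180: float, q270: float,
               mod_freq_hz: float) -> float:
    """Distance from phase delay: d = c * phi / (4 * pi * f_mod)."""
    phi = phase_from_four_taps(q0, q90, q180, q270)
    return C * phi / (4 * math.pi * mod_freq_hz)

# Synthetic tap charges for a known phase delay of 1.0 rad:
phi_true = 1.0
q = [1 + math.cos(phi_true - s)
     for s in (0.0, math.pi / 2, math.pi, 3 * math.pi / 2)]
print(round(phase_from_four_taps(*q), 6))  # recovers 1.0
```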
In one embodiment, the emission module 11 includes a lens (not shown), and the light source 101 generates a line beam through the lens; the line beam is line-scanned by the beam scanner 102 to illuminate the target object 20. As shown in fig. 3a, in this embodiment the lens is a cylindrical lens: the light source 101 generates a first line beam through the cylindrical lens, and the beam scanner 102 receives the first line beam and deflects it to form a second line beam 301. Assuming the angle of the first deflection performed by the beam scanner 102 is 0 degrees, the resulting second line beam 301 is shown as the solid line in fig. 3a; the first line beam is then deflected again by the beam scanner 102 at a certain deflection angle, forming a further second line beam 301, shown as the dashed line in fig. 3a. All second line beams are equivalent; each is a second line beam formed by deflection by the beam scanner 102. The projection pattern 30 composed of the multiple second line beams formed after multiple deflections has a larger field of view than the first line beam would have without the beam scanner 102, so an image with high signal-to-noise ratio and high resolution can be obtained. The lens may, of course, be any other combination of lenses that produces a line beam.
In one embodiment, the emission module 11 includes a lens and a diffractive optical element (DOE) (not shown). The light source 101 emits a collimated light beam after being collimated by the lens, and the collimated beam is diffracted by the DOE to form a line-string beam, consisting of a plurality of connected spots, that illuminates the target object 20. Specifically, as shown in fig. 3b, the collimated beam is diffracted by the DOE to form a first line-string beam of connected spots; the beam scanner 102 receives the first line-string beam and deflects it to form a second line-string beam. Assuming the first deflection angle is 0 degrees, the resulting second line-string beam is the line-string beam 302 shown by the solid lines in fig. 3b; the first line-string beam is then deflected again by the beam scanner 102 at a certain deflection angle, forming a further second line-string beam, i.e. the line-string beam 302 indicated by dashed lines in fig. 3b. All line-string beams are equivalent; each is a second line-string beam formed by deflection by the beam scanner 102, drawn as dashed or solid lines in the figure only for ease of illustration. The projection pattern 30 composed of the multiple second line-string beams formed after multiple deflections has a larger field of view than the first line-string beam would have without the beam scanner 102, so an image with high signal-to-noise ratio and high resolution can be acquired.
In one embodiment, the light source 101 emits a collimated light beam after being collimated by the lens, and the collimated beam is then diffracted by the DOE to form a dot-matrix pattern of multiple spots that illuminates the target object 20. As shown in fig. 3c, the collimated beam is diffracted by the DOE to form a first dot-matrix beam; the beam scanner 102 receives the first dot-matrix beam and deflects it to form a second dot-matrix beam. Assuming the first deflection angle is 0 degrees, the resulting beam is the second dot-matrix beam formed by the solid circles 303 in fig. 3c; the first dot-matrix beam is then deflected again by the beam scanner 102 at a certain deflection angle, forming a further second dot-matrix beam, i.e. the one composed of the dashed circles 303 in fig. 3c. All dot-matrix beams are equivalent; each is a second dot-matrix beam formed by deflection by the beam scanner 102. The projection pattern 30 composed of the multiple second dot-matrix beams formed after multiple deflections has a higher density and a larger field of view than the first dot-matrix beam would have without the beam scanner 102, so an image with high signal-to-noise ratio and high resolution can be acquired. The second dot-matrix beams may be arranged in one or two dimensions, regularly or irregularly; a regular arrangement is preferred, since it gives a more uniform distribution of depth values.
As shown in figs. 3a to 3c, the light source may be a single source or multiple sources. With multiple sources, multiple first beams can be formed at a time and deflected by the beam scanner 102 into multiple second beams; the required scanning angle is then smaller and the scanning faster, which can improve measurement accuracy. Depending on the requirements of the actual usage scenario, the number of deflections, the deflection angles, and the deflection sequence in each direction of the beam scanner 102 can be controlled to realize projections of different densities and different field angles, so as to obtain an image with high signal-to-noise ratio and high resolution.
For the projection pattern 30 projected onto the target object 20 by the emission module 11 according to the schemes of figs. 3a to 3c, the following takes the second line beam 301 of fig. 3a, formed by deflecting the first line beam with the beam scanner 102, as an example to describe specifically how the control and processor activates pixels of the corresponding areas of the image sensor 121 in a time-sharing manner.
The light source 101 generates a first line beam through the cylindrical lens, and the beam scanner 102 receives and deflects it to form a plurality of second line beams 301. The control and processor 13 controls a second line beam 301 to irradiate a given area of the target object 20 and activates the pixels in the corresponding area of the image sensor 121 based on that beam, as shown in fig. 4. Assuming one deflected beam 301 is formed by a single deflection of the beam scanner 102 and irradiates a given region of the target object 20, the control and processor 13 activates the pixels of the corresponding region 401 in the image sensor 121 to accumulate photo-charges in response to the light reflected back from that region. In fig. 4 each second line beam covers approximately 13 × 2 = 26 pixels; other sizes are possible, and the embodiments of the present invention impose no particular limit.
Similarly, when the next second line beam 301 is deflected by the beam scanner 102 to irradiate another region of the target object 20, the control and processor 13 activates the pixels of another corresponding region in the image sensor 121 to accumulate photo-charges in response to the light beam reflected back by that region.
It will be appreciated that activating the pixels of the respective regions in the image sensor 121 in a time-sharing manner greatly reduces power consumption, and that multiple regions of the target object 20 can be illuminated by multiple deflections of the beam scanner 102, so that the projection pattern formed by the plurality of second line beams completely covers the target object. The pixels of the respective areas of the image sensor 121 sequentially collect the photo-charges accumulated from the light beams reflected by the target object 20, and the control and processor 13 calculates the phase difference based on these photo-charges to obtain the distance of the target object 20, so that a depth image with a high signal-to-noise ratio and a high resolution can be output.
Based on the TOF depth measuring device in each of the above embodiments, another embodiment of the present invention further provides a TOF depth measurement method. Fig. 5 shows the flow of the TOF depth measurement method; referring to fig. 5, the measurement method comprises the following steps:
S501, emitting a light beam towards a given area of a target object through an emission module; the emission module comprises a light source and a light beam scanner, and the light beam scanner deflects the light beam emitted by the light source to irradiate the given area of the target object;
In an embodiment of the present invention, the lens is a cylindrical lens; the light beam emitted by the light source passes through the cylindrical lens to form a first line light beam, and the first line light beam is deflected multiple times by the light beam scanner to form a projection pattern composed of a plurality of second line light beams, the projection pattern having a larger field angle than the first line light beam.
In one embodiment, the emission module further comprises a collimating lens and a diffractive optical element (DOE). A light beam emitted by the light source is collimated by the collimating lens into a collimated light beam; the collimated light beam is diffracted by the DOE to form a first line light beam or a first dot-matrix light beam, which is deflected multiple times by the light beam scanner to form a projection pattern composed of a plurality of second line light beams or second dot-matrix light beams. The projection pattern has a higher density and/or a larger field angle than the first line light beam or first dot-matrix light beam, and the second dot-matrix light beams are regularly arranged.
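The way repeated deflections multiply the density of the DOE's dot-matrix pattern can be sketched as follows; the dot coordinates and deflection offsets are illustrative assumptions, not values from this disclosure:

```python
def project_pattern(base_dots, deflection_offsets):
    """Tile the first dot-matrix beam emitted by the DOE at every scanner
    deflection: the final projection contains one copy of the base pattern
    per deflection, so its dot count is
    len(base_dots) * len(deflection_offsets)."""
    return [(x + dx, y + dy)
            for (dx, dy) in deflection_offsets
            for (x, y) in base_dots]
```

With offsets chosen to interleave the copies between the original dots, the same mechanism raises pattern density; with offsets beyond the base pattern's extent, it enlarges the field angle instead.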
S502, collecting the light beam reflected by the target object through a collection module; the acquisition module comprises an image sensor consisting of a pixel array;
In some embodiments, the pixels of each corresponding area in the image sensor may sequentially collect the photo-charges accumulated from the light beams reflected back by the target object.
S503, the control and processor activates the pixels of the corresponding area in the image sensor according to the deflected light beam, collects the photo-charges accumulated in response to the reflected light beam, and calculates the phase difference based on the photo-charges to acquire the distance of the target object.
Specifically, the control and processor acquires the photo-charges accumulated from the light beams reflected back by each given region by activating, in a time-sharing manner, the pixels of the image sensor composed of a pixel array, calculates the phase difference based on the photo-charges collected by the pixels to acquire the distance to the target object, and outputs a depth image of the target object. In the embodiment of the invention, the projection pattern formed by the plurality of deflected light beams completely covers the target object, so that a depth image with a high signal-to-noise ratio and a high resolution can be output.
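The phase-difference distance calculation of S503 is commonly realized with four photo-charge samples taken at 0°/90°/180°/270° demodulation offsets (indirect ToF); the following is a minimal sketch of that standard scheme — the four-sample arrangement and the modulation frequency are assumptions, since the disclosure does not fix them:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(q0, q90, q180, q270, f_mod):
    """Distance from four photo-charge samples at 0/90/180/270-degree
    demodulation offsets: phase = atan2(q90 - q270, q0 - q180), then
    d = c * phase / (4 * pi * f_mod). The constant background charge and
    the amplitude cancel in the differences; the unambiguous range is
    c / (2 * f_mod)."""
    phase = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod)
```

At an assumed 20 MHz modulation frequency, for example, the unambiguous range would be about 7.5 m; real sensors additionally calibrate for fixed phase offsets and temperature drift, which this sketch omits.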
As another embodiment of the present invention, there is also provided an electronic apparatus, which may be a tablet, a desktop device, a portable device, a wearable device, a vehicle-mounted device, a robot, etc. In particular, the device may be a laptop or an electronic device that performs gesture recognition or biometric recognition. In other examples, the device may be a head-mounted device that identifies objects or hazards in the user's surroundings for safety; for example, in a virtual reality system that occludes the user's view of the environment, objects or hazards in the surroundings may be detected to warn the user about nearby objects or obstacles. In other examples, the device may be a mixed reality system that mixes virtual information and images with the user's surroundings, in which objects or people in the user's environment may be detected to integrate virtual information with the physical environment and objects. In other examples, the device may be applied to the field of unmanned driving and the like. Referring to fig. 6, taking a mobile phone as an example, the electronic device 600 includes a housing 61, a screen 62, and the TOF depth measuring apparatus according to the foregoing embodiments; the emission module 11 and the collection module 12 of the TOF depth measuring device are disposed on the same surface of the electronic device 600, and are configured to emit a light beam to a target object and to receive the light beam reflected by the target object and form an electrical signal.
The present application further provides a storage medium for storing a computer program, which when executed performs at least the method described in any of the foregoing embodiments.
The non-volatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferroelectric Random Access Memory (FRAM), a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be a magnetic disk memory or a magnetic tape memory. The volatile memory may be a Random Access Memory (RAM), which serves as an external cache. By way of example and not limitation, many forms of RAM are available, such as a Static Random Access Memory (SRAM), a Dynamic Random Access Memory (DRAM), a Synchronous Dynamic Random Access Memory (SDRAM), a Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), and a Direct Rambus Random Access Memory (DRRAM). The memories described in the embodiments of the present invention are intended to include, without being limited to, these and any other suitable types of memory.
It is to be understood that the foregoing is a more detailed description of the invention, and that the invention is not to be construed as limited to the specific embodiments disclosed herein. It will be apparent to those skilled in the art that various substitutions and modifications can be made to the described embodiments without departing from the spirit of the invention, and these substitutions and modifications should be construed as falling within the scope of the invention. In the description herein, references to the description of the term "one embodiment," "some embodiments," "preferred embodiments," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention.
In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction. Although embodiments of the present invention and their advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the scope of the invention as defined by the appended claims.
Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods and steps described in the specification. One of ordinary skill in the art will readily appreciate that the above-disclosed, presently existing or later to be developed, compositions of matter, means, methods or steps, that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.
Claims (10)
1. A TOF depth measurement apparatus, comprising:
the emission module is used for emitting a light beam towards a given area of the target object; the emission module comprises a light source and a light beam scanner, and light beams emitted by the light source are deflected by the light beam scanner to irradiate a given area of the target object;
the acquisition module is used for acquiring the light beam reflected by the target object; the acquisition module comprises an image sensor consisting of a pixel array;
and the control and processor is respectively connected with the emission module and the acquisition module and is used for controlling the light beam scanner to deflect the light beam to irradiate a given area of the target object, activating pixels of the corresponding area in the image sensor according to the deflected light beam, responding to optical charges accumulated by the light beam reflected by the given area, calculating a phase difference based on the optical charges to obtain the distance of the target object and outputting a depth image of the target object.
2. The TOF depth measuring device of claim 1, wherein: the emission module also comprises a lens, the light source generates a line beam through the lens, and the line beam is subjected to line scanning through the beam scanner to irradiate the target object.
3. The TOF depth measuring device of claim 2, wherein: the lens is a cylindrical lens, light beams emitted by the light source form first line light beams after passing through the cylindrical lens, and the first line light beams are deflected for multiple times by the light beam scanner to obtain a projection pattern formed by a plurality of second line light beams.
4. The TOF depth measuring device of claim 2, wherein: the emission module further comprises a collimating lens and a diffractive optical element; the light beam emitted by the light source is collimated by the collimating lens into a collimated light beam, the collimated light beam is diffracted by the diffractive optical element to form a first line light beam or a first dot-matrix light beam, and the first line light beam or the first dot-matrix light beam is deflected multiple times by the light beam scanner to obtain a projection pattern formed by a plurality of second line light beams or second dot-matrix light beams.
5. The TOF depth measuring device of claim 4, wherein: the second lattice light beams are regularly arranged.
6. A TOF depth measurement method is characterized by comprising the following steps:
emitting a light beam towards a given area of a target object through an emission module; the emission module comprises a light source and a light beam scanner, and light beams emitted by the light source are deflected by the light beam scanner to irradiate a given area of the target object;
collecting the light beam reflected by the target object through a collecting module; the acquisition module comprises an image sensor consisting of a pixel array;
the control and processor activates pixels of a corresponding region in the image sensor according to the deflected light beam to respond to the photo-charges accumulated by the reflected light beam, and calculates a phase difference based on the photo-charges to obtain a distance of the target object.
7. The TOF depth measurement method of claim 6, wherein: the emission module further comprises a cylindrical lens, light beams emitted by the light source pass through the cylindrical lens to form first line light beams, and the first line light beams are deflected for multiple times by the light beam scanner to obtain a projection pattern formed by a plurality of second line light beams.
8. The TOF depth measurement method of claim 6, wherein: the emission module further comprises a collimating lens and a diffractive optical element; the light beam emitted by the light source is collimated by the collimating lens into a collimated light beam, the collimated light beam is diffracted by the diffractive optical element to form a first line light beam or a first dot-matrix light beam, and the first line light beam or the first dot-matrix light beam is deflected multiple times by the light beam scanner to obtain a projection pattern formed by a plurality of second line light beams or a plurality of second dot-matrix light beams.
9. The TOF depth measurement method of claim 6, wherein: the control and processor controls the deflection times, deflection angles and deflection sequence of each direction of the light beam scanner to obtain projection patterns with different densities and different field angles.
10. An electronic device, comprising: a housing, a screen, and a TOF depth measuring device of any one of claims 1-5; the emission module and the collection module of the TOF depth measuring device are arranged on the same surface of the electronic equipment and used for emitting light beams to the target object, receiving the light beams reflected back by the target object and forming electric signals.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010311680.4A CN111458717A (en) | 2020-04-20 | 2020-04-20 | TOF depth measuring device and method and electronic equipment |
PCT/CN2020/141870 WO2021212916A1 (en) | 2020-04-20 | 2020-12-30 | Tof depth measurement apparatus and method, and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010311680.4A CN111458717A (en) | 2020-04-20 | 2020-04-20 | TOF depth measuring device and method and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111458717A true CN111458717A (en) | 2020-07-28 |
Family
ID=71685994
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010311680.4A Pending CN111458717A (en) | 2020-04-20 | 2020-04-20 | TOF depth measuring device and method and electronic equipment |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111458717A (en) |
WO (1) | WO2021212916A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112729164A (en) * | 2020-12-21 | 2021-04-30 | 革点科技(深圳)有限公司 | Self-adaptive lattice structure light projection method based on MEMS |
CN113534596A (en) * | 2021-07-13 | 2021-10-22 | 盛景智能科技(嘉兴)有限公司 | RGBD stereo camera and imaging method |
WO2021212916A1 (en) * | 2020-04-20 | 2021-10-28 | 奥比中光科技集团股份有限公司 | Tof depth measurement apparatus and method, and electronic device |
CN115144842A (en) * | 2022-09-02 | 2022-10-04 | 深圳阜时科技有限公司 | Transmitting module, photoelectric detection device, electronic equipment and three-dimensional information detection method |
WO2023071307A1 (en) * | 2021-10-26 | 2023-05-04 | 华为技术有限公司 | Three-dimensional imaging method and apparatus |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115588037A (en) * | 2022-09-27 | 2023-01-10 | 杭州海康机器人股份有限公司 | Data acquisition equipment, method and device and storage medium |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106772407A (en) * | 2016-12-02 | 2017-05-31 | 深圳市镭神智能系统有限公司 | Laser radar system based on MEMS micromirror scanning |
CN206331115U (en) * | 2016-12-02 | 2017-07-14 | 深圳市镭神智能系统有限公司 | The laser radar system scanned based on MEMS micromirror |
CN109188451A (en) * | 2018-10-15 | 2019-01-11 | 北京径科技有限公司 | A kind of laser radar system |
US20190310370A1 (en) * | 2018-04-09 | 2019-10-10 | Sick Ag | Optoelectronic sensor and method for detection and distance determination of objects |
CN110658529A (en) * | 2019-09-27 | 2020-01-07 | 深圳奥锐达科技有限公司 | Integrated beam splitting scanning unit and manufacturing method thereof |
CN110824490A (en) * | 2019-09-27 | 2020-02-21 | 深圳奥锐达科技有限公司 | Dynamic distance measuring system and method |
WO2020040390A1 (en) * | 2018-08-23 | 2020-02-27 | 엘지전자 주식회사 | Apparatus and method for generating three-dimensional image |
CN111025319A (en) * | 2019-12-28 | 2020-04-17 | 深圳奥比中光科技有限公司 | Depth measuring device and measuring method |
CN212694038U (en) * | 2020-04-20 | 2021-03-12 | 奥比中光科技集团股份有限公司 | TOF depth measuring device and electronic equipment |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10598771B2 (en) * | 2017-01-18 | 2020-03-24 | Analog Devices Global Unlimited Company | Depth sensing with multiple light sources |
US10191155B2 (en) * | 2017-03-29 | 2019-01-29 | Luminar Technologies, Inc. | Optical resolution in front of a vehicle |
CN108490420A (en) * | 2018-06-12 | 2018-09-04 | 深圳市镭神智能系统有限公司 | A kind of micro mirror scanning optics |
CN109963138A (en) * | 2019-02-15 | 2019-07-02 | 深圳奥比中光科技有限公司 | A kind of depth camera and image acquiring method |
CN111458717A (en) * | 2020-04-20 | 2020-07-28 | 深圳奥比中光科技有限公司 | TOF depth measuring device and method and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
WO2021212916A1 (en) | 2021-10-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN212694038U (en) | TOF depth measuring device and electronic equipment | |
CN111458717A (en) | TOF depth measuring device and method and electronic equipment | |
CN110596721B (en) | Flight time distance measuring system and method of double-shared TDC circuit | |
CN111025317B (en) | Adjustable depth measuring device and measuring method | |
CN110596722B (en) | System and method for measuring flight time distance with adjustable histogram | |
JP6977045B2 (en) | Systems and methods for determining the distance to an object | |
CN111123289B (en) | Depth measuring device and measuring method | |
CN111722241B (en) | Multi-line scanning distance measuring system, method and electronic equipment | |
JP4405154B2 (en) | Imaging system and method for acquiring an image of an object | |
CN110596725B (en) | Time-of-flight measurement method and system based on interpolation | |
CN110325879B (en) | System and method for compressed three-dimensional depth sensing | |
CN110596723B (en) | Dynamic histogram drawing flight time distance measuring method and measuring system | |
CN111025318B (en) | Depth measuring device and measuring method | |
US9621876B2 (en) | Scanning 3D imager | |
CN110596724B (en) | Method and system for measuring flight time distance during dynamic histogram drawing | |
JP2018531374A (en) | System and method for measuring distance to an object | |
JP2018531374A6 (en) | System and method for measuring distance to an object | |
CN111708039A (en) | Depth measuring device and method and electronic equipment | |
CN110687541A (en) | Distance measuring system and method | |
JP2022505772A (en) | Time-of-flight sensor with structured light illumination | |
CN111025321B (en) | Variable-focus depth measuring device and measuring method | |
JP4802891B2 (en) | Distance measuring system and distance measuring method | |
CN111487639A (en) | Laser ranging device and method | |
CN111366941A (en) | TOF depth measuring device and method | |
CN110780312B (en) | Adjustable distance measuring system and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
CB02 | Change of applicant information |
Address after: 11-13 / F, joint headquarters building, high tech Zone, 63 Xuefu Road, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000
Applicant after: Obi Zhongguang Technology Group Co., Ltd
Address before: 11-13 / F, joint headquarters building, high tech Zone, 63 Xuefu Road, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000
Applicant before: SHENZHEN ORBBEC Co.,Ltd. |
|
SE01 | Entry into force of request for substantive examination | ||