CN212694038U - TOF depth measuring device and electronic equipment - Google Patents

TOF depth measuring device and electronic equipment

Info

Publication number
CN212694038U
CN212694038U (application CN202020595778.2U)
Authority
CN
China
Prior art keywords
light beam
target object
light
measuring device
depth measuring
Prior art date
Legal status
Active
Application number
CN202020595778.2U
Other languages
Chinese (zh)
Inventor
王兆民
许星
Current Assignee
Orbbec Inc
Original Assignee
Orbbec Inc
Priority date
Filing date
Publication date
Application filed by Orbbec Inc filed Critical Orbbec Inc
Priority to CN202020595778.2U
Application granted
Publication of CN212694038U
Active legal-status Current
Anticipated expiration legal-status

Landscapes

  • Measurement Of Optical Distance (AREA)

Abstract

The utility model discloses a TOF depth measuring device and electronic equipment. The TOF depth measuring device comprises a transmitting module, an acquisition module, and a control and processor. The transmitting module is used for emitting a light beam and comprises a light source and a beam scanner; the light beam emitted by the light source is deflected by the beam scanner to irradiate a given area of a target object. The acquisition module is used for collecting the light beam reflected by the target object and comprises an image sensor composed of a pixel array. The control and processor is used for controlling the beam scanner to deflect the light beam onto a given area of the target object, activating the pixels of the corresponding area in the image sensor based on the deflected light beam so as to respond to the photo-charges accumulated from the light beam reflected by the given area, calculating a phase difference based on the photo-charges to obtain the distance of the target object, and outputting a depth image of the target object. The TOF depth measuring device of the utility model can improve image resolution and reduce power consumption while increasing the image signal-to-noise ratio.

Description

TOF depth measuring device and electronic equipment
Technical Field
The utility model relates to the field of three-dimensional imaging technology, and in particular to a TOF depth measuring device and electronic equipment.
Background
TOF stands for Time-of-Flight. TOF ranging technology achieves accurate ranging by measuring the round-trip flight time of a light pulse between a transmitting/receiving device and a target object. The technique that measures the time of flight of the light directly is known as D-TOF (direct TOF), also called direct TOF ranging; the technique that periodically modulates the emitted light signal, measures the phase delay of the reflected light signal relative to the emitted signal, and calculates the flight time from that phase delay is known as I-TOF (indirect TOF), also called indirect TOF ranging or phase TOF ranging.
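As a minimal sketch (in Python, illustrative and not part of the patent text) of this indirect-TOF relationship: the measured phase delay maps to a round-trip flight time and hence a distance, with an unambiguous range set by the modulation frequency.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(phase_delay_rad: float, f_mod_hz: float) -> float:
    """Distance from the phase delay of a periodically modulated signal.

    The round-trip time is t = phase / (2*pi*f_mod); distance is c*t/2.
    The result is only unambiguous within c / (2*f_mod).
    """
    t_flight = phase_delay_rad / (2.0 * math.pi * f_mod_hz)
    return C * t_flight / 2.0

# A 90-degree phase delay at 100 MHz modulation corresponds to ~0.37 m:
print(itof_distance(math.pi / 2, 100e6))
```

With f_mod = 100 MHz the unambiguous range is about 1.5 m; lowering the modulation frequency extends the range at the cost of depth resolution.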
An existing TOF measuring device based on the I-TOF technique generally comprises a transmitting module and a collecting module: the transmitting module provides flood illumination or dot-matrix illumination to the target space, the collecting module images the reflected light beam, and the depth measuring device calculates a phase difference from the reflected light signal to obtain the distance of the object.
The scheme disclosed in Chinese patent application No. 201911032055.X has the emission module provide flood illumination to the target space; because flood illumination illuminates the field of view to the maximum extent, each pixel in the collection module can acquire a relatively effective light signal and calculate the corresponding depth information. However, TOF measuring devices using flood illumination are susceptible to interference from ambient light and multipath, resulting in poor measurement accuracy.
The scheme of Chinese patent application No. 201811393403.1 has the transmitting module provide dot-matrix illumination to the target space; dot-matrix illumination concentrates energy in individual points, and the points are sparsely distributed, so the signal-to-noise ratio of the image can be effectively improved and the influence of multipath reduced. However, in a TOF depth measuring device using dot-matrix illumination, the dot matrix cannot cover all of the pixels, so only some of the pixels can measure effective depth data, which reduces the resolution of the image.
The above background disclosure is only intended to assist understanding of the inventive concepts and technical solutions of the present utility model; it does not necessarily belong to the prior art of the present patent application, and, in the absence of clear evidence that the above contents were disclosed before the filing date of the present patent application, it should not be used to evaluate the novelty and inventive step of the present application.
SUMMARY OF THE UTILITY MODEL
An object of the utility model is to provide a TOF depth measuring device and electronic equipment so as to solve at least one of the problems in the background art above.
In order to achieve the above object, an embodiment of the present utility model provides the following technical solution:
a TOF depth measurement apparatus comprising:
an emission module for emitting a light beam toward a given area of a target object; the emission module comprises a light source and a beam scanner, and the light beam emitted by the light source is deflected by the beam scanner to irradiate the given area of the target object;
an acquisition module for collecting the light beam reflected by the target object; the acquisition module comprises an image sensor composed of a pixel array;
and a control and processor connected respectively to the emission module and the acquisition module, for controlling the beam scanner to deflect the light beam onto a given area of the target object, activating the pixels of the corresponding area in the image sensor according to the deflected light beam so as to respond to the photo-charges accumulated from the light beam reflected by the given area, calculating a phase difference based on the photo-charges to obtain the distance of the target object, and outputting a depth image of the target object.
In some embodiments, the emission module further comprises a lens; the light source generates a line beam through the lens, and the line beam is line-scanned by the beam scanner to irradiate the target object.
In some embodiments, the lens is a cylindrical lens; the light beam emitted by the light source passes through the cylindrical lens to form a first line beam, and the first line beam is deflected multiple times by the beam scanner to obtain a projection pattern composed of a plurality of second line beams.
In some embodiments, the emission module further comprises a collimating lens and a diffractive optical element; the light beam emitted by the light source is collimated by the collimating lens into a collimated beam, the collimated beam is diffracted by the diffractive optical element to form a first line-string beam or a first dot-matrix beam, and the first line-string beam or first dot-matrix beam is deflected multiple times by the beam scanner to obtain a projection pattern composed of a plurality of second line-string beams or second dot-matrix beams.
In some embodiments, the second dot-matrix beams are regularly arranged.
In some embodiments, the control and processor controls the number of deflections, the deflection angles, and the deflection sequence in each direction of the beam scanner to obtain projection patterns of different densities and different field angles.
In some embodiments, the collecting module further comprises a lens unit and an optical filter; the lens unit is used for receiving at least part of the light beam reflected by the target object and imaging on at least part of the image sensor.
In some embodiments, the beam scanner is a liquid crystal polarization grating or a MEMS scanner.
In some embodiments, the filter is a narrow band filter matched to the wavelength of the light source.
Another technical solution of the present utility model is:
an electronic device, comprising: a housing, a screen, and the TOF depth measuring device of the above technical solution; the emission module and the collection module of the TOF depth measuring device are arranged on the same surface of the electronic equipment and are used for emitting a light beam to the target object, receiving the light beam reflected back by the target object, and forming an electrical signal.
The beneficial effects of the technical solution of the utility model are:
Compared with the prior art, the TOF depth measuring device and method of the utility model can improve image resolution and reduce power consumption while increasing the image signal-to-noise ratio.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of a TOF depth measuring device according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a transmitting module of the TOF depth measuring device of the embodiment of fig. 1.
Fig. 3a-3c are schematic diagrams of projected patterns according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of an image sensor pixel array according to one embodiment of the present invention.
Fig. 5 is a flow chart of a TOF depth measurement method according to another embodiment of the invention.
Fig. 6 is a diagram of an electronic device employing the TOF depth measuring device of the embodiment of fig. 1.
Detailed Description
In order to make the technical problems to be solved, the technical solutions, and the beneficial effects of the embodiments of the present invention more clearly understood, the present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly connected to the other element or be indirectly connected to the other element. The connection may be for fixation or for circuit connection.
It is to be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in the orientation or positional relationship indicated in the drawings for convenience in describing the embodiments of the present invention and to simplify the description, and are not intended to indicate or imply that the referenced device or element must have a particular orientation, be constructed and operated in a particular orientation, and are not to be construed as limiting the invention.
Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
Referring to fig. 1 and fig. 2, as an embodiment of the present invention, a TOF depth measuring device is provided. Fig. 1 is a structural diagram of the TOF depth measuring device, and fig. 2 is a schematic diagram of its transmitting module. The TOF depth measuring device 10 includes a transmitting module 11, a collecting module 12, and a control and processor 13 connected to the transmitting module 11 and the collecting module 12, respectively. As shown in fig. 2, the emission module 11 is used for emitting a light beam toward a given area of a target object and includes a light source 101 and a beam scanner 102. The light source 101 is configured to emit a light beam; the beam scanner 102 is configured to receive the light beam emitted by the light source, deflect it, and project it onto the target object 20. The acquisition module comprises an image sensor 121 composed of a pixel array for collecting the light beam 40 reflected by the target object 20. The control and processor 13 controls the beam scanner 102 to deflect the light beam onto a given area of the target object 20, activates the pixels of the corresponding area in the image sensor 121 according to the deflected beam so as to respond to the photo-charges accumulated from the light beam reflected back by the given area, calculates a phase difference based on the photo-charges to obtain the distance of the target object 20, and outputs a depth image of the target object 20.
The emitting module 11 further includes a light source driver (not shown in the figure) for driving the light source to emit light beams. The light source may be a light-emitting diode (LED), an edge-emitting laser (EEL), a vertical-cavity surface-emitting laser (VCSEL), or a light source array composed of a plurality of such sources, and the light beam emitted by the light source may be visible, infrared, or ultraviolet light.
The light source 101 emits a light beam, and the beam scanner 102 receives the beam and directs it to the target object 20 by rotating about a single axis or multiple axes. In one embodiment, the beam scanner 102 may be a liquid crystal polarization grating (LCPG), a micro-electro-mechanical system (MEMS) scanner, or the like. Preferably, the beam scanner 102 is a MEMS scanner; its extremely high scanning frequency and small volume allow the transmitting module 11 to be smaller and higher-performing. In some embodiments, the MEMS scanner may scan at frequencies ranging from 1 MHz to 20 MHz, and thus can provide sufficient spatial and temporal resolution. By configuring the light source driver and the beam scanner 102, the light beam emitted by the light source 101 can be modulated spatially and temporally to produce a variety of patterned beams, such as a regular speckle pattern, a line beam pattern, a line-string beam pattern, and so forth.
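As a rough back-of-the-envelope illustration (assumed figures, not from the patent) of why such scan rates leave ample temporal resolution:

```python
# A projection pattern built from 120 deflected line positions at a 1 MHz
# deflection rate (the lower end of the 1-20 MHz range above) still
# refreshes thousands of times per second.
deflections_per_pattern = 120
deflection_rate_hz = 1e6
patterns_per_second = deflection_rate_hz / deflections_per_pattern
print(patterns_per_second)  # ~8333 full patterns per second
```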
The acquisition module 12 includes a TOF image sensor 121, a lens unit, and a filter (not shown in the figure). The lens unit receives at least part of the light beam reflected by the target object 20 and images it on at least part of the TOF image sensor; the filter is a narrow-band filter matched to the wavelength of the light source and is used to suppress background light noise in other bands. The TOF image sensor may be a charge-coupled device (CCD), complementary metal-oxide semiconductor (CMOS) sensor, avalanche diode (AD), single-photon avalanche diode (SPAD), etc., with an array size representing the resolution of the depth camera, e.g., 320x240. A readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and the like is generally also connected to the image sensor 121.
In some embodiments, the TOF image sensor comprises at least one pixel, each pixel comprising two or more taps (tap) for storing and reading out, or draining, the charge signals generated by incident photons under the control of respective electrodes. For example, a pixel may comprise 2 taps that are switched sequentially in a fixed order within a single frame period (or a single exposure time) to collect the corresponding photons, receiving the optical signal and converting it into an electrical signal.
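The patent does not fix a particular demodulation scheme. As a sketch, a commonly used four-phase arrangement (a hypothetical choice here, not specified by the patent) accumulates charge at 0°, 90°, 180°, and 270° relative to the emitted signal (with two taps this takes two sub-exposures) and recovers the phase delay with an arctangent:

```python
import math

def phase_from_taps(q0: float, q90: float, q180: float, q270: float) -> float:
    """Phase delay from four accumulated tap charges.

    Assumes each charge is proportional to cos(phase - tap_offset) plus a
    constant ambient term; the differences cancel the ambient offset.
    """
    return math.atan2(q90 - q270, q0 - q180) % (2.0 * math.pi)

# Illustrative charges for a ~45 degree phase delay (amplitude 50, ambient 50):
q = dict(q0=85.36, q90=85.36, q180=14.64, q270=14.64)
print(math.degrees(phase_from_taps(**q)))  # ~45.0
```

The recovered phase then feeds the distance relation sketched in the background section above.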
The control and processor 13 may be a separate dedicated circuit, such as a dedicated SoC chip, an FPGA chip, or an ASIC chip including a CPU, memory, a bus, and the like; alternatively, when the depth camera is integrated into a smart terminal such as a mobile phone, television, or computer, a general-purpose processing circuit of the terminal may serve as at least part of the control and processor 13.
The control and processor 13 is used to provide the emission signal required when the light source emits laser light, and the light source emits a light beam to the target object 20 under the control of the emission signal.
In some embodiments, the control and processor 13 provides a demodulation signal (acquisition signal) for each tap in each pixel of the TOF image sensor, and the tap acquires, under control of the demodulation signal, the electrical signal generated by the beam reflected back by the target object 20. The electrical signal is related to the intensity of the reflected beam; the control and processor 13 processes it and calculates the phase difference to obtain the distance of the target object 20.
In one embodiment, the emission module 11 includes a lens (not shown); the light source 101 generates a line beam through the lens, and the line beam is line-scanned by the beam scanner 102 to illuminate the target object 20. As shown in fig. 3a, in this embodiment the lens is a cylindrical lens: the light source 101 generates a first line beam through the cylindrical lens, and the beam scanner 102 receives the first line beam and deflects it to form a second line beam 301. Assuming the angle of the first deflection is 0 degrees, the resulting second line beam 301 is shown by the solid line in fig. 3a; the first line beam is then deflected again by the beam scanner 102 at a certain deflection angle, forming a further second line beam 301, indicated by the dashed line in fig. 3a. It should be understood that all the second line beams are of the same kind, each being the first line beam deflected by the beam scanner 102. The projection pattern 30 composed of the plurality of second line beams formed after multiple deflections has a larger field of view than the first line beam without the beam scanner 102, so an image with a high signal-to-noise ratio and high resolution can be obtained. Of course, the lens may also be another lens combination that produces a line beam.
In one embodiment, the emission module 11 includes a lens and a diffractive optical element (DOE) (not shown). The light source 101 emits a beam that is collimated by the lens, and the collimated beam is diffracted by the DOE to form a line-string beam composed of a plurality of connected spots to illuminate the target object 20. Specifically, as shown in fig. 3b, the collimated beam is diffracted by the DOE into a first line-string beam formed by a plurality of connected spots; the beam scanner 102 receives the first line-string beam and deflects it to form a second line-string beam. Assuming the first deflection angle is 0 degrees, the resulting second line-string beam is the line-string beam 302 shown by the solid lines in fig. 3b; the first line-string beam is then deflected again by the beam scanner 102 at a certain deflection angle to form a further second line-string beam, indicated by the dashed lines in fig. 3b. It should be understood that all the line-string beams are of the same kind, each being a second line-string beam formed by deflection by the beam scanner 102; they are distinguished by dashed and solid lines in the figure only for ease of illustration. The projection pattern 30 composed of the plurality of second line-string beams formed after multiple deflections has a larger field of view than the first line-string beam without the beam scanner 102, so an image with a high signal-to-noise ratio and high resolution can be acquired.
In one embodiment, the beam emitted by the light source 101 is collimated by the lens, and the collimated beam is diffracted by the DOE to form a dot-matrix pattern comprising a plurality of spots to illuminate the target object 20. As shown in fig. 3c, the collimated beam is diffracted by the DOE to form a first dot-matrix beam; the beam scanner 102 receives the first dot-matrix beam and deflects it to form a second dot-matrix beam. Assuming the first deflection angle is 0 degrees, the resulting dot-matrix beam is the second dot-matrix beam marked by the solid circles 303 in fig. 3c; the first dot-matrix beam is then deflected again by the beam scanner 102 at a certain deflection angle, forming a further second dot-matrix beam, marked by the dashed circles 303 in fig. 3c. It should be understood that all the dot-matrix beams are of the same kind, each being a second dot-matrix beam formed by deflection by the beam scanner 102. The projection pattern 30 composed of the plurality of second dot-matrix beams formed after multiple deflections has a higher density and a larger field of view than the first dot-matrix beam without the beam scanner 102, so an image with a high signal-to-noise ratio and high resolution can be acquired. The second dot-matrix beams may be arranged in one dimension or two dimensions, regularly or irregularly; a regular arrangement is preferred, as it makes the depth value distribution more regular.
As shown in fig. 3a to 3c, the light source may be a single light source or multiple light sources. With multiple light sources, multiple first beams can be formed at a time and deflected by the beam scanner 102 into multiple second beams, so the required scanning angle is smaller, the scanning is faster, and the measurement accuracy can be improved. According to the requirements of the actual usage scenario, the number of deflections, the deflection angles, and the deflection sequence in each direction of the beam scanner 102 can be controlled to realize projections with different densities and different field angles, so as to obtain an image with a high signal-to-noise ratio and high resolution.
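To illustrate this control, a small sketch (hypothetical parameters, not from the patent) of how a deflection schedule sets the density and field angle of the resulting projection pattern:

```python
def deflection_schedule(num_deflections: int, step_deg: float, start_deg: float = 0.0):
    """Deflection angles for one axis of the beam scanner.

    More deflections with a smaller step give a denser pattern over the
    same field angle; a larger step widens the field at lower density.
    """
    return [start_deg + i * step_deg for i in range(num_deflections)]

angles = deflection_schedule(num_deflections=10, step_deg=2.0)
field_angle = angles[-1] - angles[0]   # 18 degrees covered
density = len(angles) / field_angle    # ~0.56 lines per degree
print(angles, field_angle, density)
```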
The following description uses the projection patterns 30 of the embodiments of figs. 3a-3c projected onto the target object 20 by the emission module 11, taking as an example the second line beam 301 of fig. 3a (the first line beam deflected by the beam scanner 102), to explain how the control and processor activates, in a time-shared manner, the pixels in the corresponding area of the image sensor 121.
The light source 101 generates a first line beam through the cylindrical lens, and the beam scanner 102 receives and deflects the first line beam to form a plurality of second line beams 301. The control and processor 13 controls the second line beam 301 to irradiate a given area of the target object 20 and activates the pixels in the corresponding area of the image sensor 121 based on the second line beam 301, as shown in fig. 4. When the beam scanner 102 forms, through one deflection, one deflected beam 301 irradiating a given region of the target object 20, the control and processor 13 activates the pixels of the corresponding region 401 in the image sensor 121 based on the deflected beam 301 to accumulate photo-charges in response to the beam reflected back by the given region. In fig. 4 each second line beam is shown occupying approximately 13 × 2 = 26 pixels; in practice it may have other sizes, which are not specifically limited in this embodiment.
Similarly, when the next second line beam 301 is deflected by the beam scanner 102 to irradiate another region of the target object 20, the control and processor 13 activates the pixels of the corresponding other region in the image sensor 121 based on that next second line beam 301, to accumulate photo-charges in response to the beam reflected back by that region of the target object 20.
It will be appreciated that time-shared activation of the pixel regions of the image sensor 121 greatly reduces power consumption, and multiple regions of the target object 20 can be illuminated through multiple deflections of the beam scanner 102, so that the projection pattern formed by the plurality of second line beams completely covers the target object. The pixel regions of the image sensor 121 sequentially collect the photo-charges accumulated from the light beam reflected by the target object 20, and the control and processor 13 calculates the phase difference based on the photo-charges to obtain the distance of the target object 20, so a depth image with a high signal-to-noise ratio and high resolution can be output.
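Putting the pieces together, a sketch of the time-shared activation described above (the read_taps hook, strip size, and modulation frequency are illustrative assumptions, not from the patent): each deflection activates only the strip of pixels the second line beam is expected to land on, and the per-pixel phases are assembled into one full-resolution depth map.

```python
import math

SENSOR_ROWS, SENSOR_COLS = 240, 320   # e.g. the 320x240 array mentioned above
ROWS_PER_LINE = 2                     # strip height per line beam (illustrative)
C = 299_792_458.0

def measure_depth(read_taps, f_mod_hz: float = 100e6):
    """Scan line regions one at a time and assemble a full depth map.

    read_taps(r0, r1) is a hypothetical driver hook: it activates only
    rows r0..r1-1, fires one deflected line beam, and returns a dict
    mapping (row, col) to the four tap charges (q0, q90, q180, q270).
    """
    depth = [[0.0] * SENSOR_COLS for _ in range(SENSOR_ROWS)]
    for r0 in range(0, SENSOR_ROWS, ROWS_PER_LINE):   # one deflection per strip
        charges = read_taps(r0, r0 + ROWS_PER_LINE)   # only this region is active
        for (row, col), (q0, q90, q180, q270) in charges.items():
            phi = math.atan2(q90 - q270, q0 - q180) % (2.0 * math.pi)
            depth[row][col] = C * phi / (4.0 * math.pi * f_mod_hz)
    return depth
```

This also mirrors the method flow of fig. 5 described below: emit toward a given area, collect from the matching pixel region, then compute phase and distance.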
Based on the TOF depth measuring device of the above embodiments, another embodiment of the present utility model further provides a TOF depth measurement method. Fig. 5 shows the flow of the TOF depth measurement method; referring to fig. 5, the measurement method comprises the following steps:
s501, emitting a light beam towards a given area of a target object through an emission module; the emission module comprises a light source and a light beam scanner, and the light beam scanner deflects a light beam emitted by the light source to irradiate a given area of a target object;
the utility model provides an it is concrete, launch the module and still including lens the embodiment of the utility model provides an, lens are column lens, and the light source sends the light beam and forms first ray beam through column lens, and first ray beam forms the projection pattern that many second ray are fast to be constituteed after light beam scanner deflects many times, and this projection pattern has a bigger angle of vision than first ray beam.
In one embodiment, the emission module further comprises a collimating lens and a diffractive optical element. The light beam emitted by the light source is collimated by the lens into a collimated beam; the collimated beam is diffracted by the DOE to form a first line-string beam or a first dot-matrix beam, which after multiple deflections by the beam scanner forms a projection pattern composed of a plurality of second line-string beams or second dot-matrix beams. The projection pattern has a higher density and/or a larger field angle than the first line-string or dot-matrix beam, and the second dot-matrix beams are regularly arranged.
S502, collecting the light beam reflected by the target object through a collection module; the acquisition module comprises an image sensor consisting of a pixel array;
in some embodiments, pixels of each respective area in the image sensor may sequentially collect photocharges accumulated by light beams reflected back by the target object.
S503, activating, by the control and processor, the pixels of the corresponding areas in the image sensor based on the deflected light beams so as to respond to the photo-charges accumulated from the reflected light beams, and calculating the phase difference based on the photo-charges to acquire the distance of the target object.
Specifically, the control and processor activates, in a time-shared manner, the image sensor composed of a pixel array to acquire the photo-charges accumulated from the light beams reflected back by each given region, calculates the phase difference based on the photo-charges acquired by the pixels to obtain the distance of the target object, and outputs a depth image of the target object. In this embodiment, because the projection pattern formed by the multiple deflected beams completely covers the target object, a depth image with a high signal-to-noise ratio and high resolution can be output.
As another embodiment of the present utility model, an electronic device is also provided, which may be a desktop device, a desktop-mounted device, a portable device, a wearable device, a vehicle-mounted device, a robot, or the like. In particular, the device may be a laptop or an electronic device that supports gesture recognition or biometric recognition. In other examples, the device may be a head-mounted device that identifies objects or hazards in the user's surroundings for safety; for example, in a virtual reality system that blocks the user's view of the environment, surrounding objects or hazards may be detected to warn the user about nearby objects or obstacles. In other examples, it may be a mixed reality system that blends virtual information and images with the user's surroundings, detecting objects or people in the user's environment to integrate the virtual information with the physical environment and objects. In other examples, the device may be applied to fields such as autonomous driving. Referring to fig. 6, taking a mobile phone as an example, the electronic device 600 includes a housing 61, a screen 62, and the TOF depth measuring apparatus of the foregoing embodiments; the emission module 11 and the collection module 12 of the TOF depth measuring device are disposed on the same surface of the electronic device 600 and are used to emit a light beam to a target object, receive the light beam reflected back by the target object, and form an electrical signal.
The present application further provides a storage medium for storing a computer program, which when executed performs at least the method described in any of the foregoing embodiments.
The storage medium may be implemented by any type of volatile or non-volatile storage device, or a combination thereof. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a ferroelectric random access memory (FRAM), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); the magnetic surface memory may be disk storage or tape storage. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as static RAM (SRAM), synchronous static RAM (SSRAM), dynamic RAM (DRAM), synchronous dynamic RAM (SDRAM), double data rate synchronous dynamic RAM (DDR SDRAM), enhanced synchronous dynamic RAM (ESDRAM), SyncLink dynamic RAM (SLDRAM), and direct Rambus RAM (DRRAM). The storage media described herein in connection with the embodiments of the utility model are intended to comprise, without being limited to, these and any other suitable types of memory.
The foregoing is a further detailed description of the invention in conjunction with specific/preferred embodiments; it is not intended that the invention be limited to the specific embodiments disclosed. For those skilled in the art to which the invention pertains, a number of alternatives or modifications can be made to the described embodiments without departing from the concept of the invention, and these alternatives or modifications should be considered as falling within the protection scope of the invention. In the description herein, references to the terms "one embodiment," "some embodiments," "preferred embodiments," "an example," "a specific example," or "some examples" mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention.
In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction. Although the embodiments of the present invention and their advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the scope of the invention as defined by the appended claims.
Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods and steps described in the specification. One of ordinary skill in the art will readily appreciate that the above-disclosed, presently existing or later to be developed, processes, machines, manufacture, compositions of matter, means, methods, or steps, that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.

Claims (10)

1. A TOF depth measurement apparatus, comprising:
the emission module is used for emitting a light beam towards a given area of the target object; the emission module comprises a light source and a light beam scanner, and light beams emitted by the light source are deflected by the light beam scanner to irradiate a given area of the target object;
the acquisition module is used for acquiring the light beam reflected by the target object; the acquisition module comprises an image sensor consisting of a pixel array;
and the control and processor is respectively connected with the emission module and the acquisition module and is used for controlling the light beam scanner to deflect the light beam to irradiate a given area of the target object, activating pixels of the corresponding area in the image sensor according to the deflected light beam, responding to optical charges accumulated by the light beam reflected by the given area, calculating a phase difference based on the optical charges to obtain the distance of the target object and outputting a depth image of the target object.
2. The TOF depth measuring device of claim 1, wherein: the emission module also comprises a lens, the light source generates a line beam through the lens, and the line beam is subjected to line scanning through the beam scanner to irradiate the target object.
3. The TOF depth measuring device of claim 2, wherein: the lens is a cylindrical lens, the light beam emitted by the light source forms a first line beam after passing through the cylindrical lens, and the first line beam is deflected multiple times by the light beam scanner to obtain a projection pattern composed of a plurality of second line beams.
4. The TOF depth measuring device of claim 2, wherein: the emission module further comprises a collimating lens and a diffractive optical element; the light beam emitted by the light source is collimated by the collimating lens into a collimated light beam, the collimated light beam is diffracted by the diffractive optical element to form a first line-string light beam or a first dot-matrix light beam, and the first line-string light beam or first dot-matrix light beam is deflected multiple times by the light beam scanner to obtain a projection pattern composed of a plurality of second line-string light beams or second dot-matrix light beams.
5. The TOF depth measuring device of claim 4, wherein: the second dot-matrix light beams are regularly arranged.
6. The TOF depth measuring device of claim 1, wherein: the control and processor controls the deflection times, deflection angles and deflection sequence of each direction of the light beam scanner to obtain projection patterns with different densities and different field angles.
7. The TOF depth measuring device of claim 1, wherein: the acquisition module also comprises a lens unit and an optical filter; the lens unit is used for receiving at least part of the light beam reflected by the target object and imaging on at least part of the image sensor.
8. The TOF depth measuring device of claim 1, wherein: the light beam scanner is a liquid crystal polarization grating or a MEMS scanner.
9. The TOF depth measuring device of claim 7, wherein: the optical filter is a narrow-band optical filter matched with the wavelength of the light source.
10. An electronic device, comprising: a housing, a screen, and a TOF depth measuring device of any one of claims 1-9; the emission module and the collection module of the TOF depth measuring device are arranged on the same surface of the electronic equipment and used for emitting light beams to the target object, receiving the light beams reflected back by the target object and forming electric signals.
CN202020595778.2U 2020-04-20 2020-04-20 TOF depth measuring device and electronic equipment Active CN212694038U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202020595778.2U CN212694038U (en) 2020-04-20 2020-04-20 TOF depth measuring device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202020595778.2U CN212694038U (en) 2020-04-20 2020-04-20 TOF depth measuring device and electronic equipment

Publications (1)

Publication Number Publication Date
CN212694038U (en)

Family

ID=74893465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202020595778.2U Active CN212694038U (en) 2020-04-20 2020-04-20 TOF depth measuring device and electronic equipment

Country Status (1)

Country Link
CN (1) CN212694038U (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111458717A (en) * 2020-04-20 2020-07-28 深圳奥比中光科技有限公司 TOF depth measuring device and method and electronic equipment
WO2022242225A1 (en) * 2021-05-21 2022-11-24 深圳市汇顶科技股份有限公司 Time of flight camera module and electronic device
WO2023273332A1 (en) * 2021-06-30 2023-01-05 深圳市汇顶科技股份有限公司 Emission device for time-of-flight depth measurement, and electronic apparatus
CN115102036A (en) * 2022-08-24 2022-09-23 立臻精密智造(昆山)有限公司 Lattice laser emission structure, lattice laser system and depth calculation method
CN115144842A (en) * 2022-09-02 2022-10-04 深圳阜时科技有限公司 Transmitting module, photoelectric detection device, electronic equipment and three-dimensional information detection method
CN115144842B (en) * 2022-09-02 2023-03-14 深圳阜时科技有限公司 Transmitting module, photoelectric detection device, electronic equipment and three-dimensional information detection method

Similar Documents

Publication Publication Date Title
CN212694038U (en) TOF depth measuring device and electronic equipment
CN110596721B (en) Flight time distance measuring system and method of double-shared TDC circuit
CN110596722B (en) System and method for measuring flight time distance with adjustable histogram
CN111025317B (en) Adjustable depth measuring device and measuring method
CN111458717A (en) TOF depth measuring device and method and electronic equipment
JP6977045B2 (en) Systems and methods for determining the distance to an object
CA3017819C (en) Lidar based 3-d imaging with varying illumination intensity
CN110596725B (en) Time-of-flight measurement method and system based on interpolation
CN110596724B (en) Method and system for measuring flight time distance during dynamic histogram drawing
CN111856433B (en) Distance measuring system and measuring method
CN111123289B (en) Depth measuring device and measuring method
CN110596723B (en) Dynamic histogram drawing flight time distance measuring method and measuring system
JP2022505772A (en) Time-of-flight sensor with structured light illumination
CN111722241B (en) Multi-line scanning distance measuring system, method and electronic equipment
JP2018531374A (en) System and method for measuring distance to an object
JP2018531374A6 (en) System and method for measuring distance to an object
CN111736173B (en) Depth measuring device and method based on TOF and electronic equipment
CN111708039A (en) Depth measuring device and method and electronic equipment
WO2021212917A1 (en) Device and method for measuring tof depth
CN110687541A (en) Distance measuring system and method
CN111538024B (en) Filtering ToF depth measurement method and device
CN102947726A (en) Scanning 3d imager
WO2019076072A1 (en) Optical distance measurement method and apparatus
CN111025321B (en) Variable-focus depth measuring device and measuring method
CN111487639A (en) Laser ranging device and method

Legal Events

Date Code Title Description
GR01 Patent grant