CN212694034U - TOF depth measuring device and electronic equipment - Google Patents

TOF depth measuring device and electronic equipment

Info

Publication number
CN212694034U
CN212694034U
Authority
CN
China
Prior art keywords: real, light, dot matrix, reflected, virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202020596333.6U
Other languages
Chinese (zh)
Inventor
孙飞
武万多
王兆民
郑德金
王家麒
孙瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orbbec Inc
Original Assignee
Orbbec Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Orbbec Inc filed Critical Orbbec Inc
Priority to CN202020596333.6U
Application granted
Publication of CN212694034U
Legal status: Active (current)
Anticipated expiration

Landscapes

  • Measurement Of Optical Distance (AREA)

Abstract

The utility model discloses a TOF depth measuring device and an electronic device. The TOF depth measuring device comprises a transmitting module used for projecting a dot matrix pattern onto a target object, the dot matrix pattern comprising a real dot matrix formed by real light spots and a virtual dot matrix formed by virtual light spots; an acquisition module used for receiving a reflected light signal reflected by the target object, the acquisition module comprising an image sensor consisting of a pixel array, wherein one part of the pixels of the pixel array is used for detecting a first reflected light signal reflected back by the real light spots, and another part of the pixels is used for detecting a second reflected light signal not directly reflected back by the real light spots; and a control and processor connected to the transmitting module and the acquisition module respectively, which filters the first reflected light signal according to the second reflected light signal to obtain a third reflected light signal, and calculates a phase difference based on the third reflected light signal to obtain a first depth map of the target object. The utility model solves the problem of multipath interference of the reflected light beam while achieving a high-resolution depth image.

Description

TOF depth measuring device and electronic equipment
Technical Field
The utility model relates to the field of three-dimensional imaging technology, and in particular to a TOF depth measuring device and electronic equipment.
Background
A depth measuring device based on the Time of Flight (TOF) principle calculates the distance to a target object from the time difference or phase difference between the emission of a light beam toward the target area and the reception of the beam reflected back by the target object, thereby obtaining depth information of the target object. Depth measurement devices based on TOF schemes have begun to be applied in the fields of three-dimensional measurement, gesture control, robot navigation, security, monitoring, and the like.
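For illustration only, the following minimal Python sketch shows how a distance would follow from a measured phase difference in a continuous-wave indirect-TOF system; the 100 MHz modulation frequency and all names are assumptions made for this example, not values taken from this utility model.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(delta_phi_rad: float, f_mod_hz: float = 100e6) -> float:
    """Phase difference -> distance; the factor 4*pi accounts for the round trip."""
    return C * delta_phi_rad / (4 * math.pi * f_mod_hz)

# A pi/2 phase shift at an assumed 100 MHz modulation corresponds to roughly 0.37 m.
print(distance_from_phase(math.pi / 2))
```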
A conventional TOF depth measuring device typically includes a light source that emits a flood beam into the target space to provide illumination, and a camera that images the reflected flood beam; the distance to the object is calculated from the time the beam needs to travel from emission, via reflection, to reception. However, when a conventional TOF depth measuring device is used for distance sensing, on the one hand ambient light interference affects the measurement accuracy; for example, when the ambient light is strong enough to submerge the flood light of the light source, the light beam of the light source is difficult to distinguish, resulting in a large measurement error. On the other hand, a conventional TOF depth measuring device can only measure objects at short range, and measurement of distant objects produces large errors.
To extend the measurement distance, Chinese patent application No. 202010116700.2 discloses a TOF depth measuring device in which the transmitting module emits a spot beam. Because the spatial distribution of the spot beam is sparse and the energy of each spot is concentrated, the measurement distance is extended; moreover, the intensity of the directly reflected light is higher than that of light reflected back over multiple paths, so the optical signals produced by multipath reflections can be distinguished, which improves the signal-to-noise ratio of the effective signal and reduces multipath interference. However, in this scheme, if the spot beams are densely distributed, the multipath interference cannot be eliminated; if the spot beams are sparsely distributed, the image resolution is low.
The above background disclosure is provided only to assist understanding of the inventive concepts and technical solutions of the present utility model; it does not necessarily belong to the prior art of the present patent application, and it should not be used to evaluate the novelty and inventive step of the present application in the absence of clear evidence that the above content was disclosed before the filing date of the present patent application.
SUMMARY OF THE UTILITY MODEL
An object of the utility model is to provide a TOF depth measuring device and electronic equipment to solve at least one of the problems in the background art described above.
To achieve the above object, an embodiment of the present utility model provides the following technical solution:
a TOF depth measurement apparatus comprising:
the transmitting module is used for projecting a dot matrix pattern to a target object; the dot matrix pattern comprises a real dot matrix formed by real light spots and a virtual dot matrix formed by virtual light spots;
the acquisition module is used for receiving a reflected light signal reflected by the target object; the acquisition module comprises an image sensor consisting of a pixel array, wherein one part of pixels of the pixel array are used for detecting a first reflected light signal reflected back by the real light spot, and the other part of pixels are used for detecting a second reflected light signal not directly reflected back by the real light spot;
and the control and processor is respectively connected with the transmitting module and the collecting module, filters the first reflected light signal according to the second reflected light signal to obtain a third reflected light signal, and calculates a phase difference based on the third reflected light signal to obtain a first depth map of the target object.
In some embodiments, the lattice pattern includes a plurality of the real light spots and a plurality of the virtual light spots; wherein the number of real light spots is larger than the number of virtual light spots.
In some embodiments, the real lattice and the virtual lattice are in a regular arrangement.
In some embodiments, the pattern formed by the plurality of real light spots surrounding a single virtual light spot is a hexagon, a quadrangle, or any other shape.
In some embodiments, the real lattice is staggered from the virtual lattice.
In some embodiments, the dot matrix pattern is configured such that a virtual light spot is arranged between every two real light spots in the even rows of the dot matrix pattern, and the virtual dot matrix formed by the plurality of virtual light spots is arranged in a cross pattern or as a plurality of squares.
In some embodiments, the light received by the pixel corresponding to the real light spot includes a light signal directly reflected by the real light spot and a stray light signal; the photons received by the pixel corresponding to the virtual light spot only include stray light signals.
Another embodiment of the present utility model provides the following technical solution:
an electronic device, comprising: a housing, a screen, and the TOF depth measuring device according to the above technical solution; wherein the emission module and the collection module of the TOF depth measuring device are arranged on the same surface of the electronic equipment and are used for emitting a flood light beam toward a target object and receiving the flood light beam reflected back by the target object to form an electrical signal.
The beneficial effects of the technical solution of the present utility model are:
compared with the prior art, the TOF depth measuring device can solve the problem of multipath interference of the reflected light beam while realizing a high-resolution depth image.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of a TOF depth measuring device according to an embodiment of the present invention.
Fig. 2 is a schematic illustration of multipath reflections of an emitted light beam.
Fig. 3a-3d are schematic diagrams of dot patterns projected by a transmission module of a TOF depth measuring device according to an embodiment of the invention.
Fig. 4 is a schematic diagram of an image sensor pixel array of a TOF depth measuring device according to an embodiment of the invention.
Fig. 5 is a graphical illustration of the intensity of the reflected light generated in the embodiment of fig. 1.
FIG. 6 is a graphical representation of a calculation for filtering out the stray light signal in the embodiment of FIG. 1.
Fig. 7 is a flowchart illustration of a TOF depth measurement method according to another embodiment of the invention.
Fig. 8 is a diagram of an electronic device employing the TOF depth measuring device of the embodiment of fig. 1.
Detailed Description
In order to make the technical problems to be solved, the technical solutions, and the beneficial effects of the embodiments of the present invention clearer, the present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly connected to the other element or be indirectly connected to the other element. The connection may be for fixation or for circuit connection.
It is to be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in the orientation or positional relationship indicated in the drawings for convenience in describing the embodiments of the present invention and to simplify the description, and are not intended to indicate or imply that the referenced device or element must have a particular orientation, be constructed and operated in a particular orientation, and are not to be construed as limiting the invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
Referring to fig. 1, an embodiment of the present invention provides a TOF depth measuring device; fig. 1 is a schematic structural diagram of the TOF depth measuring device according to this embodiment. The TOF depth measuring device 10 includes a transmitting module 11, a collecting module 12, and a control and processor 13 connected to the transmitting module 11 and the collecting module 12, respectively. The emitting module 11 is configured to project a dot matrix pattern onto the target object 20, where the dot matrix pattern includes a real dot matrix formed by real light spots and a virtual dot matrix formed by regions not irradiated by light spots; the collecting module 12 receives the reflected light signal reflected by the target object 20. The collecting module 12 includes an image sensor 121 composed of a pixel array, in which one part of the pixels detects a first reflected light signal reflected back by the target object 20 from the real light spots, and another part of the pixels detects a second reflected light signal not directly reflected back by a real light spot. The control and processor 13 filters the first reflected light signal based on the second reflected light signal to obtain a third reflected light signal, and calculates a phase difference based on the third reflected light signal to obtain a first depth map of the target object 20.
The emitting module 11 includes a light source, a light source driver (not shown), and the like. The light source may be a light-emitting diode (LED), an edge-emitting laser (EEL), a vertical-cavity surface-emitting laser (VCSEL), or the like, or may be an array composed of a plurality of such light sources; the light beam emitted by the light source may be visible light, infrared light, ultraviolet light, etc., and is not specially restricted in the embodiments of the present utility model.
In one embodiment, the emission module 11 further includes a diffractive optical element (DOE) for replicating the dot pattern emitted by the light source. It will be understood that, assuming the dot pattern emitted by the light source is a periodically arranged pattern, the replicated copies produced by the DOE abut each other, i.e., no significant gaps or overlaps exist in the finally formed pattern.
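As a rough illustration of this replication idea (a sketch under assumptions, not the DOE design of this device), a periodic base tile of the dot pattern can be repeated so that adjacent copies abut seamlessly; the tile size and the number of replicated copies below are arbitrary.

```python
import numpy as np

# Toy periodic base tile: 1 marks a real spot position, 0 an unilluminated position.
base_tile = np.zeros((8, 8), dtype=np.uint8)
base_tile[::2, ::2] = 1

# "Replication": copies of the tile placed side by side with no gaps or overlaps.
projected = np.tile(base_tile, (3, 4))
assert projected.shape == (24, 32)
```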
The acquisition module 12 includes a TOF image sensor 121, a lens unit, and may further include a filter (not shown in the figure); wherein the lens unit receives and images at least part of the light beam reflected back by the target object 20 on at least part of the TOF image sensor; the filter is a narrow-band filter matched with the wavelength of the light source and is used for inhibiting background light noise of other wave bands. TOF image sensors may be image sensors composed of Charge Coupled Devices (CCD), Complementary Metal Oxide Semiconductor (CMOS), Avalanche Diodes (AD), Single Photon Avalanche Diodes (SPAD), etc., with sensor array sizes representing the resolution of the depth camera, e.g., 320x240, etc. Generally, a readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and the like is also included in connection with the image sensor 121.
In some embodiments, the TOF image sensor comprises at least one pixel, each pixel comprising two or more taps for storing and reading out, or draining, the charge signal generated by incident photons under the control of the respective electrode. For example, a pixel may comprise 2 taps, which are switched on in sequence within a single frame period (or single exposure time) to collect the corresponding photons, receive the optical signal, and convert it into an electrical signal.
The control and processor 13 may be a stand-alone dedicated circuit, such as a dedicated SoC chip, an FPGA chip, or an ASIC chip including a CPU, memory, a bus, and the like; alternatively, it may include a general-purpose processing circuit. For example, when the TOF depth measuring device is integrated into an intelligent terminal such as a mobile phone, a television, or a computer, the processing circuit in the terminal may serve as at least part of the control and processor 13.
The control and processor 13 is used for providing emission instruction signals required when the light source emits laser light, and the light source emits light beams to the target object 20 under the control of the emission instruction signals.
In some embodiments, the control and processor 13 also provides a demodulation signal (acquisition signal) for each tap in each pixel of the TOF image sensor; under control of the demodulation signal, the tap acquires the electrical signal produced by the reflected beam reflected back by the target object 20. It will be appreciated that the electrical signal is related to the intensity of the reflected beam; the control and processor 13 processes the electrical signal and calculates the phase difference to obtain the distance to the target object 20.
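The text does not fix the demodulation scheme beyond stating that each pixel has two or more taps. As one common possibility, the hedged sketch below uses the classical four-sample estimator (samples taken at 0, 90, 180, and 270 degrees of the modulation period) to recover the phase difference and a distance; the sign convention, names, and modulation frequency are all assumptions.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def phase_from_samples(a0: float, a90: float, a180: float, a270: float) -> float:
    """Phase difference (radians) between emitted and received modulation
    (one common sign convention for four equally spaced samples)."""
    return math.atan2(a270 - a90, a0 - a180) % (2 * math.pi)

def distance_from_samples(a0, a90, a180, a270, f_mod_hz=100e6):
    """Convert the four tap samples of one pixel into a distance estimate."""
    return C * phase_from_samples(a0, a90, a180, a270) / (4 * math.pi * f_mod_hz)
```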
Referring to fig. 2, the "multipath" situation is described below. In general, a TOF depth measuring device employs flood illumination as the light source so as to cover all pixel areas; however, because the flood beam is dense, the light flux received by a pixel is generally not only that generated by direct reflection from the target object but also includes stray light arising from multiple reflections. Specifically, the emission module 11 emits the light beam 201; after irradiating the target object 50, the light beam 201 is scattered and can be reflected back to the collection module 12 along a plurality of paths.
In fig. 2, assuming the target object 50 is a corner, the emitted beam 201 irradiates the target object 50, and the collection module 12 detects the first reflected light 202, which includes at least part of the direct reflection of the beam 201 by the target object 50. If the emitted beam 201 is scattered toward other areas of the target object 50 after striking it, the collection module 12 also detects the second reflected light 203, which has a longer flight path than the first reflected light 202. Similarly, the emitted beam 201 may be scattered more than once, so the collection module 12 may detect the third reflected light 204, whose flight path is longer than that of the second reflected light 203, and even reflected light over additional paths; this results in the "multipath" condition. Because the directly reflected light and the indirectly reflected light have different flight times, multipath interference causes the depth value obtained by the corresponding pixel to deviate.
Referring to figs. 3a-3d, which show the dot matrix patterns projected by the emission module in some embodiments of the present invention. The dot pattern 30 includes a real dot matrix formed by real light spots and a virtual dot matrix formed by areas not irradiated by light spots; for convenience of description, the areas not irradiated by light spots are represented below by virtual light spots, that is, the real light spots form the real dot matrix and the virtual light spots form the virtual dot matrix. It should be understood that the virtual light spot in this embodiment is an abstraction introduced to explain the real light spot arrangement rule simply and clearly, and should not be understood literally as an actually projected light spot.
As shown in figs. 3a and 3b, the dot matrix pattern 30 may be hexagonal, as shown by the dotted lines; it may of course also be quadrilateral, as shown in fig. 3d. In fig. 3a, the dot matrix pattern 30 has a virtual light spot 302 between every two real light spots 301 in the even rows, and the virtual dot matrix formed by the plurality of virtual light spots 302 is arranged in a cross manner. In the dot matrix pattern 30 shown in fig. 3b, a virtual light spot 302 is likewise arranged between every two real light spots 301 in the even rows, and the pattern formed by the plurality of virtual light spots 302 is a plurality of squares. In fig. 3c, the dot matrix pattern 30 has a virtual light spot 302 arranged after every two real light spots 301 in the even rows; as shown by the dotted line, the pattern formed by the plurality of virtual light spots 302 is a plurality of rectangles. The dot matrix pattern 30 shown in fig. 3d is quadrilateral, wherein a virtual light spot 302 is arranged between every two real light spots 301 in the even rows, and the pattern formed by the plurality of virtual light spots 302 is a plurality of squares. It should be understood that the positions of the virtual light spots and the real light spots are illustrated only to show the diversity of the dot matrix patterns they can form; the virtual light spots are not limited to odd rows, even rows, or other particular positions, and the light spots are not necessarily circular but may also have other shapes such as an ellipse or a square.
Figs. 3a to 3d show that the pattern formed by the plurality of real light spots 301 around a single virtual light spot 302 can be a hexagon, a quadrangle, or any other shape; the real dot matrix and the virtual dot matrix are staggered, and the number of real light spots 301 is greater than the number of virtual light spots 302, so that the dot matrix pattern projected by the emission module 11 can reduce the multipath effect while improving the image resolution. It is understood that the real dot matrix and the virtual dot matrix may be arranged regularly or irregularly; a regular arrangement is preferred so that the depth value distribution is more regular.
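To make the arrangement concrete, the sketch below builds a toy lattice loosely modeled on fig. 3a: a regular grid of real spots in which every other position on the even rows is left unilluminated (a "virtual" spot). The grid size, spacing, and indexing are illustrative assumptions only, not the projector layout of this device.

```python
import numpy as np

rows, cols = 8, 8
REAL, VIRTUAL = 1, 0

pattern = np.full((rows, cols), REAL, dtype=np.uint8)
pattern[1::2, ::2] = VIRTUAL   # even rows (1-based): every second position left dark

n_real = int((pattern == REAL).sum())
n_virtual = int((pattern == VIRTUAL).sum())
assert n_real > n_virtual      # real spots outnumber virtual ones, as described above
```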
Taking the case in which the emitting module projects the dot pattern shown in fig. 3d onto the target object as an example, the image sensor 121 detects a first reflected light signal produced by the real light spots 301 reflected by the target object 20 and a second reflected light signal that is not directly reflected back by a real light spot, and the control and processor filters the first reflected light signal based on the second reflected light signal. It can be understood that the second reflected light signal is a stray light signal, while the first reflected light signal includes both the light signal directly reflected by the real light spot and the stray light signal; filtering the stray light component out of the first reflected light signal based on the second reflected light signal yields the light signal directly reflected by the real light spot (i.e., the aforementioned third reflected light signal), thereby improving the signal-to-noise ratio of the image.
As shown in fig. 3d, the emitting module 11 projects the dot pattern 30 onto the target object 20, where the dot pattern 30 includes a plurality of real light spots 301 (indicated by solid lines) and a plurality of virtual light spots 302 (indicated by dashed lines). One part of the pixels of the pixel array of the image sensor 121 collects the first reflected light signal produced by the plurality of real light spots 301 reflected by the target object 20, and another part of the pixels collects the second reflected light signal that is not directly reflected back by a real light spot, as shown in fig. 4. For convenience of explanation, it is assumed that each real light spot 301 and each virtual light spot 302 occupies approximately 2 × 2 = 4 pixels; in practice, the real light spots 301 and the virtual light spots 302 may have other sizes. It can be understood that the more densely the real light spots are distributed, the fewer pixels the virtual light spots occupy, and the higher the resolution of the calculated depth map. It should be noted that "fewer pixels occupied by the virtual light spots" here refers to a dot matrix pattern whose real light spots are relatively densely distributed; it is not a comparison between the number of pixels occupied by a virtual light spot and the number occupied by a real light spot.
Specifically, the light received by a pixel corresponding to a real light spot 301 includes the light signal directly reflected by the real light spot 301 as well as a stray light signal caused by multipath reflections or background light, whereas the photons received by a pixel corresponding to a virtual light spot 302 include only the stray light signal. Because the energy of the light signal directly reflected by the real light spot is greater than that of the stray light, the signal intensity at the pixels occupied by the real light spots 301 is significantly higher than at the pixels occupied by the virtual light spots 302; the control and processor 13 can therefore filter the stray light out of the signal received by the pixels occupied by the real light spots 301 based on the stray light intensity at the pixels occupied by the virtual light spots 302.
As shown in fig. 5, a detection threshold may be set to find the pixels occupied by the virtual light spots 302. The acquisition module 12 detects the peak intensity 503 at each real light spot 301 and the stray light signal intensity 501 at the pixels occupied by the virtual light spots 302; the control and processor 13 can find the pixels occupied by the virtual light spots by setting the detection threshold 502. It is understood that the peak intensity 503 (i.e., the aforementioned first reflected light signal) is the sum of the intensity of the light signal directly reflected by the real light spot and the stray light signal intensity 501, and the stray light signal intensity 501 corresponds to the aforementioned second reflected light signal. Therefore, the stray light component contained in the peak intensity 503 is filtered out based on the stray light signal intensity 501, so as to obtain the light signal directly reflected by the real light spot.
As shown in fig. 6, given the pixel value 601 produced by the first light signal reflected by the real light spot and the pixel value 602 produced by the stray light signal, the pixel value 601 is filtered according to the average value of the stray-light pixels to obtain the pixel value 603 of the light signal directly reflected by the real light spot, thereby improving the signal-to-noise ratio of the image.
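A minimal sketch of this filtering step, under assumptions: pixels below a detection threshold are treated as virtual-spot pixels carrying only stray light, their mean intensity is taken as the stray-light level, and that level is subtracted from the real-spot pixels to leave the directly reflected signal. The threshold value and the use of a single global mean are illustrative choices, not requirements of the text.

```python
import numpy as np

def filter_stray_light(raw: np.ndarray, threshold: float) -> np.ndarray:
    """Return an estimate of the directly reflected signal (the 'third' signal)."""
    virtual_mask = raw < threshold                 # virtual-spot pixels: stray light only
    real_mask = ~virtual_mask                      # real-spot pixels: direct + stray light
    stray_level = float(raw[virtual_mask].mean()) if virtual_mask.any() else 0.0

    direct = np.zeros_like(raw, dtype=float)
    direct[real_mask] = np.clip(raw[real_mask] - stray_level, 0.0, None)
    return direct
```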
In one embodiment, the control and processor 13 may calculate the phase difference based on the optical signal directly reflected by the real light spot to obtain a first depth map, calculate the depth value on the pixel corresponding to the real light spot in the first depth map, and interpolate the pixel corresponding to the virtual light spot by using the depth value of the real light spot to obtain a second depth map with higher resolution. It will be appreciated that the control and processor 13 may set a detection threshold for the depth values according to the method shown in fig. 5, pixels above the detection threshold are valid pixels, i.e. pixels corresponding to real light spots, and then find pixels below the detection threshold around the pixels to interpolate the pixels below the detection threshold.
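The interpolation itself is not specified further; as one plausible reading, the sketch below fills each below-threshold (virtual-spot) pixel from the mean of valid neighbouring depths in a 3×3 window. The window size and averaging strategy are assumptions made for illustration.

```python
import numpy as np

def densify_depth(depth: np.ndarray, valid: np.ndarray) -> np.ndarray:
    """Fill invalid (virtual-spot) pixels from nearby valid real-spot depths."""
    out = depth.copy()
    rows, cols = depth.shape
    for r in range(rows):
        for c in range(cols):
            if valid[r, c]:
                continue
            r0, r1 = max(r - 1, 0), min(r + 2, rows)
            c0, c1 = max(c - 1, 0), min(c + 2, cols)
            neighbours = valid[r0:r1, c0:c1]
            if neighbours.any():
                out[r, c] = float(depth[r0:r1, c0:c1][neighbours].mean())
    return out   # the higher-resolution "second depth map"
```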
Referring to fig. 7, another embodiment of the present invention further provides a TOF depth measuring method, and fig. 7 is a flowchart illustration of the TOF depth measuring method of the present embodiment, where the method includes the following steps:
s701, projecting a dot matrix pattern to a target object by an emission module, wherein the dot matrix pattern comprises a real dot matrix formed by real light spots and a virtual dot matrix formed by a region without light spot irradiation;
specifically, the transmitting module projects lattice patterns to the target object, and the number of real lattices in the lattice patterns is larger than that of virtual lattices; the real dot matrix and the virtual dot matrix are regularly arranged and are arranged in a crossed manner, and dot matrix patterns formed by a plurality of real light spots around a single light spot of the virtual dot matrix can be quadrangle or hexagon.
S702, receiving a reflected light signal reflected by a target object by an acquisition module, wherein the acquisition module comprises an image sensor consisting of a pixel array, one part of pixels of the pixel array detect a first reflected light signal reflected by the real light spot, and the other part of pixels detect a second reflected light signal which is not directly reflected by the real light spot;
In some embodiments, one part of the pixels of the pixel array detects at least part of the light signal of the real light spots directly reflected back by the target object, while another part of the pixels detects light originating from background light or from scattering of the real light spots.
And S703, the control and processor filters the first reflected light signal based on the second reflected light signal to obtain a third reflected light signal, and calculates a phase difference based on the third reflected light signal to obtain a first depth map of the target object.
Specifically, the control and processor may calculate a phase difference based on the optical signal directly reflected by the real light spot to obtain a first depth map, calculate a depth value on a pixel corresponding to the real light spot in the first depth map, and interpolate the pixel corresponding to the virtual light spot using the depth value of the real light spot to obtain a second depth map with a higher resolution. It will be appreciated that the control and processor 13 may set a threshold value for the depth values above which pixels are valid pixels, i.e. pixels corresponding to real spots, and then look for pixels below the threshold around this pixel to interpolate the pixels below the threshold.
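Putting S701-S703 together, the following non-authoritative outline composes the illustrative helpers sketched earlier (filter_stray_light and densify_depth) with a per-pixel phase estimate; the tap ordering, threshold, and modulation frequency are all assumptions rather than parameters fixed by the method.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_depth_frame(raw_intensity, taps, threshold, f_mod_hz=100e6):
    """taps = (a0, a90, a180, a270); each array has the same shape as raw_intensity."""
    valid = raw_intensity >= threshold                  # real-spot pixels
    stray = ~valid                                      # virtual-spot pixels

    # S702 / fig. 6: subtract the stray-light level estimated on virtual-spot pixels
    cleaned = [t - (float(t[stray].mean()) if stray.any() else 0.0) for t in taps]
    a0, a90, a180, a270 = cleaned

    # S703: per-pixel phase difference -> first depth map on real-spot pixels only
    phase = np.mod(np.arctan2(a270 - a90, a0 - a180), 2 * np.pi)
    depth = C * phase / (4 * np.pi * f_mod_hz)
    first_depth_map = np.where(valid, depth, 0.0)

    # Interpolate virtual-spot pixels to obtain the higher-resolution second map
    return densify_depth(first_depth_map, valid)
```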
As another embodiment of the present invention, there is also provided an electronic device, which may be a desktop device, a desktop-mounted device, a portable device, a wearable device, a vehicle-mounted device, a robot, or the like. In particular, the device may be a laptop or another electronic device that supports gesture recognition or biometric recognition. In other examples, the device may be a head-mounted device for identifying objects or hazards in the user's surroundings for safety; for example, in a virtual reality system that blocks the user's view of the environment, surrounding objects or hazards may be detected so as to warn the user about nearby objects or obstacles. In other examples, the device may be a mixed reality system that mixes virtual information and images with the user's surroundings, in which objects or people in the user's environment may be detected in order to integrate the virtual information with the physical environment and objects. In other examples, the device may be applied to the field of unmanned driving and the like. Referring to fig. 8 and taking a mobile phone as an example, the electronic device 800 includes a housing 81, a screen 82, and the TOF depth measuring device according to the foregoing embodiment; the emission module 11 and the collection module 12 of the TOF depth measuring device are disposed on the same surface of the electronic device 800, and are configured to emit a flood light beam toward the target object and receive the flood light beam reflected back by the target object to form an electrical signal.
Embodiments of the present application also provide a storage medium for storing a computer program, which when executed performs at least the method described above.
The storage medium may be implemented by any type of volatile or non-volatile storage device, or a combination thereof. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferroelectric Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be a magnetic disk memory or a magnetic tape memory. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The storage media described in connection with the embodiments of the invention are intended to include, without being limited to, these and any other suitable types of memory.
The foregoing is a further detailed description of the invention with reference to specific/preferred embodiments, and it is not intended that the invention be limited to the specific embodiments disclosed. For those skilled in the art to which the invention pertains, a number of substitutions or modifications can be made to the described embodiments without departing from the concept of the invention, and these substitutions or modifications should be regarded as falling within the protection scope of the invention. In the description herein, references to the terms "one embodiment," "some embodiments," "preferred embodiment," "an example," "a specific example," or "some examples" mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention.
In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction. Although the embodiments of the present invention and their advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the scope of the invention as defined by the appended claims.
Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods and steps described in the specification. One of ordinary skill in the art will readily appreciate that the above-disclosed, presently existing or later to be developed, processes, machines, manufacture, compositions of matter, means, methods, or steps, that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.

Claims (8)

1. A TOF depth measurement apparatus, comprising:
the transmitting module is used for projecting a dot matrix pattern to a target object; the dot matrix pattern comprises a real dot matrix formed by real light spots and a virtual dot matrix formed by virtual light spots;
the acquisition module is used for receiving a reflected light signal reflected by the target object; the acquisition module comprises an image sensor consisting of a pixel array, wherein one part of pixels of the pixel array are used for detecting a first reflected light signal reflected back by the real light spot, and the other part of pixels are used for detecting a second reflected light signal not directly reflected back by the real light spot;
and the control and processor is respectively connected with the transmitting module and the collecting module, filters the first reflected light signal according to the second reflected light signal to obtain a third reflected light signal, and calculates a phase difference based on the third reflected light signal to obtain a first depth map of the target object.
2. The TOF depth measuring device of claim 1, wherein: the dot matrix pattern comprises a plurality of real light spots and a plurality of virtual light spots; wherein the number of real light spots is larger than the number of virtual light spots.
3. The TOF depth measuring device of claim 1, wherein: the real dot matrix and the virtual dot matrix are regularly arranged.
4. The TOF depth measuring device of claim 1, wherein: the lattice pattern formed by a plurality of real light spots around a single light spot of the virtual light spots is hexagonal, quadrangular or other arbitrary shapes.
5. The TOF depth measuring device of claim 1, wherein: the real dot matrix and the virtual dot matrix are arranged in a staggered mode.
6. The TOF depth measuring device of claim 2, wherein: the dot matrix pattern is configured to arrange a virtual light spot between every two real light spots on even rows of the dot matrix pattern, and the virtual dot matrix pattern formed by a plurality of virtual light spots is arranged in a cross mode or a plurality of squares.
7. The TOF depth measuring device of claim 1, wherein: the light received by the pixel corresponding to the real light spot comprises a light signal directly reflected by the real light spot and a stray light signal; the photons received by the pixel corresponding to the virtual light spot only include stray light signals.
8. An electronic device, comprising: a housing, a screen, and a TOF depth measuring device of any one of claims 1-7; the emission module and the collection module of the TOF depth measuring device are arranged on the same surface of the electronic equipment and used for emitting floodlight beams to a target object and receiving the floodlight beams reflected back by the target object and forming electric signals.
CN202020596333.6U 2020-04-20 2020-04-20 TOF depth measuring device and electronic equipment Active CN212694034U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202020596333.6U CN212694034U (en) 2020-04-20 2020-04-20 TOF depth measuring device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202020596333.6U CN212694034U (en) 2020-04-20 2020-04-20 TOF depth measuring device and electronic equipment

Publications (1)

Publication Number Publication Date
CN212694034U true CN212694034U (en) 2021-03-12

Family

ID=74893447

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202020596333.6U Active CN212694034U (en) 2020-04-20 2020-04-20 TOF depth measuring device and electronic equipment

Country Status (1)

Country Link
CN (1) CN212694034U (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111366941A (en) * 2020-04-20 2020-07-03 深圳奥比中光科技有限公司 TOF depth measuring device and method
CN113687371A (en) * 2021-08-09 2021-11-23 Oppo广东移动通信有限公司 Light emission module, depth camera and terminal
WO2023036131A1 (en) * 2021-09-10 2023-03-16 维沃移动通信有限公司 Image information acquisition module, information processing method and apparatus, and electronic device



Legal Events

Date Code Title Description
GR01 Patent grant