CN106997603B - Depth camera based on VCSEL array light source - Google Patents


Info

Publication number
CN106997603B
Authority
CN
China
Prior art keywords
structured light, light pattern, pattern, window, light source
Prior art date
Legal status
Active
Application number
CN201710359556.3A
Other languages
Chinese (zh)
Other versions
CN106997603A (en)
Inventor
肖振中 (Xiao Zhenzhong)
许星 (Xu Xing)
Current Assignee
Shenzhen Orbbec Co Ltd
Original Assignee
Shenzhen Orbbec Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Orbbec Co Ltd filed Critical Shenzhen Orbbec Co Ltd
Priority to CN201710359556.3A priority Critical patent/CN106997603B/en
Publication of CN106997603A publication Critical patent/CN106997603A/en
Application granted granted Critical
Publication of CN106997603B publication Critical patent/CN106997603B/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Abstract

The invention provides a depth camera based on a VCSEL array light source, comprising: a structured light projection module, which includes a VCSEL array light source and emits a structured light pattern into space, the structured light pattern comprising a sparse structured light pattern, a dense structured light pattern, or a combination of the two; an acquisition module for collecting the structured light pattern reflected by a target; and a processor for computing a depth image from the structured light pattern. Because the VCSEL array light source can emit sparse, dense, or combined structured light patterns into space, and the collected patterns can be matched at different scales, a depth image with both high precision and a high frame rate can be obtained, giving the depth camera a wide range of applications.

Description

Depth camera based on VCSEL array light source
Technical Field
The invention relates to the technical field of optics and electronics, in particular to a depth camera based on a VCSEL array light source.
Background
3D imaging technology, especially in the consumer field, is steadily displacing conventional 2D imaging. Beyond forming a 2D image of a target object, 3D imaging can also acquire the object's depth information, from which functions such as 3D scanning, scene modeling, and gesture interaction can be realized. Depth cameras, in particular structured light depth cameras and TOF (time of flight) depth cameras, are the hardware devices currently most commonly used for 3D imaging.
The core component of a depth camera is the laser projection module, whose structure and function vary with the type of depth camera. For example, the projection module disclosed in patent CN201610977172A projects a speckle pattern into space to realize structured light depth measurement; the speckle-based structured light depth camera is a mature and widely adopted scheme at present. As the application fields of depth cameras continue to expand, optical projection modules will keep evolving toward smaller volume and higher performance.
The VCSEL (vertical cavity surface emitting laser) array light source is the first choice for the light source of a depth camera laser projection module owing to its small volume, high power, small beam divergence angle, and stable operation. A VCSEL array can place many emitters on an extremely small substrate, for example 100 or more VCSEL light sources on a 2 mm x 2 mm semiconductor substrate. In the projection module of a structured light depth camera, especially one based on a speckle pattern, the VCSELs not only provide illumination; their arrangement also directly determines the structured light speckle pattern finally projected onto the target, and hence the measurement accuracy and speed of the depth camera.
Existing schemes mostly use VCSEL array light sources in irregular arrangements, and the density of the arrangement also affects the projected structured light pattern. A sparsely arranged VCSEL array produces a pattern of relatively low density, and the accuracy of the resulting depth image is correspondingly low; a densely arranged array produces a dense structured light pattern, but computing depth from a dense pattern by structured light triangulation takes longer, which reduces the output frame rate of the depth image. In current schemes, therefore, the accuracy and the frame rate of the depth image are in contradiction, and it is difficult to acquire a depth image with both high accuracy and a high frame rate.
Disclosure of Invention
To address the difficulty, in the prior art, of simultaneously obtaining a depth image with high precision and a high frame rate, the invention provides a depth camera based on a VCSEL array light source.
In order to solve the above problems, the technical solution adopted by the present invention is as follows:
the invention provides a depth camera based on a VCSEL array light source, comprising: a structured light projection module, which includes a VCSEL array light source and emits a structured light pattern into space, the structured light pattern comprising a sparse structured light pattern, a dense structured light pattern, or a combination of the two; an acquisition module for collecting the structured light pattern reflected by a target; and a processor for computing a depth image from the structured light pattern.
In some embodiments, the processor performs a matching calculation, using a first window, on a structured light pattern containing the sparse structured light pattern to obtain a coarse deviation value and/or a coarse depth image; and performs a matching calculation, using a second window, on a structured light pattern containing the dense structured light pattern to obtain a fine deviation value and/or a fine depth image.
In some embodiments, the structured light pattern includes a mixed structured light pattern consisting of a sparse structured light pattern and a dense structured light pattern. The processor performs a matching calculation on the mixed structured light pattern using a first window to obtain a coarse deviation value and/or a coarse depth image; the processor then performs a matching calculation on the mixed structured light pattern using a second window to obtain a fine deviation value and/or a fine depth image.
In some embodiments, the structured light pattern comprises a sparse structured light pattern and a dense structured light pattern. The processor performs a matching calculation on the sparse structured light pattern using a first window to obtain a coarse deviation value and/or a coarse depth image; the processor then performs a matching calculation on the dense structured light pattern using a second window to obtain a fine deviation value and/or a fine depth image.
In some embodiments, the structured light pattern comprises a sparse structured light pattern and a mixed structured light pattern consisting of a sparse structured light pattern and a dense structured light pattern. The processor performs a matching calculation on the sparse structured light pattern using a first window to obtain a coarse deviation value and/or a coarse depth image; the processor then performs a matching calculation on the mixed structured light pattern using a second window to obtain a fine deviation value and/or a fine depth image.
In some embodiments, the structured light pattern comprises a dense structured light pattern and a mixed structured light pattern consisting of a sparse structured light pattern and a dense structured light pattern. The processor performs a matching calculation on the mixed structured light pattern using a first window to obtain a coarse deviation value and/or a coarse depth image; the processor then performs a matching calculation on the dense structured light pattern using a second window to obtain a fine deviation value and/or a fine depth image.
In some implementations, the brightness of the pattern units in the sparse structured light pattern is greater than the brightness of the pattern units in the dense structured light pattern.
In some embodiments, the first window is larger than the second window. In some embodiments, the fine deviation value and the fine depth image are obtained by a matching calculation with the second window that takes the coarse deviation value or the coarse depth image as its starting point.
The invention has the following beneficial effects: the depth camera based on a VCSEL array light source can emit into space a structured light pattern comprising a sparse structured light pattern, a dense structured light pattern, or a combination of the two, and by collecting these patterns and performing matching calculations on them, a depth image with high precision and a high frame rate can be obtained, giving the depth camera a wider range of applications.
Drawings
FIG. 1 is a side view of a structured light depth camera system of an embodiment of the present invention.
FIG. 2 is a side view of a structured light projection module in accordance with an embodiment of the present invention.
FIG. 3 is a schematic diagram of a structured light projection apparatus according to an embodiment of the present invention.
FIG. 4 is a schematic diagram of a sparse structured light pattern of an embodiment of the present invention.
FIG. 5 is a schematic diagram of a dense structured light pattern according to an embodiment of the present invention.
FIG. 6 is a schematic diagram of a combined structured light pattern according to an embodiment of the present invention.
Fig. 7 is a timing diagram illustrating the control of the structured light projection module and the structured light collection module according to an embodiment of the present invention.
FIG. 8 is a timing diagram illustrating the control of the structured light projection module and the acquisition module according to another embodiment of the present invention.
Fig. 9 is a depth image acquisition step according to an embodiment of the present invention.
FIG. 10 is a timing diagram illustrating the control of the structured light projection module and the collection module according to a third embodiment of the present invention.
FIG. 11 is a further depth image acquisition step of an embodiment of the present invention.
FIG. 12 is a timing diagram illustrating the control of the projection module and the collection module according to a fourth embodiment of the present invention.
FIG. 13 illustrates yet another depth image acquisition step according to an embodiment of the present invention.
FIG. 14 is a timing diagram illustrating the control of the projection module and the collection module according to a fifth embodiment of the present invention.
Fig. 15 is a fourth depth image acquisition step of an embodiment of the present invention.
FIG. 16 is a timing diagram illustrating the control of the structured light projection module and the collection module according to a sixth embodiment of the present invention.
101-depth camera, 102-processor, 103-circuit board, 104-structured light projection module, 105-acquisition module, 106-interface, 107-RGB camera, 108-light in/out window, 201-substrate, 202-light source, 203-lens unit, 204-spot pattern generator, 301-VCSEL array light source, 302-subarray, 303-further subarray, 304-diffractive optical element, 305-diffractive optical element unit, 306-further diffractive optical element unit, 307-projection area, 308-speckle pattern, 309-further speckle pattern, 401-larger window, 501-smaller window, 601-hybrid speckle pattern.
Detailed Description
The present invention is described in detail below with reference to specific embodiments so that it may be better understood; the embodiments do not, however, limit the scope of the invention. It should be noted that the drawings provided with the following embodiments only illustrate the basic concept of the invention: they show only the components related to the invention rather than the actual number, shape, and size of components in a real implementation, where the shape, number, and proportions of the components may be changed freely and the layout may be more complicated.
FIG. 1 is a schematic side view of a structured light based depth camera. The depth camera 101 mainly consists of a structured light projection module 104, an acquisition module 105, a circuit board 103, and a processor 102; some depth cameras also include an RGB camera 107. The structured light projection module 104, the acquisition module 105, and the RGB camera 107 are typically mounted on a fixed support in the same plane of the depth camera and aligned along the same baseline, with each module or camera behind its own light entrance/exit window 108. Generally, the processor 102 is mounted on the circuit board 103, and the structured light projection module 104 and the acquisition module 105 are each connected to the board through an interface 106, which may be a DVP interface, a MIPI interface, or the like. The circuit board 103 may be a PCB, a semiconductor substrate, or a similar board. The structured light projection module 104 projects a coded structured light pattern into the target space; the acquisition module 105 captures the structured light image, which the processor then processes into a depth image of the target space. In one embodiment, the structured light image is an infrared laser speckle pattern whose grains are distributed fairly uniformly yet with high local irrelevancy, meaning that each sub-region of the pattern along a given direction (generally the direction of the line connecting the laser projection module and the acquisition module) is highly unique. The corresponding acquisition module 105 is an infrared camera matched to the structured light projection module 104.
The processor acquires the depth image as follows: it receives the speckle pattern captured by the acquisition module and calculates the deviation values between that pattern and a reference speckle pattern, from which the depth image is obtained. Besides depth calculation, the processor 102 also controls the operation of the other components, for example turning the modules on synchronously at a specific frequency.
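As a minimal sketch of this deviation-to-depth step, the standard structured light triangulation relation can be written as follows; the baseline, focal length, and reference distance are illustrative values, not taken from the patent:

```python
def depth_from_disparity(disparity_px, baseline_mm=50.0, focal_px=580.0,
                         ref_depth_mm=1000.0):
    """Convert a deviation (disparity) against the reference speckle image,
    measured in pixels, into a depth value via triangulation:
        Z = Z0 / (1 + Z0 * d / (f * b))
    where Z0 is the reference-plane distance, f the focal length in pixels,
    and b the baseline between the projection and acquisition modules."""
    return ref_depth_mm / (1.0 + ref_depth_mm * disparity_px
                           / (focal_px * baseline_mm))

# Zero deviation means the target lies on the reference plane.
print(depth_from_disparity(0.0))  # 1000.0
# A positive deviation corresponds to a target closer than the reference plane.
print(depth_from_disparity(5.0))
```

The same relation explains why fine deviation values translate directly into fine depth values: depth precision is limited by how precisely the deviation can be matched.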
The depth camera shown in FIG. 1 may be a standalone depth camera device or an embedded depth camera. The depth camera further includes an output interface (not shown), such as USB or MIPI, connected to the processor and used to output the depth image to a host device or to another module in the same device.
Structured light projection module
FIG. 2 shows an embodiment of the structured light projection module 104 of FIG. 1. The structured light projection module 104 includes a substrate 201, a light source 202, a lens unit 203, and a speckle pattern generator 204. The substrate 201 is typically a semiconductor substrate, such as a wafer, on which a plurality of light sources 202 are arranged; the substrate 201 and the light sources 202 together constitute a laser array, e.g., a VCSEL array chip. The light source 202 comprises a plurality of sub-light-sources emitting a plurality of sub-beams; the light may be visible, or invisible such as infrared or ultraviolet laser light, and the emitters may be edge emitting lasers or vertical cavity surface emitting lasers (VCSELs). To keep the overall projection apparatus small, the preferred scheme is a VCSEL array as the light source. Different types of VCSEL emitters can also be arranged on the same substrate, differing for example in shape, size, and brightness. For convenience of illustration, only 3 sub-light-sources are drawn along one dimension; in fact the VCSEL array light source is a two-dimensional source arranged in a fixed two-dimensional pattern. The VCSEL array chip may be a bare chip or a packaged chip: the bare chip has smaller volume and thickness, while the packaged chip has better stability and is easier to connect.
In order for the pattern emitted by the structured light projection module 104 to be uniform and locally unique, the arrangement of the VCSEL array chip must be irregular; that is, the light sources are arranged not in a regular array but in a certain irregular pattern. In one embodiment, the VCSEL array chip measures only a few millimeters overall, such as 2 mm x 2 mm, and carries tens or even hundreds of light sources whose mutual spacing is on the order of micrometers, such as 30 μm.
The lens unit 203 receives the beams emitted by the VCSEL array light source 202 and converges them; in one embodiment it collimates the diverging VCSEL beams into parallel beams so that the projected spots have more concentrated energy. Besides a single lens, in another embodiment a micro lens array (MLA) may be used, in which each micro lens unit corresponds to one light source 202, or one micro lens unit corresponds to multiple light sources 202; in yet another embodiment, a lens group may be used to achieve the beam convergence.
The speckle pattern generator 204 receives the beams from the lens and emits outward beams that form a speckle pattern. In one embodiment, the speckle pattern generator 204 is a diffractive optical element (DOE) that acts as a beam splitter: for example, when there are 100 light sources 202, i.e., 100 beams reach the DOE via the lens, a DOE with a replication factor of, say, 200 expands them into 20000 beams emitted into space, and ideally 20000 spots will be seen (in some cases spots overlap, slightly reducing the count). Instead of a DOE, any other optical element that can form spots may be used, such as an MLA, a grating, or a combination of optical elements.
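The spot-count arithmetic used here and in the embodiments below is simply the source count times the DOE replication factor; a quick sanity check, using the numbers quoted in this description:

```python
def projected_spot_count(num_sources, doe_replication):
    """Ideal number of projected spots: each source beam is replicated
    doe_replication times by the DOE (overlapping spots ignored)."""
    return num_sources * doe_replication

print(projected_spot_count(100, 200))  # 20000 beams, as in the example above
print(projected_spot_count(50, 100))   # 5000 grains (speckle pattern 308 below)
print(projected_spot_count(200, 200))  # 40000 grains (speckle pattern 309 below)
```

As the text notes, overlapping replicas make these counts upper bounds rather than exact totals.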
In some embodiments, the lens unit 203 and the speckle pattern generator 204 may be fabricated on the same optical element to reduce the overall volume.
FIG. 3 is a schematic diagram of a structured light projection module 104 according to yet another embodiment of the invention, and illustrates the principle of the invention more intuitively than FIG. 2. The structured light projection module 104 consists of a VCSEL array light source 301 and a diffractive optical element 304; a lens is generally also included but is omitted from the figure for clarity. The lens can be placed between the light source 301 and the diffractive optical element 304, as in FIG. 2, to collimate the source beams; alternatively, it can be placed on the outside, i.e., on the side of the diffractive optical element 304 facing away from the light source 301, where it acts as a projection lens that projects the beams leaving the DOE 304 into the target space.
In the present embodiment, the VCSEL array light source 301 is formed by arranging a plurality of VCSEL light sources in an irregular pattern on a semiconductor substrate. The VCSEL array light source 301 is divided into a sub-array 302 and a further sub-array 303, the sub-arrays being spatially separated from each other; in the figure the two sub-arrays sit to the left and right of a separation line drawn in the middle, which is shown merely for illustration and is not necessarily present in the physical light source. In other embodiments there may be more than two sub-arrays. The further sub-array 303 has a larger number of more closely spaced VCSEL light sources than the sub-array 302; in addition, different sub-arrays may differ in wavelength, light source shape, emission power, and so on.
The diffractive optical element 304 likewise consists of a subunit 305 and a further subunit 306, each subunit corresponding one-to-one with a sub-array; in this embodiment, the subunit 305 and the further subunit 306 correspond to the sub-array 302 and the further sub-array 303, respectively, and each subunit replicates (splits) the beams emitted by its corresponding sub-array by a certain multiple and projects them into a spatial region. The one-to-one correspondence applies to the beams: the subunit 305 splits only the beams emitted by the sub-array 302, and the subunit 306 splits only the beams emitted by the sub-array 303. In FIG. 3, similarly to the light source 301, the subunit 305 is physically separated left and right from the further subunit 306 so as to receive the beams of the sub-array 302 and the further sub-array 303; other physical or non-physical arrangements of the diffractive optical element 304 are not excluded, however.
In an alternative form of this embodiment, the VCSEL array light source can combine a variety of irregular arrangement patterns with a separation between them, the subunits of the diffractive optical element corresponding one-to-one with each arrangement. The speckle pattern 308 and the further speckle pattern 309 both cover the projection area 307 of the structured light projection module 104; in other words, the subunit 305 and the further subunit 306 have the same projection field of view, so the projected speckle patterns overlap on the projection area 307. The speckle pattern 308 and the further speckle pattern 309 are formed by the sub-array 302 and the further sub-array 303 via the subunit 305 and the further subunit 306, respectively. In one embodiment, the sub-array 302 consists of 50 irregularly arranged VCSEL light sources and the replication multiple of the corresponding diffractive optical element subunit 305 is 100, so the irregular arrangement of 50 sources is replicated into 100 identical irregular patterns; depending on the design of the diffractive optical element 304, these 100 patterns may be distributed adjacently or with overlap to form the speckle pattern 308. In principle the speckle pattern 308 contains 5000 speckle grains, although when the replicated patterns overlap, a small number of grains coincide and slightly reduce the total.
Similarly, the further sub-array 303 consists of 200 irregularly arranged VCSEL light sources, and the replication multiple of the corresponding subunit 306 is 200, so the resulting further speckle pattern 309 contains 40000 speckle grains; again, when the replicated patterns overlap, a few grains may coincide and slightly reduce the total.
If the VCSEL source power is the same, the smaller the replication multiple of the diffractive optical element 304, the higher the intensity of the resulting speckle grains. In the embodiment of FIG. 3, the replication multiple of the subunit 305 is smaller than that of the further subunit 306, so the speckle pattern 308 has a stronger intensity than the further speckle pattern 309, represented in FIG. 3 by the larger circular spots. In a preferred embodiment, the DOE subunit corresponding to the sub-array with fewer light sources also has a small replication multiple, so that the speckle pattern 308 is sparse but bright, while the speckle pattern 309 formed by the sub-array with more light sources is dense but dim. One reason is that when the density is high, excessive brightness degrades the contrast of the speckle pattern and therefore the accuracy of the depth calculation. A further reason is that depth calculation accuracy can be improved by matching with multiple window sizes, as described later. In other embodiments, a sub-array with many VCSEL sources combined with a small replication multiple in the corresponding DOE subunit can also produce a low-density, high-intensity speckle pattern.
In one embodiment, the light emitting power of each VCSEL light source in the sub-array 302 is greater than the power of each VCSEL light source in the further sub-array 303, so that the high-brightness speckle pattern 308 and the low-brightness speckle pattern 309 can be obtained.
The sub-array 302 and the further sub-array 303 may be controlled individually or simultaneously. When the sub-array 302 is turned on alone, the speckle pattern in the projection area 307 is as shown in FIG. 4; when the further sub-array 303 is turned on alone, the speckle pattern is as shown in FIG. 5; and when the sub-arrays 302 and 303 are turned on synchronously, the mixed speckle pattern 601 shown in FIG. 6 is obtained.
As described above, the structured light projection module 104 of the embodiment of FIG. 3 can operate in at least three modes, turning the two light source sub-arrays on individually or together, forming speckle patterns that are respectively bright but sparse, dim but dense, or mixed. The three modes suit different applications. For the dense pattern of FIG. 5 or the mixed pattern of FIG. 6, the high density allows a smaller window 501 to be selected for the matching calculation, yielding a depth image of high precision and high resolution; the drawback is that each pixel requires many iterations to find the best match, and in general the smaller the window, the more iterations, so a high frame rate is difficult to achieve. For the sparse pattern of FIG. 4, the grains are sparse, so a sufficiently large window 401 must be chosen to ensure that the speckle sub-pattern inside the window remains unique; the matching calculation is then usually simpler and faster, but the precision of the resulting depth image is lower, and sometimes the resolution is reduced as well.
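The window matching described above can be sketched as a one-dimensional search along the baseline using a sum-of-absolute-differences cost. The pattern, window size, and shift below are toy values chosen only to illustrate the idea, not parameters from the patent:

```python
import random

random.seed(7)
W, H = 64, 16
# Synthetic "reference" speckle image: random sparse dots (True = bright grain).
ref = [[random.random() < 0.2 for _ in range(W)] for _ in range(H)]
true_shift = 3
# "Captured" image: the reference shifted along the baseline by true_shift pixels.
cap = [[ref[y][(x - true_shift) % W] for x in range(W)] for y in range(H)]

def sad(y0, x0, shift, win):
    """Sum of absolute differences between a captured window at (y0, x0)
    and the reference window displaced by `shift` along the baseline."""
    total = 0
    for dy in range(win):
        for dx in range(win):
            total += abs(int(cap[y0 + dy][x0 + dx])
                         - int(ref[y0 + dy][(x0 + dx - shift) % W]))
    return total

def best_shift(y0, x0, win, max_shift=8):
    """Deviation value for the window: the shift with the lowest SAD cost."""
    return min(range(max_shift + 1), key=lambda s: sad(y0, x0, s, win))

print(best_shift(4, 10, win=7))  # recovers the true shift of 3
```

A real implementation would search two-dimensionally with sub-pixel refinement and handle occlusions; the point here is only that a window must be large enough for its contents to be unique along the search direction, which is why the sparse pattern of FIG. 4 needs the larger window 401.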
Depth image calculation
By using the various modes of the structured light projection module 104 in combination with the acquisition module 105, depth measurement with high precision can be realized. This is described in detail below with reference to FIGS. 7 to 15.
FIG. 7 is a timing diagram illustrating the control of the structured light projection module 104 and the acquisition module 105 according to an embodiment of the present invention. The abscissa represents time T, and K denotes the acquisition frame rate of the acquisition module, which may also be the depth image output frame rate of the depth camera; each frame therefore takes 1/K. For example, if K is 30 fps, the period of each frame is 1/30 s. On the ordinate, C Expo represents the exposure of the acquisition module, while VCSEL1 and VCSEL2 represent the light source sub-array 302 and the further sub-array 303 of the structured light projection module 104; the corresponding traces show their control timing over time, where a high level means the camera is exposing or the VCSEL sub-array is on, and a low level means off.
According to the timing chart of FIG. 7, the acquisition module 105 exposes once in each frame period to capture an image. The exposure time is generally shorter than the frame period, because within each frame period signals must also be transmitted in addition to the exposure. The exposure is drawn schematically in the middle of each frame period, but it may be placed elsewhere.
According to the timing diagram shown in fig. 7, the sub-array 302 and the further sub-array 303 are turned on synchronously in each frame period (only 6 frame periods are schematically shown in the figure), so that the hybrid speckle pattern shown in fig. 6 will be acquired by the acquisition module 105 every frame. In fact, when an object is present in the target space, the hybrid speckle pattern will be modulated (or otherwise distorted) by the presence of the object, and the collected image will be a distorted hybrid speckle pattern.
In the timing chart of FIG. 7, the sub-array 302 and the further sub-array 303 emit light continuously, but in fact they only need to be on during the exposure time of the acquisition module 105, which allows another control scheme: pulsed emission. FIG. 8 shows a timing control diagram of the structured light projection module and the acquisition module according to another embodiment of the present invention. In each frame period, the sub-arrays 302 and 303 synchronously emit pulses aligned with the exposure; the duration of each sub-array's pulse should be no shorter than the exposure time, and the pulse durations of different sub-arrays may be the same or different. Under the pulse-synchronized illumination of FIG. 8, the acquisition module 105 likewise captures the mixed speckle pattern of FIG. 6 in every frame.
Pulsed emission has clear advantages over continuous emission. At the same light source power, pulsed emission consumes less power; at the same power consumption, pulsed emission permits a higher peak power, lengthening the illumination distance and improving the measurement precision and range. The following embodiments take pulsed emission as the example; it is understood that each case also applies to continuous emission.
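A back-of-envelope comparison of the two schemes, using the 30 fps frame rate from the example above; the source power and exposure duration are assumed illustrative values, not figures from the patent:

```python
def energy_per_frame_mj(power_mw, on_time_ms):
    """Energy spent by the light source in one frame, in millijoules."""
    return power_mw * on_time_ms / 1000.0

frame_period_ms = 1000.0 / 30  # K = 30 fps, so each frame lasts about 33.3 ms
exposure_ms = 5.0              # assumed exposure window within the frame

# Continuous emission stays on for the whole frame period; pulsed emission
# only covers the exposure, so at equal power it spends far less energy.
continuous = energy_per_frame_mj(200.0, frame_period_ms)
pulsed = energy_per_frame_mj(200.0, exposure_ms)
print(continuous, pulsed)
```

Conversely, holding the energy per frame fixed, the pulsed source could run at roughly frame_period_ms / exposure_ms times the power during the exposure, which is the basis of the longer illumination distance mentioned above.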
Fig. 9 shows the depth image acquisition steps of this embodiment, corresponding to the timing shown in fig. 8. It will be understood that each step is executed by the processor 102 in the depth camera, which either sends the corresponding control signals or directly runs a program stored in memory; explicit mention of the processor is omitted in the following description.
In step 901, the sparse sub-array 302 and the dense sub-array 303 in the VCSEL array light source are turned on synchronously, and the structured light projection module 104 projects the speckle pattern 308 and the further speckle pattern 309 into the projection area 307. Synchronous here means that both sub-arrays 302 and 303 are on during the exposure time of each frame of the acquisition module 105. The turn-on mode may be either pulsed or continuous.
In step 902, the hybrid speckle pattern is captured by the acquisition module 105; the captured image is typically a deformed hybrid speckle image modulated by objects in the target space.
In step 903, a matching calculation is performed on the acquired hybrid speckle image using the large window 401 (M×M pixels) in fig. 6. The matching calculation compares the hybrid speckle pattern with the hybrid reference speckle pattern to obtain a coarse deviation value for each pixel, or for a subset of the pixels.
In step 904, the small window 501 (N×N pixels) in fig. 6 is applied to the same hybrid speckle image, with the coarse deviation value used directly as the initial search value of the matching calculation, yielding a fine deviation value for each pixel or a subset of pixels; a high-precision depth value is then calculated from the fine deviation value by triangulation. The depth values of some or all pixels constitute a depth image; for convenience, this term is used throughout the following description.
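The two-stage matching of steps 903 and 904 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: it scores candidate offsets with zero-normalized cross-correlation along a horizontal search, first with a large M×M window over the full search range to get a coarse offset, then with a small N×N window searching only a few pixels around that coarse value.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation of two equally sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_offset(img, ref, x, y, half, search, init=0):
    """Best horizontal offset of the window at (x, y) in `img` against `ref`,
    searched over [init - search, init + search]."""
    win = img[y - half:y + half + 1, x - half:x + half + 1]
    best, best_score = init, -np.inf
    for d in range(init - search, init + search + 1):
        xr = x + d
        if xr - half < 0 or xr + half + 1 > ref.shape[1]:
            continue  # candidate window falls outside the reference image
        score = zncc(win, ref[y - half:y + half + 1, xr - half:xr + half + 1])
        if score > best_score:
            best, best_score = d, score
    return best

def coarse_to_fine(img, ref, x, y, M=15, N=5, coarse_search=20, fine_search=2):
    # Step 903: large-window matching over the full range -> coarse offset.
    coarse = match_offset(img, ref, x, y, M // 2, coarse_search)
    # Step 904: small-window matching seeded with the coarse offset -> fine offset.
    return match_offset(img, ref, x, y, N // 2, fine_search, init=coarse)
```

Seeding the fine search with the coarse offset shrinks the small-window search from 2·coarse_search+1 candidates to 2·fine_search+1, which is the speed-up the text describes.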
In the steps above, the matching in step 903 is fast but its deviation values are imprecise; the further matching calculation in step 904 then yields high-precision deviation values and hence a high-precision depth image. Compared with using the small window alone, this both accelerates the calculation and preserves accuracy. On one hand, the sparse speckle pattern 308 gives the large window a high degree of non-correlation, so the large-window matching is both fast and reliable. On the other hand, because the small-window matching is seeded with the coarse deviation value from the previous step, it converges quickly, achieving the high-precision calculation.
By contrast, a dense speckle pattern of uniform brightness, such as the further speckle pattern 309 alone, is difficult to handle with this large-window/small-window depth calculation scheme. The main reason is that when the speckle grains in the pattern are dense, the non-correlation of the large window drops, so large-window matching becomes inefficient and prone to mismatches. Efficient and highly accurate measurement is possible with the hybrid speckle pattern because the high-brightness speckle grains inside the large window 401 in fig. 6 preserve a high degree of non-correlation over the large window.
In addition, to further reduce the power consumption of the structured light projection module 104, the invention provides a further control timing in which the sub-arrays 302 and 303 are switched on alternately at a fixed frequency. Fig. 10 is a timing control diagram of the structured light projection module and the acquisition module according to another embodiment of the invention. During the first frame period of the acquisition module 105, sub-array 302 is pulsed for the exposure time (or illuminated continuously during the period), so the image acquired in that frame is the sparse structured light pattern 308 shown in fig. 4; during the next frame period, the further sub-array 303 is pulsed for the exposure time, and the dense structured light pattern 309 shown in fig. 5 is acquired.
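The alternating timing of fig. 10 amounts to a simple per-frame schedule. The sketch below is illustrative only; the driver calls `set_subarrays` and `capture` are hypothetical names, not an API from the patent:

```python
def frame_schedule(n_frames):
    """Yield (frame_index, subarrays_on) for the alternating timing of fig. 10:
    sparse sub-array 302 fires on even frames, dense sub-array 303 on odd frames."""
    for i in range(n_frames):
        yield i, ("302",) if i % 2 == 0 else ("303",)

def run(projector, camera, n_frames):
    """Hypothetical driver loop: pulse the scheduled sub-array during each exposure."""
    frames = []
    for i, arrays in frame_schedule(n_frames):
        projector.set_subarrays(arrays)  # hypothetical driver call
        frames.append(camera.capture())  # pulse window aligned with exposure
    return frames
```

Each sub-array is thus on for only half the frames, which is the power saving this timing targets.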
Under the control timing shown in fig. 10, the depth camera acquires depth images by the steps shown in fig. 11, another depth image acquisition procedure according to an embodiment of the invention. The steps also involve switching the light source sub-arrays, but since this is already shown explicitly in the timing diagram, it is omitted here for clarity.
As shown in fig. 11, in step 1101, with sub-array 302 emitting alone, the acquisition module 105 acquires one frame of the deformed sparse structured light pattern 308 modulated by the target object.
In step 1102, a matching calculation is performed between the sparse structured light pattern 308 and the reference sparse speckle image; an M×M sub-window is selected for the matching search, yielding a coarse pixel deviation value from which a coarse depth image can be calculated by triangulation (the triangulation principle is prior art and is not detailed here). Because the speckle pattern is sparse, there are few speckle grains in each window, so the matching accuracy of this step is low, but the calculation is very fast.
In step 1103, with sub-array 303 emitting alone, the acquisition module 105 acquires one frame of the deformed dense structured light pattern 309 modulated by the target object.
In step 1104, a matching calculation is performed between the dense structured light pattern 309 and the reference dense structured light pattern; an N×N sub-window is selected for the matching search, with the coarse deviation value from step 1102 used directly as the initial search value, yielding a fine deviation value for each pixel or a subset of pixels; a high-precision fine depth value is then calculated from the fine deviation value by triangulation.
In the embodiment of fig. 11, the coarse depth image is obtained in step 1102 and the fine depth image in step 1104; compared with the embodiment of fig. 9, at the same acquisition frame rate, the frame rate of the fine depth image is halved.
FIG. 12 is a timing control diagram of the structured light projection module and the acquisition module according to a fourth embodiment of the invention. In contrast to fig. 10, the sparse sub-array 302 is pulsed (or continuously illuminated) during every frame period of the acquisition module, while the dense sub-array 303 is pulsed only in every other frame period. Thus, in the first frame period the acquisition module 105 acquires the sparse structured light pattern 308, and in the next frame period it acquires the mixed structured light pattern composed of the sparse structured light pattern 308 and the dense structured light pattern 309.
Under the control timing shown in fig. 12, the depth camera acquires depth images by the steps shown in fig. 13, a further depth image acquisition procedure according to an embodiment of the invention.
In step 1301, with sub-array 302 emitting alone, the acquisition module 105 acquires one frame of the deformed sparse structured light pattern 308 modulated by the target object.
In step 1302, a matching calculation is performed between the sparse structured light pattern 308 and the reference sparse speckle image; an M×M sub-window is selected for the matching search, yielding a coarse pixel deviation value from which a coarse depth image can be calculated by triangulation.
In step 1303, with sub-array 302 and the further sub-array 303 emitting synchronously, the acquisition module 105 acquires one frame of the deformed mixed structured light pattern modulated by the target object.
In step 1304, a matching calculation is performed between the mixed structured light pattern and the reference mixed structured light pattern; an N×N sub-window is selected for the matching search, with the coarse deviation value from step 1302 used directly as the initial search value, yielding a fine deviation value for each pixel or a subset of pixels; a high-precision fine depth value is then calculated from the fine deviation value by triangulation.
FIG. 14 is a timing control diagram of the structured light projection module and the acquisition module according to a fifth embodiment of the invention. The dense sub-array 303 is pulsed (or continuously illuminated) during every frame period of the acquisition module, while the sparse sub-array 302 is pulsed only in every other frame period. Thus, in the first frame period the acquisition module 105 acquires a mixed structured light pattern, and in the next frame period it acquires the dense structured light pattern 309.
Under the control timing shown in fig. 14, the depth camera acquires depth images by the steps shown in fig. 15, a fourth depth image acquisition procedure according to an embodiment of the invention.
In step 1501, with sub-arrays 302 and 303 emitting synchronously, the acquisition module 105 acquires one frame of the deformed mixed structured light pattern modulated by the target object.
In step 1502, a matching calculation is performed between the mixed structured light pattern and the reference mixed structured light pattern; an M×M sub-window is selected for the matching search, yielding a coarse pixel deviation value from which a coarse depth image can be calculated by triangulation.
In step 1503, with sub-array 303 emitting alone, the acquisition module 105 acquires one frame of the deformed dense structured light pattern 309 modulated by the target object.
In step 1504, a matching calculation is performed between the dense structured light pattern and the reference dense structured light pattern; an N×N sub-window is selected for the matching search, with the coarse deviation value from step 1502 used directly as the initial search value, yielding a fine deviation value for each pixel or a subset of pixels; a high-precision fine depth value is then calculated from the fine deviation value by triangulation.
In the embodiments of figs. 10 to 15, two adjacent frames are used to calculate the coarse and fine deviation values respectively, and the cycle repeats, so the frame-rate ratio of coarse to fine deviation values is 1:1. If the frame rate is high enough, the proportion of fine frames can be increased: for example, one frame is collected to calculate the coarse deviation value, and the structured light patterns collected in the following two or more frames each use that coarse value as the initial search value of their matching calculation. FIG. 16 is a timing control diagram of a sixth embodiment of the structured light projection module and the acquisition module, in which the frame ratio of coarse to fine deviation values is 1:2.
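The 1:2 ratio of fig. 16 generalizes to one coarse frame followed by k fine frames that all reuse the same coarse value. A sketch of that schedule (frame roles only, no driver code):

```python
def coarse_fine_schedule(n_frames, fine_per_coarse=2):
    """Label each frame 'coarse' or 'fine'; every coarse frame's deviation
    values seed the following `fine_per_coarse` fine frames (fig. 16 uses
    a 1:2 ratio)."""
    cycle = ["coarse"] + ["fine"] * fine_per_coarse
    return [cycle[i % len(cycle)] for i in range(n_frames)]

print(coarse_fine_schedule(6))
# -> ['coarse', 'fine', 'fine', 'coarse', 'fine', 'fine']
```

Raising `fine_per_coarse` trades freshness of the coarse seed for a higher fine-depth frame rate.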
The above embodiments each have their own advantages and suit different applications. To acquire a depth image with both high precision and high frame rate, the embodiments of figs. 8 and 9 apply. When the frame-rate requirement is lower, the embodiments of figs. 10 to 15 can be adopted to achieve low power consumption. Among these, when the brightness difference between the sparse and dense sub-arrays is too large, so that during matching on the mixed structured light pattern the dense part of the pattern is swamped by the brightness of the sparse part, the embodiments of figs. 10 and 11 are appropriate; when the difference is small, the embodiments of figs. 12 to 15 may be used. In practice, the embodiments can be evaluated against each application to find the most suitable scheme.
Note that, in the embodiments of figs. 7 to 15, the calculation of the coarse deviation value can be accelerated by appropriately lowering the resolution when computing the coarse deviation value or coarse depth image. In addition, because the selected window is large, the coarse deviation value cannot be calculated at the edge of region 307. Therefore, when the coarse deviation value is later used in calculating the fine deviation value and some pixels have no value, the missing coarse values can be obtained by interpolation; various interpolation methods apply, such as median, mean, or spline interpolation.
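Filling missing coarse values can be sketched with a simple neighborhood mean over valid pixels (median or spline interpolation would follow the same pattern); this is an illustration, not the patent's method:

```python
import numpy as np

def fill_missing(coarse, radius=2):
    """Fill NaN entries of a coarse deviation map with the mean of valid
    neighbors within `radius`; pixels with no valid neighbor remain NaN."""
    out = coarse.copy()
    h, w = coarse.shape
    for y, x in zip(*np.where(np.isnan(coarse))):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        patch = coarse[y0:y1, x0:x1]
        valid = patch[~np.isnan(patch)]
        if valid.size:
            out[y, x] = valid.mean()
    return out
```

Replacing `valid.mean()` with `np.median(valid)` gives the median variant mentioned above; spline interpolation would instead fit the valid samples globally.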
A reference structured light pattern, as used above, is the structured light pattern that the structured light projection module 104 projects and the acquisition module 105 collects, typically from a plane at a known reference distance. The reference sparse, reference dense, and reference mixed structured light patterns are all obtained in this way: the acquisition module 105 captures them while the structured light projection module 104 projects the sparse, dense, and mixed structured light patterns respectively. These reference patterns are generally stored in the memory of the depth camera and recalled by the processor during calculation.
In the above embodiments, the VCSEL array light source contains two sub-arrays by way of example; in other embodiments the number of sub-arrays may be three or more, with a corresponding number of DOE sub-units, giving the structured light projection module more operating modes. All such variants can be extended from the two-sub-array case and therefore fall within the scope of the invention.
The foregoing is a detailed description of the invention with reference to specific preferred embodiments, and the invention is not to be considered limited to these details. For those skilled in the art, equivalent substitutions or obvious modifications made without departing from the spirit of the invention are all considered to fall within its scope of protection.

Claims (12)

1. A depth camera based on a VCSEL array light source, comprising:
a structured light projection module, comprising: a VCSEL array light source having at least two groups of VCSEL sub-arrays, and a diffractive optical element having at least two diffractive optical element sub-units, the diffractive optical element sub-units corresponding one-to-one to the VCSEL sub-arrays, the at least two diffractive optical element sub-units replicating the beams of their corresponding VCSEL sub-arrays by different multiples and then emitting a structured light pattern outwards; the structured light pattern comprises one or a combination of a sparse structured light pattern and a dense structured light pattern of different brightness;
the acquisition module is used for acquiring the structured light pattern reflected by the target;
a processor for computing a depth image from the structured light pattern;
the processor performs a matching calculation with a first window on the structured light pattern comprising a sparse structured light pattern to obtain a coarse deviation value and/or a coarse depth image; and performs a matching calculation with a second window on the structured light pattern comprising a dense structured light pattern to obtain a fine deviation value and/or a fine depth image.
2. The VCSEL array light source based depth camera of claim 1, wherein the structured light pattern comprises a mixed structured light pattern composed of a sparse structured light pattern and a dense structured light pattern.
3. The VCSEL array light source based depth camera of claim 2,
the processor performs a matching calculation on the mixed structured light pattern with a first window to obtain a coarse deviation value and/or a coarse depth image;
and the processor performs a matching calculation on the mixed structured light pattern with a second window to obtain a fine deviation value and/or a fine depth image.
4. The VCSEL array light source based depth camera of claim 1, wherein the structured light pattern comprises a sparse structured light pattern and a dense structured light pattern.
5. The VCSEL array light source based depth camera of claim 4,
the processor performs matching calculation according to the sparse structured light pattern and by using a first window to obtain a rough deviation value and/or a rough depth image;
and the processor performs matching calculation by using the dense structured light pattern and a second window to obtain a fine deviation value and/or a fine depth image.
6. The VCSEL array light source based depth camera of claim 1, wherein the structured light pattern comprises a sparse structured light pattern and a mixed structured light pattern consisting of a sparse structured light pattern and a dense structured light pattern.
7. The VCSEL array light source based depth camera of claim 6,
the processor performs matching calculation by using the sparse structured light pattern and a first window to obtain a rough deviation value and/or a rough depth image;
and the processor then performs a matching calculation on the mixed structured light pattern with a second window to obtain a fine deviation value and/or a fine depth image.
8. The VCSEL array light source based depth camera of claim 1, wherein the structured light pattern comprises a mixed structured light pattern, composed of a sparse structured light pattern and a dense structured light pattern, and a dense structured light pattern.
9. The VCSEL array light source based depth camera of claim 8,
the processor performs a matching calculation on the mixed structured light pattern with a first window to obtain a coarse deviation value and/or a coarse depth image;
and the processor then performs a matching calculation on the dense structured light pattern with a second window to obtain a fine deviation value and/or a fine depth image.
10. The VCSEL array light source based depth camera of claim 1, wherein a brightness of pattern units in the sparse structured light pattern is greater than a brightness of pattern units in the dense structured light pattern.
11. The VCSEL array light source based depth camera of any of claims 1, 3, 5, 7 or 9, wherein the first window is larger than the second window.
12. The VCSEL array light source based depth camera of any of claims 1, 3, 5, 7 or 9, wherein the fine deviation value and the fine depth image are obtained by performing the second-window matching calculation using the coarse deviation value or the coarse depth image as the initial search value.
CN201710359556.3A 2017-05-19 2017-05-19 Depth camera based on VCSEL array light source Active CN106997603B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710359556.3A CN106997603B (en) 2017-05-19 2017-05-19 Depth camera based on VCSEL array light source

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710359556.3A CN106997603B (en) 2017-05-19 2017-05-19 Depth camera based on VCSEL array light source
PCT/CN2018/071987 WO2018209988A1 (en) 2017-05-19 2018-01-09 Depth camera based on vcsel array light source

Publications (2)

Publication Number Publication Date
CN106997603A CN106997603A (en) 2017-08-01
CN106997603B true CN106997603B (en) 2020-04-17

Family

ID=59434940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710359556.3A Active CN106997603B (en) 2017-05-19 2017-05-19 Depth camera based on VCSEL array light source

Country Status (2)

Country Link
CN (1) CN106997603B (en)
WO (1) WO2018209988A1 (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106997603B (en) * 2017-05-19 2020-04-17 深圳奥比中光科技有限公司 Depth camera based on VCSEL array light source
CN109756725A (en) * 2017-08-25 2019-05-14 华为技术有限公司 Structured light projection device, three-dimensional camera mould group and terminal device
CN108107549A (en) * 2017-11-03 2018-06-01 玉晶光电(厦门)有限公司 Optical lens group
CN108107548A (en) 2017-11-03 2018-06-01 玉晶光电(厦门)有限公司 Optical lens group
WO2019134672A1 (en) * 2018-01-06 2019-07-11 Oppo广东移动通信有限公司 Laser emitter, optoelectronic device, and depth camera
CN108107662A (en) * 2018-01-06 2018-06-01 广东欧珀移动通信有限公司 Laser emitter, optoelectronic device and depth camera
WO2019134671A1 (en) * 2018-01-06 2019-07-11 Oppo广东移动通信有限公司 Laser emitter, optoelectronic device, and depth camera
CN108107661A (en) * 2018-01-06 2018-06-01 广东欧珀移动通信有限公司 Laser emitter, optoelectronic device and depth camera
CN108333858A (en) * 2018-01-23 2018-07-27 广东欧珀移动通信有限公司 Laser emitter, optoelectronic device, depth camera and electronic device
CN108107663A (en) * 2018-01-23 2018-06-01 广东欧珀移动通信有限公司 Laser emitter, optoelectronic device, depth camera and electronic device
CN108490628B (en) * 2018-03-12 2020-01-10 Oppo广东移动通信有限公司 Structured light projector, depth camera and electronic device
CN108646429A (en) * 2018-06-21 2018-10-12 深圳市光鉴科技有限公司 A kind of structured light projection instrument
CN109597211A (en) * 2018-12-25 2019-04-09 深圳奥比中光科技有限公司 A kind of projective module group, depth camera and depth image acquisition method
WO2020177007A1 (en) * 2019-03-01 2020-09-10 Shenzhen Raysees Technology Co., Ltd. Pattern projector based on vertical cavity surface emitting laser (vcsel) array

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2133619A1 (en) * 2008-06-10 2009-12-16 Sick Ag Three-dimensional monitoring and securing of an area
CN104581124A (en) * 2013-10-29 2015-04-29 汤姆逊许可公司 Method and apparatus for generating depth map of a scene
CN104798271A (en) * 2012-11-29 2015-07-22 皇家飞利浦有限公司 Laser device for projecting a structured light pattern onto a scene
CN205002744U (en) * 2014-07-28 2016-01-27 苹果公司 Electro -optical device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9870068B2 (en) * 2010-09-19 2018-01-16 Facebook, Inc. Depth mapping with a head mounted display using stereo cameras and structured light
JP6563022B2 (en) * 2015-01-29 2019-08-21 ヘプタゴン・マイクロ・オプティクス・プライベート・リミテッドHeptagon Micro Optics Pte. Ltd. Apparatus for generating patterned illumination
CN106568396A (en) * 2016-10-26 2017-04-19 深圳奥比中光科技有限公司 Laser projector and depth camera thereof
CN206833079U (en) * 2017-05-09 2018-01-02 深圳奥比中光科技有限公司 Array laser projection arrangement and depth camera
CN106997603B (en) * 2017-05-19 2020-04-17 深圳奥比中光科技有限公司 Depth camera based on VCSEL array light source

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2133619A1 (en) * 2008-06-10 2009-12-16 Sick Ag Three-dimensional monitoring and securing of an area
CN104798271A (en) * 2012-11-29 2015-07-22 皇家飞利浦有限公司 Laser device for projecting a structured light pattern onto a scene
CN104581124A (en) * 2013-10-29 2015-04-29 汤姆逊许可公司 Method and apparatus for generating depth map of a scene
CN205002744U (en) * 2014-07-28 2016-01-27 苹果公司 Electro -optical device

Also Published As

Publication number Publication date
CN106997603A (en) 2017-08-01
WO2018209988A1 (en) 2018-11-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant