CN210803719U - Depth image imaging device, system and terminal - Google Patents
- Publication number
- CN210803719U (application CN201921405778.5U)
- Authority
- CN
- China
- Prior art keywords
- light
- imaging
- filtering
- imaging device
- depth
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Length Measuring Devices By Optical Means (AREA)
- Studio Devices (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
The application provides a depth image imaging device, system and terminal. The imaging device includes: a light source device located at one side of the target scene, for emitting at least a predetermined light; a filtering device located at one side of the target scene, for filtering the reflected light of the target scene to obtain first filtered light and second filtered light, wherein the first filtered light results from the predetermined light being reflected by the target scene and then filtered; and a first imaging device and a second imaging device located on the light exit side of the filtering device, for respectively imaging the first filtered light and the second filtered light of predetermined wavelength bands. The imaging device can improve the imaging effect on the target scene.
Description
Technical Field
The application relates to the field of imaging, in particular to a depth image imaging device, a depth image imaging system and a terminal.
Background
Depth image imaging systems have appeared on the market that capture a depth image of a subject in addition to a conventional color image and combine the depth information with the color image information for scenes such as biometrics, AR games, three-dimensional map construction, and robotics. Typical approaches include Time of Flight (TOF), structured light, and binocular vision. Because TOF offers a high frame rate, low software and hardware cost, and a long ranging distance, it is currently the most widely applied way of acquiring depth information. However, TOF also has some inherent disadvantages, such as low resolution, partial image loss, and the need to de-alias depth measurements; in particular, imaging with a single TOF lens easily leaves blanks or errors in the depth image.
The above information disclosed in this background section is only for enhancement of understanding of the background of the technology described herein and, therefore, certain information may be included in the background that does not form the prior art that is already known in this country to a person of ordinary skill in the art.
SUMMARY OF THE UTILITY MODEL
The application mainly aims to provide a depth image imaging device, system and terminal, so as to solve the technical problem that, when a single TOF lens images a complex scene with many reflecting surfaces, part of the pixel measurements are easily lost or carry large errors.
In order to achieve the above object, according to one aspect of the present application, there is provided a depth image imaging apparatus including: a light source device located at one side of the target scene, for emitting at least a predetermined light; a filtering device located at one side of the target scene, for filtering the reflected light of the target scene to obtain first filtered light and second filtered light, wherein the first filtered light results from the predetermined light being reflected by the target scene and then filtered; and a first imaging device and a second imaging device located on the light exit side of the filtering device, for respectively imaging the first filtered light and the second filtered light of predetermined wavelength bands.
Optionally, the apparatus further includes: a polarizing device located at one side of the filtering device, for giving the predetermined light entering the first imaging device and the second imaging device mutually orthogonal polarization directions, so as to obtain corresponding polarized light.
Optionally, the apparatus further includes: a polarization analyzing device located between the polarizing device and the imaging devices, for making the corresponding polarized light enter the first imaging device and the second imaging device respectively.
Optionally, the first filtered light incident to the first imaging device and the predetermined light incident to the second imaging device have different modulation parameters.
Optionally, the first imaging device and the second imaging device are spaced apart on the light exit side of the filtering device.
Optionally, the predetermined light is modulated infrared light.
Optionally, the light corresponding to the second filtered light is visible light.
Optionally, the filter device is a fabry-perot interference cavity with an adjustable cavity thickness.
In order to achieve the above object, according to an aspect of the present application, there is provided a depth image imaging system including the depth image imaging apparatus of any one of the above.
In order to achieve the above object, according to one aspect of the present application, there is provided a terminal including the depth image imaging system described above.
By applying the technical solution of the application, in the depth image imaging device, the light source device is located at one side of the target scene and emits at least a predetermined light; the filtering device is located at one side of the target scene and filters the reflected light of the target scene to obtain first filtered light and second filtered light, the first filtered light resulting from the predetermined light being reflected by the target scene and then filtered; and the first imaging device and the second imaging device are located on the light exit side of the filtering device and respectively image the first filtered light and the second filtered light of predetermined wavelength bands. The device can acquire imaging information from multiple angles, avoiding the image blanks or errors caused by a single TOF lens and improving the imaging effect, thereby solving the technical problem in the related art that, when a single TOF lens images a complex scene with many reflecting surfaces, part of the pixel measurements are easily lost or carry large errors.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application. In the drawings:
fig. 1 is a schematic structural diagram of a depth image imaging apparatus according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a depth image imaging system according to an embodiment of the present application;
FIG. 3(a) is a schematic diagram of a TOF lens modulating light in a depth image imaging system according to an embodiment of the present application;
FIG. 3(b) is a schematic diagram of another TOF lens modulating light in a depth image imaging system according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a terminal including a depth image imaging system according to an embodiment of the application.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, such that the embodiments of the application described herein can be practiced in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It will be understood that when an element such as a layer, film, region, or substrate is referred to as being "on" another element, it can be directly on the other element or intervening elements may also be present. Also, in the specification and claims, when an element is described as being "connected" to another element, the element may be "directly connected" to the other element or "connected" to the other element through a third element.
In an embodiment of the present application, there is provided a depth image imaging apparatus, and fig. 1 is a schematic structural diagram of a depth image imaging apparatus according to an embodiment of the present application, and as shown in fig. 1, the depth image imaging apparatus includes:
a light source device 20 located at one side of the target scene 70 for emitting at least predetermined light;
a filtering device 30, located at one side of the target scene 70, for filtering the reflected light of the target scene 70 to obtain first filtered light and second filtered light, where the first filtered light results from the predetermined light being reflected by the target scene 70 and then filtered;
a first imaging device 40 and a second imaging device 50, located on the light exit side of the filtering device 30, for respectively imaging the first filtered light and the second filtered light of predetermined wavelength bands.
As an alternative embodiment, the light source device can emit multiple types of light, for example different modulated lights including, but not limited to, infrared light. It should be noted that when the light source device emits the predetermined light toward the target scene, the frequency, amplitude, and other related parameters of the predetermined light can be adjusted flexibly; in a specific implementation, the light source device can emit the required predetermined light according to the application scene. The apparatus is not limited to this configuration and may include a plurality of light source devices that emit different predetermined lights, i.e., predetermined lights with different frequencies and different amplitudes.
As an alternative embodiment, the light source device emits at least two predetermined lights whose emission periods are spaced from each other. Imaging is then performed with one or at least two of the predetermined lights. In a specific implementation, when two predetermined lights are used for imaging, their results can be combined to reduce errors and fill in missing image regions; when two predetermined lights are used for depth ranging, the pair of measurements can be used to remove the aliased component of kλ/2 (k a positive integer) that overlaps the depth measurement.
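The kλ/2 de-aliasing step can be illustrated with a short numeric sketch. This is a minimal model, not the patent's implementation: it assumes phase-based TOF, where a single phase reading θ fixes the depth only up to an unknown multiple of λ/2, and two modulation frequencies are compared to pick the consistent candidate (function names are hypothetical):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def candidate_depths(theta, mod_freq, k_max=10):
    """All depths consistent with one phase reading:
    d = theta * lam / (4 * pi) + k * lam / 2, for k = 0, 1, 2, ..."""
    lam = C / mod_freq
    base = theta * lam / (4 * math.pi)
    return [base + k * lam / 2 for k in range(k_max)]

def dealias(theta1, freq1, theta2, freq2):
    """Pick the pair of candidates that agrees best across the two
    modulation frequencies and return their mean."""
    best = min(
        (abs(a - b), (a + b) / 2)
        for a in candidate_depths(theta1, freq1)
        for b in candidate_depths(theta2, freq2)
    )
    return best[1]

# Simulate a target at 4.2 m measured with 60 MHz and 75 MHz modulation;
# each single frequency alone is ambiguous (ranges of ~2.5 m and ~2.0 m).
d_true = 4.2
theta1 = (4 * math.pi * d_true * 60e6 / C) % (2 * math.pi)
theta2 = (4 * math.pi * d_true * 75e6 / C) % (2 * math.pi)
print(round(dealias(theta1, 60e6, theta2, 75e6), 3))  # recovers ~4.2
```

The two candidate lists only agree near the true depth, which is why a second modulation frequency removes the ambiguity.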
As an alternative embodiment, the filtering device filters the reflected light of the target scene based on a predetermined wavelength band; the predetermined wavelength band may be set manually or take a default value of the imaging apparatus. For example, the predetermined wavelength band of visible light may be selected within 380 nm to 780 nm, and that of infrared light within 780 nm to 3000 nm; the specific implementation is not limited to these values.
The first filtered light corresponds to predetermined light, and may be infrared light or the like; the light corresponding to the second filtered light is visible light, which may be natural visible light or visible light emitted by the light source device. In specific implementation, taking visible light as an example, when the reflected light of the target scene is visible light, the visible light is filtered by the filtering device to obtain second filtered light; taking infrared light as an example, when the reflected light of the target scene is infrared light, the infrared light is filtered by the filtering device to obtain first filtered light.
Moreover, filtering according to the predetermined wavelength band can be performed on both visible light and infrared light, so that the light required for imaging is obtained. For the imaging devices, the filtering removes useless or interfering light, effectively retains the light required for imaging, and improves the imaging effect and the accuracy of subsequent depth calculation.
By applying the technical solution of the application, in the depth image imaging device, the light source device is located at one side of the target scene and emits at least a predetermined light; the filtering device is located at one side of the target scene and filters the reflected light of the target scene to obtain first filtered light and second filtered light, the first filtered light resulting from the predetermined light being reflected by the target scene and then filtered; and the first imaging device and the second imaging device are located on the light exit side of the filtering device and respectively image the first filtered light and the second filtered light of predetermined wavelength bands. The device can acquire imaging information from multiple angles, avoiding the image blanks or errors caused by a single TOF lens and improving the imaging effect, thereby solving the technical problem in the prior art that, when a single TOF lens images a complex scene with many reflecting surfaces, part of the pixel measurements are easily lost or carry large errors.
Optionally, the apparatus further includes: a polarizing device located at one side of the filtering device, for giving the predetermined light entering the first imaging device and the second imaging device mutually orthogonal polarization directions, so as to obtain corresponding polarized light.
As an optional embodiment, the polarization device may adjust different predetermined lights to corresponding polarized lights, and the polarized lights obtained after adjustment may not only reduce mutual interference of depth measurements of two imaging devices when imaging time overlaps, but also avoid other influences such as ambient light on an imaging result, thereby effectively ensuring a high-quality imaging effect.
Optionally, the apparatus further includes: a polarization analyzing device located between the polarizing device and the imaging devices, for making the corresponding polarized light enter the first imaging device and the second imaging device respectively.
As an optional embodiment, the polarization analyzing device may adjust a polarization direction of the polarized light, eliminate multipath reflection, ensure that the polarized light corresponding to the predetermined light is accurately incident on the imaging device, and further improve an imaging effect.
Optionally, the first filtered light incident to the first imaging device and the predetermined light incident to the second imaging device have different modulation parameters.
As an alternative embodiment, the two imaging devices are made to image with different modulation parameters, which are subsequently applied to depth measurement so that the two measurement results can be used to de-alias the detection results.
Optionally, the first imaging device and the second imaging device are spaced apart on the light exit side of the filtering device.
It should be noted that, because the first imaging device and the second imaging device are spaced apart on the light exit side of the filtering device, depth ranging can be performed on the target scene from different positions or angles, reducing measurement errors. This effectively avoids the problem in the related art that, when a single imaging device images a complex scene with many reflecting surfaces, part of the pixel measurements are easily lost or carry large errors.
Optionally, the predetermined light is modulated infrared light.
As an alternative embodiment, the predetermined light may be modulated infrared light, where different modulated infrared lights differ in frequency, amplitude, and other characteristics. To make the imaging apparatus suitable for target scenes of different depths, the modulated infrared light can be set according to the requirements of the target scene. It should be noted that, in a specific implementation, the modulated infrared light with the higher frequency has a smaller amplitude than the modulated infrared light with the lower frequency.
Optionally, the light corresponding to the second filtered light is visible light.
As an alternative embodiment, the light corresponding to the second filtered light is visible light, and the visible light may be ambient light or natural light, that is, the second filtered light may be obtained by performing filtering processing on the visible light reflected in the target scene.
Optionally, the filtering device is a Fabry-Perot interference cavity with an adjustable cavity thickness, where an adjustable cavity thickness means that the height of the cavity can be adjusted.
As an alternative embodiment, adjusting the cavity thickness of the filtering device adjusts its predetermined wavelength band, i.e., the transmitted wavelength range, so that filtered imaging can be performed in the visible and infrared ranges respectively.
As an alternative embodiment, the filtering device preferably uses a Fabry-Perot (FP) interference cavity, and changes the cavity thickness participating in resonance by means of piezoelectric ceramics, Micro-Electro-Mechanical Systems (MEMS), angle adjustment, or liquid crystal filling, so as to shift the transmission wavelength peak. For example, the transmission peak wavelength of the FP interference cavity at normal incidence is 2nd/k, where n is the cavity refractive index, d is the cavity thickness, and k is a positive integer; the full width at half maximum of the transmission peak can also be adjusted through n, d, k and the reflectivity of the cavity surfaces.
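The relation λ = 2nd/k between cavity thickness and transmission peaks can be illustrated with a short sketch. This models only the ideal FP cavity at normal incidence, and the function names are hypothetical:

```python
def fp_peaks(n, d_nm, lo_nm=380.0, hi_nm=3000.0):
    """Transmission peak wavelengths lam = 2*n*d/k of an ideal FP cavity at
    normal incidence, restricted to the band [lo_nm, hi_nm] in nanometers."""
    peaks = []
    k = 1
    while True:
        lam = 2.0 * n * d_nm / k
        if lam < lo_nm:
            break  # higher orders fall below the band of interest
        if lam <= hi_nm:
            peaks.append(lam)
        k += 1
    return peaks

def thickness_for_peak(n, target_nm, k=1):
    """Cavity thickness that puts the order-k peak at target_nm: d = k*lam/(2n)."""
    return k * target_nm / (2.0 * n)

# A 1000 nm air cavity (n = 1) has peaks at 2000, 1000, 666.7, 500 and 400 nm
# inside the 380-3000 nm band; changing d shifts all of these peaks together.
print(fp_peaks(1.0, 1000.0))
```

Sweeping d (for instance via the piezoelectric or MEMS actuation mentioned above) moves the peak between the visible and infrared bands.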
The filtering device includes a liquid crystal layer or a variable-thickness layer whose surfaces are highly reflective. Preferably, a voltage is used to adjust the angle or thickness of the filtering device, or to change the refractive index of the filter cavity material, so that light of different wavelength bands can be filtered effectively.
In an embodiment of the present application, there is provided a depth image imaging system including the depth image imaging apparatus of any one of the above. Fig. 2 is a schematic structural diagram of a depth image imaging system according to an embodiment of the present application. As shown in fig. 2, the imaging system 10 includes: a light source device 20, located at one side of the target scene 70, for emitting at least a predetermined light; a filtering device 30, located at one side of the target scene 70, for filtering the reflected light of the target scene 70 to obtain first filtered light and second filtered light, where the first filtered light results from the predetermined light being reflected by the target scene 70 and then filtered; and a first imaging device 40 and a second imaging device 50, located on the light exit side of the filtering device 30, for respectively imaging the first filtered light and the second filtered light of predetermined wavelength bands. The system may further include a processing device 60 for determining the predetermined light from the second filtered light and calculating the depth of the target scene 70 from the first filtered light.
As an alternative embodiment, the processing device includes a processor and a memory, the processing unit is stored in the memory as a program unit, and the processor executes the program unit stored in the memory to implement the corresponding functions.
The processor comprises a kernel, and the kernel calls a corresponding program unit from the memory. The kernel can be set to be one or more, and the depth of the measured target scene is calculated by adjusting the kernel parameters.
The memory may include volatile memory, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM), in a computer-readable medium, and the memory includes at least one memory chip. The memory may store computation data relating to the depth of the target scene as well as the corresponding algorithms, programs, and the like.
In the depth image imaging system of the present application, the light source device is located at one side of the target scene and emits at least a predetermined light; the filtering device is located at one side of the target scene and filters the reflected light of the target scene to obtain first filtered light and second filtered light, the first filtered light resulting from the predetermined light being reflected by the target scene and then filtered; the first imaging device and the second imaging device are located on the light exit side of the filtering device and respectively image the first filtered light and the second filtered light of predetermined wavelength bands; and the processing device determines the predetermined light from the second filtered light and calculates the depth of the target scene from the first filtered light. Visible-light imaging and infrared imaging can be switched through the filtering device: the depth is first estimated under visible-light imaging, then suitable modulated light is selected according to the depth estimate, and the scene depth is calculated under infrared imaging.
It should be noted that, in a specific implementation, the filtering device is first adjusted to transmit visible light and the depth is estimated by triangular parallax; then infrared light with two suitable modulation frequencies is set according to the estimated depth, the filtering device is adjusted to transmit infrared light, the two imaging devices perform depth measurement at the two modulation frequencies respectively, and the measurement results are de-aliased, mixed, and otherwise processed to obtain the depth of the target scene. In this way, depth information corresponding to imaging information from multiple angles reduces the image blanks or errors caused by a single TOF depth imaging lens.
Furthermore, the imaging system of the application can be applied to target scenes with multiple reflecting surfaces, obtaining depth information corresponding to imaging information from multiple angles. This avoids the image blanks or errors caused by a single TOF lens and improves the accuracy and reliability of the depth measurement and the imaging effect, thereby solving the technical problem in the related art that, when a single TOF lens performs depth ranging on a complex scene with many reflecting surfaces, part of the pixel measurements are easily lost or carry large errors.
Optionally, the processing device further comprises: the acquisition unit is used for acquiring a first image and a second image obtained by imaging the second filtered light by the first imaging device and the second imaging device; the first processing unit is used for determining the estimated depth according to the coordinate position difference of the first image and the second image; and the second processing unit is used for determining the predetermined light according to the estimated depth.
As an optional embodiment, for the visible light reflected from the target scene, before the visible light enters the imaging device, the visible light is filtered by the filtering device, and then a second filtered light is obtained, and then the second filtered light enters the first imaging device and the second imaging device, and further, the first imaging device and the second imaging device respectively image the second filtered light reflected from the target scene. The first imaging device images the second filtered light to obtain a first image, and the second imaging device images the second filtered light to obtain a second image.
As an alternative embodiment, the first image and the second image include a target object in the same target scene, and further a difference between coordinate positions of the target object in the first image and the second image may be obtained, and then an estimated depth may be obtained by using a triangulation parallax method, where the estimated depth is depth measurement data obtained based on visible light.
As an alternative embodiment, determining the estimated depth according to the coordinate position difference between the first image and the second image further includes: calculating the estimated depth by the trigonometric formula z = f(1 + b/Δl), where f is the focal length of the lens, b is the baseline length, Δl is the coordinate position difference of the pixel between the first image and the second image, and z is the depth.
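As a numeric illustration of the formula stated above (using the patent's form z = f(1 + b/Δl) as given; the input values are made up for illustration):

```python
def estimated_depth(f, b, delta_l):
    """Estimated depth from the pixel coordinate difference between the two
    visible-light images, per the formula stated above: z = f*(1 + b/delta_l)."""
    if delta_l == 0:
        raise ValueError("zero coordinate difference: feature unmatched or at infinity")
    return f * (1.0 + b / delta_l)

# Illustrative numbers: 4 mm focal length, 50 mm baseline, 0.1 mm pixel shift
# (all in meters), giving an estimated depth of about 2.004 m.
z = estimated_depth(0.004, 0.05, 0.0001)
```

A larger coordinate difference Δl (a closer target) yields a smaller z, as expected for parallax-based estimation.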
As an alternative embodiment, the predetermined light is determined according to the estimated depth, wherein the predetermined light is modulated infrared light. In a specific implementation, at least two predetermined lights are determined according to the estimated depth, and the at least two predetermined lights have different frequencies and/or amplitudes. Preferably, the predetermined light is two different modulated infrared lights.
Optionally, the processing device further comprises: a first calculation unit configured to calculate a first depth of the first imaging device based on the first filtered light incident on the first imaging device; a second calculation unit for calculating a second depth of the second imaging device based on the first filtered light incident to the second imaging device; and the third processing unit is used for obtaining the depth of the target scene according to the first depth, the second depth and the weight corresponding to the first depth and the second depth.
As an optional embodiment, for infrared light reflected from a target scene, the infrared light is filtered by a filtering device before being incident on an imaging device, and then first filtered light is obtained and then is incident on the first imaging device and the second imaging device, and further, the first imaging device and the second imaging device respectively image the first filtered light reflected from the target scene. The first imaging device images the first filtered light to obtain a third image, and the second imaging device images the first filtered light to obtain a fourth image.
Calculating a depth with each of the two imaging lenses and combining the results into one depth map fills in missing data and improves the overall depth detection accuracy.
Optionally, the third processing unit further comprises: a determining module, configured to determine weights corresponding to the first depth and the second depth, where the weights are determined by at least one of: a modulation parameter of the predetermined light generated by the light source device, wherein the modulation parameter includes at least one of a modulation amplitude and a modulation frequency; imaging parameters of the first imaging device and the second imaging device, wherein the imaging parameters include at least one of optical parameters and imaging quality.
As an alternative embodiment, there are various ways to determine the weights, including: the modulation parameters of the predetermined light generated by the light source device and the imaging parameters of the first imaging device and the second imaging device are not limited to the above-described modes in the implementation process.
First, the depth and its error are estimated from the difference between the two lenses' visible-light images; the infrared light is then modulated specifically for this estimate, outputting a modulation amplitude and frequency suited to the estimated depth, which reduces both the error and the energy consumption.
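A minimal sketch of the weighted combination of the two depths described above (the fallback rule and the use of 0.0 as an "invalid pixel" marker are illustrative assumptions, not the patent's specification):

```python
def fuse_depths(d1, d2, w1, w2, invalid=0.0):
    """Combine two per-pixel depth lists: fall back to the valid reading when
    the other is missing, otherwise take the weighted mean."""
    fused = []
    for a, b in zip(d1, d2):
        if a == invalid and b == invalid:
            fused.append(invalid)          # neither lens saw this pixel
        elif a == invalid:
            fused.append(b)                # fill the blank from the second lens
        elif b == invalid:
            fused.append(a)                # fill the blank from the first lens
        else:
            fused.append((w1 * a + w2 * b) / (w1 + w2))
    return fused

# First lens missed pixel 1, second lens missed pixel 2; weights favor lens 1.
print(fuse_depths([1.0, 0.0, 2.0], [1.2, 3.0, 0.0], w1=2.0, w2=1.0))
```

The weights w1 and w2 would come from the modulation and imaging parameters discussed above; here they are fixed constants for illustration.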
An alternative embodiment of the present application is described below.
Note that, in the following alternative embodiment, the TOF lens corresponds to an imaging device, and the filter corresponds to a filtering device.
When a single TOF lens is used for imaging or depth ranging of a complex scene with many reflecting surfaces, part of the pixel measurements are easily lost or carry large errors, which is one reason the resolution of a TOF lens is inferior to that of other depth measurement methods.
Based on this, to suit scenes with higher accuracy requirements, two TOF lenses can perform depth ranging simultaneously from different positions, and the measured depth information can be combined to reduce their mutual measurement errors. In addition, the two lenses can first be used to estimate the depth in the visible band using ambient light under natural conditions.
Further, in a preferred embodiment of the present application, the filter preferably uses a Fabry-Perot (FP) interference cavity, and changes the cavity thickness participating in resonance by means of piezoelectric ceramics, MEMS, angle adjustment, or liquid crystal filling, so as to shift the transmission wavelength peak. For example, the transmission peak wavelength of the FP interference cavity at normal incidence is 2nd/k, where n is the cavity refractive index, d is the cavity thickness, and k is a positive integer; the full width at half maximum of the transmission peak can also be adjusted through n, d, k and the reflectivity of the cavity surfaces. By adjusting d, the transmission peak can be placed in the visible range, for example at 587 nm, 486 nm, or 656 nm.
After the reflected light is processed by the filter, the scene is imaged in visible light by the two TOF lenses, which obtain a first image and a second image respectively. From the difference in coordinate position of the same feature in the two images, a preliminary depth map can be generated by triangulation, namely z = f(1 + b/Δl), where f is the focal length of the lens, b is the baseline length, Δl is the positional difference of the pixel between the first and second images, and z is the depth.
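A minimal sketch of this preliminary estimate, using the triangulation formula exactly as stated in the text (z = f·(1 + b/Δl); note the classical pinhole-stereo relation is usually written z = f·b/Δl, but the formula is reproduced here as given):

```python
# Preliminary depth from the disparity of matched features in two
# visible-light images, per the formula stated in the text.

def preliminary_depth(f, b, dl):
    """z = f*(1 + b/dl): f focal length, b baseline, dl pixel disparity."""
    if dl == 0:
        raise ValueError("zero disparity: feature at infinity or mismatch")
    return f * (1.0 + b / dl)

def preliminary_depth_map(f, b, disparities):
    """Map each matched feature's disparity to a depth; None for no match."""
    return [preliminary_depth(f, b, dl) if dl else None for dl in disparities]
```

Such a per-feature map is coarse, but it only needs to bound the scene depth well enough to choose the infrared modulation parameters in the next step.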
Since the system switches between infrared and visible-light imaging, a filter with a tunable transmission peak is required. In a preferred embodiment, the angle or thickness of the filter is adjusted by an applied voltage, or the refractive index of the cavity material (e.g. liquid crystal) is changed.
The estimated depth z is then used to determine a modulation frequency of the infrared light suitable for TOF in the current scene; once the frequency is determined, the modulation amplitude can be further adjusted. In phase-based TOF detection z = θλ/(4π), where θ is the phase difference between the reflected light and the modulated light (at most 2π) and λ is the modulation wavelength, so the maximum detection distance should be kept within λ/2. After the estimated depth is obtained, the modulation frequency of the light source can be changed to ensure this maximum detection range is not exceeded. A modulation amplitude providing a sufficient signal-to-noise ratio in the current scene can also be selected on the basis of the chosen frequency, to reduce power consumption. After the estimated depth map is obtained, the user can be given a preliminary prompt about problems in the scene and corresponding suggestions, such as that the scene is too close to or too far from the lens, that too many objects in the scene may increase errors, or that a too-monotonous scene lacks depth variation, advising the user to change the shooting direction. This process may continue, with only visible-light detection performed and no infrared emission, until the user has moved to a scene suitable for applications such as AR.
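The frequency-selection step above can be sketched as follows (a hedged illustration with assumed helper names and an assumed safety margin, not the patent's implementation): since the unambiguous range of phase TOF is λ/2 and λ = c/f_mod, the estimated scene depth bounds the usable modulation frequency from above.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def max_modulation_frequency(z_max_m, margin=1.2):
    """Highest f_mod (Hz) whose unambiguous range lambda/2 still covers
    the estimated maximum depth, with some headroom (margin)."""
    required_range = z_max_m * margin
    wavelength = 2.0 * required_range  # enforce lambda/2 >= required_range
    return C / wavelength

def depth_from_phase(theta, f_mod_hz):
    """Phase-TOF depth z = theta * lambda / (4*pi), theta in [0, 2*pi]."""
    wavelength = C / f_mod_hz
    return theta * wavelength / (4.0 * math.pi)

# Example: a scene estimated at up to 5 m allows roughly 25 MHz modulation.
f = max_modulation_frequency(5.0)
print(round(f / 1e6, 1))  # 25.0
```

A higher frequency gives finer phase resolution, so choosing the highest frequency whose unambiguous range still covers the estimated depth is the natural trade-off; the amplitude can then be lowered to the minimum that keeps an adequate signal-to-noise ratio.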
Thereafter, infrared light with the determined modulation frequency and amplitude is emitted toward the scene, the filter is tuned as above until its transmission peak lies in the infrared band (e.g. 940 nm), and TOF ranging begins. Because the two TOF lenses are used simultaneously, at least two modulated lights, made mutually distinguishable by temporal separation or orthogonal polarization, can also be output; their modulation frequencies and amplitudes may differ substantially, and each TOF lens can use one of them for depth ranging. Fig. 3(a) is a schematic diagram of the modulated light of one TOF lens in a depth image imaging system according to an embodiment of the present disclosure, and Fig. 3(b) is a schematic diagram of the modulated light of the other TOF lens. As shown in Figs. 3(a) and 3(b), the two depth ranging results can then be combined to compensate each other's errors and image gaps, and can also be used to remove the ambiguous measurement results offset by an additional kλ/2 (k a positive integer) in depth ranging.
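Removing the kλ/2 ambiguity with two frequencies can be sketched like this (an illustrative reconstruction under assumed details, not the patent's algorithm): each lens reports a wrapped depth z_i that may be short of the true depth by k_i·λ_i/2, and the true depth is the candidate on which both measurements agree.

```python
C = 299_792_458.0  # speed of light, m/s

def unwrap_two_frequencies(z1, f1, z2, f2, z_search_max, tol=0.05):
    """Find the depth consistent with two wrapped phase-TOF measurements
    z1 (at frequency f1) and z2 (at f2), searching up to z_search_max."""
    r1 = C / f1 / 2.0  # unambiguous range (lambda/2) of lens 1
    r2 = C / f2 / 2.0
    best, best_err = None, float("inf")
    for k1 in range(int(z_search_max / r1) + 1):
        cand = z1 + k1 * r1
        k2 = round((cand - z2) / r2)  # nearest wrap count for lens 2
        if k2 < 0:
            continue
        err = abs(cand - (z2 + k2 * r2))
        if err < best_err:
            best, best_err = cand, err
    return best if best_err <= tol else None
```

For example, a true depth of 7.3 m wraps to different residues at 30 MHz and 40 MHz, and only the 7.3 m candidate reconciles both, which is exactly how the two lenses' results de-alias each other.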
Finally, the depths detected by the two TOF lenses are weighted and combined into the final depth map. If modulated light of the same frequency and amplitude was imaged, the weights can be equal, and the combination serves only to compensate errors such as image blanks. However, when the two TOF lenses image different frequencies and amplitudes distinguished in time or polarization, their weights should differ: for example, increase the weight of the lens whose modulation frequency better matches the depth estimated during visible-light detection, of the lens with the higher modulation amplitude, or of the lens with the smaller overall imaging error. The depth values resulting from this weighted combination achieve reduced error and higher image resolution. Furthermore, the two TOF lenses should be calibrated and registered in advance so that their image coordinates can be mapped to one another and combined into a single image; the calibration matrix, once determined, can be used directly at each depth measurement.
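A minimal sketch of this final combination step, with assumed conventions (registered per-pixel depth lists, None marking a blank pixel, weights chosen by the rules above):

```python
def combine_depth_maps(depth_a, depth_b, w_a=0.5, w_b=0.5):
    """Weighted combination of two registered depth maps.
    None marks a missing pixel; blanks are filled from the other lens."""
    out = []
    for za, zb in zip(depth_a, depth_b):
        if za is None and zb is None:
            out.append(None)      # blank in both maps: stays blank
        elif za is None:
            out.append(zb)        # fill gap from the other lens
        elif zb is None:
            out.append(za)
        else:
            out.append((w_a * za + w_b * zb) / (w_a + w_b))
    return out

# Example: lens A weighted 0.7 (e.g. its modulation better matched the
# estimated depth); gaps in either map are filled from the other.
merged = combine_depth_maps([1.0, None, 3.0], [1.2, 2.0, None],
                            w_a=0.7, w_b=0.3)
print([round(z, 2) if z is not None else None for z in merged])
# [1.06, 2.0, 3.0]
```

With equal weights this reduces to simple averaging plus gap-filling, matching the equal-frequency case described above.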
An embodiment of the present application provides a terminal including a depth image imaging system, where the depth image imaging system is the one described above. Fig. 4 is a schematic diagram of a terminal including a depth image imaging system according to an embodiment of the application.
The terminal described above may be, but is not limited to, a server, a PC, a PAD, a mobile phone, and the like. In the specific application shown in Fig. 4, the device is a mobile phone: the depth image imaging system 10 is located inside the phone, and Fig. 4 shows only the phone's external structure together with the position of the system 10 as seen from outside.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (10)
1. A depth image imaging apparatus characterized by comprising:
a light source device located at one side of the target scene for emitting at least predetermined light;
the filtering device is positioned on one side of the target scene and used for filtering reflected light of the target scene to obtain first filtered light and second filtered light, wherein the first filtered light is obtained by reflecting and filtering the predetermined light through the target scene;
a first imaging device and a second imaging device positioned on the light exit side of the filtering device and configured to image the first filtered light and the second filtered light of a predetermined band respectively.
2. The apparatus of claim 1, further comprising:
a polarizing device located at one side of the filtering device and configured to give the predetermined light entering the first imaging device and the second imaging device mutually orthogonal polarization directions, so as to obtain corresponding polarized light.
3. The apparatus of claim 2, further comprising:
a polarization analyzing device located between the polarizing device and the imaging devices and configured to cause the corresponding polarized light to enter the first imaging device and the second imaging device respectively.
4. The apparatus of claim 1, wherein the first filtered light incident to the first imaging device and the predetermined light incident to the second imaging device have different modulation parameters.
5. The apparatus of claim 1, wherein the first imaging device and the second imaging device are spaced apart on an exit side of the filtering device.
6. The apparatus of claim 1, wherein the predetermined light is modulated infrared light.
7. The apparatus of claim 1, wherein the light corresponding to the second filtered light is visible light.
8. The apparatus of claim 1, wherein the filtering device is a Fabry-Perot interferometric cavity with adjustable cavity thickness.
9. A depth image imaging system comprising the depth image imaging apparatus according to any one of claims 1 to 8.
10. A terminal comprising a depth image imaging system, wherein the depth image imaging system is the depth image imaging system of claim 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201921405778.5U CN210803719U (en) | 2019-08-27 | 2019-08-27 | Depth image imaging device, system and terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN210803719U true CN210803719U (en) | 2020-06-19 |
Family
ID=71227517
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201921405778.5U Active CN210803719U (en) | 2019-08-27 | 2019-08-27 | Depth image imaging device, system and terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN210803719U (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110441784A (en) * | 2019-08-27 | 2019-11-12 | 浙江舜宇光学有限公司 | Depth image imaging system and method |
CN112842180A (en) * | 2020-12-31 | 2021-05-28 | 深圳市杉川机器人有限公司 | Sweeping robot, distance measurement and obstacle avoidance method and device thereof, and readable storage medium |
WO2022143285A1 (en) * | 2020-12-31 | 2022-07-07 | 深圳市杉川机器人有限公司 | Cleaning robot and distance measurement method therefor, apparatus, and computer-readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230392920A1 (en) | Multiple channel locating | |
JP2022132492A (en) | Method and system for robust and extended illumination waveforms for depth sensing in three-dimensional imaging | |
JP2020115630A (en) | Eye tracking using optical flow | |
CA2805443C (en) | Method and apparatus for imaging | |
CN111121722A (en) | Binocular three-dimensional imaging method combining laser dot matrix and polarization vision | |
AU2016327918A1 (en) | Unmanned aerial vehicle depth image acquisition method, device and unmanned aerial vehicle | |
CN210803719U (en) | Depth image imaging device, system and terminal | |
US20230204724A1 (en) | Reducing interference in an active illumination environment | |
KR20190053799A (en) | Method for processing a raw image of a time-of-flight camera, image processing apparatus and computer program | |
CN103477644A (en) | Method of recording an image and obtaining 3D information from the image, and camera system | |
US11902494B2 (en) | System and method for glint reduction | |
CN209676383U (en) | Depth camera mould group, depth camera, mobile terminal and imaging device | |
CN107483815B (en) | Method and device for shooting moving object | |
US20230199324A1 (en) | Projection unit and photographing apparatus comprising same projection unit, processor, and imaging device | |
US11803982B2 (en) | Image processing device and three-dimensional measuring system | |
WO2019217110A1 (en) | Phase wrapping determination for time-of-flight camera | |
CN110441784A (en) | Depth image imaging system and method | |
US9835444B2 (en) | Shape measuring device using frequency scanning interferometer | |
CN112379389B (en) | Depth information acquisition device and method combining structured light camera and TOF depth camera | |
CN111510700A (en) | Image acquisition device | |
CN109741384A (en) | The more distance detection devices and method of depth camera | |
US20220311985A1 (en) | Image capture device and depth information calculation method thereof | |
CN103697825A (en) | Super-resolution 3D laser measurement system and method | |
US11885880B2 (en) | Unwrapped phases for time-of-flight modulation light | |
TW201807443A (en) | Device and system for capturing 3-D images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||