CN110220494B - Double-aperture ranging system and operation method thereof

Info

Publication number
CN110220494B
CN110220494B (application CN201910292985.2A)
Authority
CN
China
Prior art keywords
frame
sub
color
frames
blurred
Prior art date
Legal status
Active
Application number
CN201910292985.2A
Other languages
Chinese (zh)
Other versions
CN110220494A (en)
Inventor
王国振
陈以雷
Current Assignee
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN201910292985.2A priority Critical patent/CN110220494B/en
Publication of CN110220494A publication Critical patent/CN110220494A/en
Application granted granted Critical
Publication of CN110220494B publication Critical patent/CN110220494B/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00: Measuring distances in line of sight; Optical rangefinders

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Measurement Of Optical Distance (AREA)
  • Image Input (AREA)

Abstract

The invention provides a double-aperture ranging system and an operation method thereof. The ranging system comprises: a first aperture stop; a second aperture stop; a pixel array for receiving light passing through the first aperture stop and the second aperture stop; and a processing unit. The processing unit convolves a first color sub-frame of an image frame with a plurality of first blur kernels to generate a plurality of blurred first sub-frames, convolves a second color sub-frame of the image frame with a plurality of second blur kernels to generate a plurality of blurred second sub-frames, subtracts the blurred first sub-frame and the blurred second sub-frame corresponding to a first relative object distance to obtain a first differential frame, subtracts the blurred first sub-frame and the blurred second sub-frame corresponding to a second relative object distance to obtain a second differential frame, and selects the group of blurred first and second sub-frames corresponding to whichever of the first and second differential frames has the smallest sum of pixel values as a group of best matching sub-frames.

Description

Double-aperture ranging system and operation method thereof
This application is a divisional application of Chinese patent application No. 201510715859.5, filed on October 29, 2015 and entitled "Dual-aperture ranging system".
Technical Field
The present invention relates to an optical system, and more particularly, to a dual-aperture distance measuring system.
Background
Imaging systems have become indispensable in mobile devices. Besides recording images, the signals they collect can be applied to gesture recognition, proximity sensing, distance detection, and the like.
For distance detection, depth can be calculated using, for example, time-of-flight methods, structured light, light field cameras, or coded aperture cameras.
However, each of the above methods has its limitations. The time-of-flight and structured light methods consume more power and are therefore less suitable for mobile devices. Light field cameras reduce image resolution. Coded aperture cameras reduce the signal-to-noise ratio (SNR).
In view of the above, a solution to these problems is needed.
Disclosure of Invention
The present invention provides a dual-aperture ranging system that compares the convolution results of a green light image with a plurality of blur kernels against the convolution results of an infrared light image with a plurality of blur kernels to generate a depth map.
Another objective of the present invention is to provide a dual-aperture ranging system that compares the convolution results of a blue light image with a plurality of blur kernels against the convolution results of an infrared light image with a plurality of blur kernels to generate a depth map.
To achieve the above objectives, the present invention provides a ranging system, which includes a first aperture stop, a second aperture stop, a pixel array and a processing unit. The first aperture stop includes a sheet of infrared light cut filter material and has a first aperture opening. The second aperture stop includes a sheet of opaque material and has a second aperture opening, wherein the second aperture opening is larger than the first aperture opening. The pixel array is composed of infrared pixels and green pixels, or of infrared pixels and blue pixels, and is used for receiving light passing through the first aperture stop and the second aperture stop. The processing unit is configured to: convolve a first color sub-frame of an image frame output by the pixel array with a plurality of first blur kernels measured in advance at different relative object distances to generate a plurality of blurred first sub-frames corresponding to the different relative object distances, wherein the different relative object distances include at least a first relative object distance and a second relative object distance, and the first color is green light or blue light; convolve a second color sub-frame of the image frame with a plurality of second blur kernels measured in advance at the different relative object distances to generate a plurality of blurred second sub-frames corresponding to the different relative object distances, wherein the second color is infrared light; subtract the blurred first sub-frame and the blurred second sub-frame corresponding to the first relative object distance to obtain a first differential frame; subtract the blurred first sub-frame and the blurred second sub-frame corresponding to the second relative object distance to obtain a second differential frame; and select the group of blurred first and second sub-frames corresponding to whichever of the first and second differential frames has the smallest sum of pixel values as a group of best matching sub-frames.
The invention also provides an operation method of a ranging system. The ranging system includes a first aperture stop, a second aperture stop, an image sensor and a system light source. The first aperture stop is made of an infrared light cut filter material, the second aperture stop is made of an opaque material, and the aperture opening of the second aperture stop is larger than that of the first aperture stop. The operation method includes: receiving light through the first aperture stop and the second aperture stop with the image sensor while the system light source is on to generate a first image frame; receiving light through the first aperture stop and the second aperture stop with the image sensor while the system light source is off to generate a second image frame; calculating a differential image frame of the first image frame and the second image frame; calculating convolutions of a first color sub-frame of the differential image frame with a plurality of first blur kernels to generate a plurality of blurred first sub-frames, the first color being green light or blue light; calculating convolutions of a second color sub-frame of the differential image frame with a plurality of second blur kernels to generate a plurality of blurred second sub-frames, the second color being infrared light; performing optimal matching on the plurality of blurred first sub-frames and the plurality of blurred second sub-frames to obtain a group of best matching sub-frames; and obtaining a depth according to the group of best matching sub-frames.
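For illustration only (not part of the claimed subject matter), the differential image frame step above can be sketched as follows; the function and array names are our own, and 16-bit grayscale frames are assumed:

```python
import numpy as np

def differential_image_frame(frame_on: np.ndarray, frame_off: np.ndarray) -> np.ndarray:
    """Subtract the frame captured with the system light source off from the
    frame captured with it on, so that mostly the light contributed by the
    system light source remains; negative noise is clamped to zero."""
    diff = frame_on.astype(np.int32) - frame_off.astype(np.int32)
    return np.clip(diff, 0, None).astype(np.uint16)
```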
In order that the above and other objects, features and advantages of the present invention become more apparent, a detailed description is given below with reference to the accompanying drawings. Throughout the description, the same components are denoted by the same reference numerals.
Drawings
FIG. 1 is a cross-sectional view of a dual-aperture ranging system in accordance with an illustrative embodiment of the invention.
Fig. 2 is a graph of the relative spectral response of different colored lights.
Fig. 3A and 3B are schematic diagrams illustrating a pixel distribution of a pixel array according to an embodiment of the invention.
Fig. 4A and 4B are further schematic diagrams illustrating a pixel distribution of a pixel array according to an embodiment of the invention.
FIG. 5 is a block diagram of a ranging system according to an embodiment of the present invention.
FIG. 6 is a schematic diagram of blur kernels used in a ranging system according to an embodiment of the invention.
Fig. 7 is a flowchart illustrating an operation method of the ranging system according to an embodiment of the present invention.
Description of the reference numerals
1 ranging system
10 base plate
11 first aperture diaphragm
12 second aperture diaphragm
13 lens holder
141-143 lenses
15 image sensor
150 pixel array
16 infrared light cut-off filter
17 lens barrier
L1, L2 light beams
Detailed Description
Fig. 1 is a cross-sectional view of a dual-aperture ranging system according to an embodiment of the disclosure. The distance measuring system 1 of the present embodiment includes a substrate 10, a first aperture stop 11, a second aperture stop 12, a lens holder 13, a lens set (for example, three lenses 141-143 are shown, but not limited thereto), an image sensor 15, an infrared cut filter 16, and a lens barrier 17.
The lens holder 13 has, for example, a substantially cylindrical structure, and the plurality of lenses 141 to 143, the first aperture stop 11, and the second aperture stop 12 are disposed in the lens holder 13; the number and types of lenses (such as convex lenses or concave lenses) included in the lens assembly shown in fig. 1 are only examples, and the number and types are determined according to different applications and are not particularly limited. The lens barrier 17 is used to protect the lens group disposed in the lens holder 13, and has an opening for external light to enter the lens holder 13 and to propagate to the image sensor 15 located in the lens holder 13.
The first aperture stop 11 comprises a sheet of infrared light cut filter material and has a first aperture opening; the sheet is, for example, ring-shaped and serves to block infrared light. More specifically, the first aperture opening is an opening in the sheet located substantially at its center, but is not limited thereto. The first aperture opening is, for example, substantially circular, but is not limited thereto. In one embodiment, the cut-off wavelength of the infrared light cut filter material is 550 to 650 nm, so that the spectral components above the cut-off wavelength in light beams passing through the sheet (such as L2) are blocked; the cut-off wavelength is determined according to the visible light spectrum to be received by the image sensor 15. Fig. 2 shows the relative spectral responses of different colors of light. For example, when the image sensor 15 is mainly used to sense blue light, a shorter cut-off wavelength (e.g., about 550 nm) may be selected, and when the image sensor 15 is mainly used to sense green light, a longer cut-off wavelength (e.g., about 650 nm) may be selected, so as to eliminate interference with infrared light.
The second aperture stop 12 comprises a thin sheet of opaque material and has a second aperture opening; wherein the thin sheet is, for example, ring-shaped to block all color lights (e.g., block all the spectrum shown in fig. 2). More specifically, the second aperture opening is an opening of the sheet and is located substantially at the center of the sheet, but not limited thereto. The second aperture opening is, for example, a substantially circular opening, but not limited thereto.
In the present embodiment, the second aperture opening is larger than the first aperture opening so as to form a dual aperture. Thus, the aperture through which infrared light passes is different from the aperture through which green/blue light passes. For example, in fig. 1, the light beam L1 contains red, green, blue and infrared components, whereas the light beam L2 mainly contains green and blue components and contains no infrared component (blocked by the sheet of the first aperture stop 11), or contains only a very low ratio of infrared to green/blue components. Thus, the effective aperture for infrared light is the first aperture opening, and the effective aperture for green and blue light is the second aperture opening. As can be seen from fig. 2, the spectra of red light (Red) and infrared light (IR) overlap considerably, so in the present invention the image sensor 15 is configured not to sense energy in the red spectrum, in order to reduce interference with infrared light.
It should be understood that although the second aperture stop 12 is shown closer to the image sensor 15 in FIG. 1, it is only used for illustration and not for limiting the invention. In one embodiment, the positions of the first aperture stop 11 and the second aperture stop 12 can be interchanged. In another embodiment, the first aperture stop 11 and the second aperture stop 12 may be formed on the same transparent plate (plate), and different light shielding layers (e.g. as the second aperture stop 12) and light filtering layers (e.g. as the first aperture stop 11) are formed (e.g. coated) on the transparent plate to form the first aperture opening and the second aperture opening.
The image sensor 15 and the lens holder 13 are provided on the substrate 10. The lens holder 13 may be bonded to the substrate 10 in a suitable manner, such as a clamping manner, an adhesion manner, etc., without any particular limitation. The image sensor 15 may be disposed on the substrate 10 in a suitable manner, as long as the image sensor 15 is electrically connected to the substrate 10 to transmit an electrical signal, and is not particularly limited. For example, fig. 1 shows that the image sensor 15 is electrically coupled to the substrate 10 by wire bonding, but not limited thereto. One end of the lens holder 13 (e.g., the end near the substrate 10 in the figure) has a receiving space, and the image sensor 15 and the pixel array 150 thereof are received in the receiving space.
The image sensor 15 is, for example, an image sensing chip, which preferably includes active image sensing elements, such as CMOS image sensing elements, but is not limited thereto. The image sensor 15 receives light passing through the first and second aperture stops 11 and 12 at a frame rate and outputs image frames.
Please refer to fig. 3A and 3B, which are schematic diagrams illustrating a pixel distribution of the pixel array 150 according to an embodiment of the invention. The image sensor 15 includes a pixel array 150 including a plurality of pixels arranged in an array to receive light passing through the lens group, the first aperture stop 11, and the second aperture stop 12, and output an image frame; the size of the pixel array 150 is not particularly limited.
In this embodiment, the pixel array 150 is composed of a plurality of first color pixels 151 and a plurality of second color pixels 153. For example, the first color is green light or blue light, and the second color is infrared light. In the present description, the first color pixels 151 do not include red pixels, in order to reduce interference with infrared light; the first color pixels 151 are therefore only one of green pixels and blue pixels. It can be understood that pixels of different colors have the same circuit structure and differ only in the color filter layer formed on them, so that the light sensing element (e.g., a photodiode) in each pixel circuit senses the light energy of a different color.
In the pixel array 150, the plurality of first color pixels 151 and the plurality of second color pixels 153 are preferably arranged in a checkerboard distribution for subsequent interpolation. Referring to fig. 4A and 4B, the first color pixels 151 are arranged along a diagonal direction and are respectively adjacent to the second color pixels 153.
It should be noted that the plurality of first color pixels 151 and the plurality of second color pixels 153 do not include dummy pixels (dummy pixels) of the pixel array 150 in the description of the present invention. In some embodiments, the edge columns of the pixel array 150 may be configured with dummy pixels as one of the ways to eliminate noise.
The infrared light cut-off filter 16 is disposed above the pixel array 150 of the image sensor 15 and has a cut-off wavelength of about 810 nm. It can be appreciated that the cutoff wavelength depends on the target spectrum sensed by the plurality of second color pixels 153 and is not limited to 810 nm. In addition, although fig. 1 shows that the infrared light cut filter 16 is disposed on the lens holder 13, the present invention is not limited thereto. In other embodiments, the infrared light cut filter 16 may be an infrared light cut filter layer directly coated on the pixel array 150 of the image sensor 15, as long as the infrared light cut filter layer can block long wavelength infrared light from irradiating the pixel array 150 of the image sensor 15, and the arrangement manner thereof is not particularly limited.
As mentioned above, the image sensor 15 may be an image sensing chip and includes the pixel array 150 and a processing unit; wherein the processing unit directly post-processes the pixel data (e.g., digital gray-scale value data) collected by the pixel array 150.
Fig. 5 is a block diagram of a ranging system according to an embodiment of the present invention. The ranging system 6 of the present embodiment includes a pixel array 61, a reading circuit 63 and a processing unit 65. The pixel array 61 can be the pixel array 150 (figs. 3A and 3B) and includes a plurality of first color pixels 651 and a plurality of second color pixels 653. The reading circuit 63 sequentially reads the gray-scale data of each pixel of the pixel array 61 according to a clock signal (e.g., generated by a timing controller), using, for example but not limited to, correlated double sampling. The processing unit 65 is, for example, a digital signal processor (DSP), and is configured to calculate and output depth data (e.g., a depth map) to an external electronic device 9 for corresponding control according to the read pixel data; the electronic device 9 is, for example, a portable electronic device or a wearable electronic device. The processing unit 65 comprises, for example, an analog-to-digital conversion unit for digital conversion.
The processing unit 65 receives the pixel data of the image frame F output by the reading circuit 63 and respectively processes the first color pixel data and the second color pixel data. For example, the processing unit 65 forms a plurality of first color pixel data into a first color sub-frame (subframe) and a plurality of second color pixel data into a second color sub-frame; wherein the pixel data is converted digital data. As previously mentioned, the first color is green or blue light and the second color is infrared light. For example, fig. 5 shows the first color sub-frame being a green/blue sub-frame (G/B subframe) and the second color sub-frame being an infrared light sub-frame (IR subframe).
The processing unit 65 includes a first interpolation unit 6511, a second interpolation unit 6521, a first denoising unit 6513, a second denoising unit 6523, a first compensation unit 6515, a second compensation unit 6525, a first convolution calculator 6517, a second convolution calculator 6527, a matching calculator 653, and a storage unit 655. It should be noted that although fig. 5 shows the first interpolation unit 6511, the second interpolation unit 6521, the first denoising unit 6513, the second denoising unit 6523, the first compensation unit 6515, the second compensation unit 6525, the first convolution calculator 6517, the second convolution calculator 6527 and the matching calculator 653 as different functional blocks, all of the functional blocks can be considered to be executed by the processing unit 65, and the functional blocks can be implemented by software and/or hardware without specific limitation.
The storage unit 655 is, for example, a memory or a buffer, and is configured to pre-store a plurality of first blur kernels, a plurality of second blur kernels, and depth information related to the plurality of first blur kernels and the plurality of second blur kernels. For example, FIG. 6 is a schematic diagram of blur kernels applied to a ranging system of an embodiment of the invention. The storage unit 655 pre-stores blue light blur kernels K1 for different distances, e.g., N = 1 to 4, and pre-stores infrared light blur kernels K2 for the same distances, e.g., N = 1 to 4. Each of the blur kernels N = 1 to 4 is, for example, measured and stored in advance using monochromatic light (e.g., blue light or infrared light) at a different distance from the ranging system 1; for example, N = 1 corresponds to a relative object distance of 5 cm, N = 2 to 10 cm, N = 3 to 15 cm, and N = 4 to 20 cm. It should be understood that the number of blur kernels K1 and K2 and the corresponding object distances in fig. 6 are only for illustration and are not intended to limit the present invention. Green light blur kernels can be pre-stored in the same way.
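For illustration only, the pre-stored calibration data described above can be organized as sketched below; the uniform box kernels are placeholders standing in for the blur kernels that would actually be measured at each relative object distance, and the 5/10/15/20 cm keys mirror the N = 1 to 4 example:

```python
import numpy as np

def box_kernel(size: int) -> np.ndarray:
    """Normalized uniform kernel, used here only as a stand-in for a
    measured blur kernel (point spread function)."""
    return np.full((size, size), 1.0 / (size * size))

# One kernel per channel per relative object distance (cm). Because the
# green/blue channel sees the larger second aperture opening, its defocus
# blur is assumed here to vary more with distance than that of the infrared
# channel, which sees the smaller first aperture opening.
BLUR_KERNELS = {
    5:  {"gb": box_kernel(3), "ir": box_kernel(3)},   # N = 1
    10: {"gb": box_kernel(5), "ir": box_kernel(3)},   # N = 2
    15: {"gb": box_kernel(7), "ir": box_kernel(5)},   # N = 3
    20: {"gb": box_kernel(9), "ir": box_kernel(5)},   # N = 4
}
```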
The reading circuit 63 reads the pixel data of the pixel array 61 sequentially, for example from the first pixel of the first row to the last pixel of the last row, and transmits the pixel data to the processing unit 65. The processing unit 65 divides an image frame F output by the pixel array 61 (which includes first color pixel data and second color pixel data) into a first color sub-frame and a second color sub-frame; for example, the pixel data of the plurality of first color pixels 651 form the first color sub-frame and the pixel data of the plurality of second color pixels 653 form the second color sub-frame.
Since the plurality of first color pixels 651 are only a portion of the pixels of the pixel array 61, the first interpolation unit 6511 of the processing unit 65 interpolates the image frame F (i.e., the pixel data of the plurality of first color pixels 651) to generate the first color sub-frame, such that the first color sub-frame has the same size as the effective pixel area of the pixel array 61. For example, the pixel data at position (1,1) is interpolated using the pixel data at positions (1,2) and (2,1); the pixel data at position (1,3) is interpolated using the pixel data at positions (1,2), (2,3) and (1,4); and so on. However, the manner of interpolation is not limited thereto.
Since the second color pixels 653 are likewise only a portion of the pixels of the pixel array 61, the second interpolation unit 6521 of the processing unit 65 interpolates the image frame F (i.e., the pixel data of the plurality of second color pixels 653) to generate the second color sub-frame, such that the second color sub-frame has the same size as the effective pixel area of the pixel array 61. For example, the pixel data at position (1,2) is interpolated using the pixel data at positions (1,1), (2,2) and (1,3); the pixel data at position (1,4) is interpolated using the pixel data at positions (1,3), (2,4) and (1,5); and so on. However, the manner of interpolation is not limited thereto.
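As a minimal sketch of the interpolation described in the two preceding paragraphs (not the patented implementation), a checkerboard sub-frame can be filled to full resolution by averaging the available horizontal and vertical neighbors, matching the example above where position (1,1) is filled from (1,2) and (2,1); the parity convention is an assumption:

```python
import numpy as np

def interpolate_checkerboard(frame: np.ndarray, parity: int) -> np.ndarray:
    """Return a full-resolution sub-frame for the color sampled at positions
    (r, c) with (r + c) % 2 == parity; the remaining positions are filled
    with the mean of their valid 4-neighbors (fewer at borders and corners)."""
    h, w = frame.shape
    rows, cols = np.indices((h, w))
    own = (rows + cols) % 2 == parity            # where this color was sampled
    out = frame.astype(np.float32)
    vals = np.pad(out * own, 1)                  # zero-pad so borders use fewer neighbors
    mask = np.pad(own.astype(np.float32), 1)
    acc = vals[:-2, 1:-1] + vals[2:, 1:-1] + vals[1:-1, :-2] + vals[1:-1, 2:]
    cnt = mask[:-2, 1:-1] + mask[2:, 1:-1] + mask[1:-1, :-2] + mask[1:-1, 2:]
    fill = np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
    out[~own] = fill[~own]
    return out
```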
The first denoising unit 6513 of the processing unit 65 performs a filtering (denoising) process on the first color sub-frame, for example, with a digital filter (digital filter), to improve image quality. Similarly, the second denoising unit 6523 of the processing unit 65 performs filtering (denoising) processing on the second color sub-frame, for example, with a digital filter. The digital filter may be a filter well known for processing images, and is not particularly limited.
The first compensation unit 6515 of the processing unit 65 is configured to perform lens shading compensation on the first color sub-frame, and the second compensation unit 6525 of the processing unit 65 is configured to perform lens shading compensation on the second color sub-frame. Information of lens shading compensation (lens shading compensation) is stored in the processing unit 65, for example, in the storage unit 655, in advance based on the plurality of lenses 141 to 143 (see fig. 1).
In this embodiment, the processing unit 65 can optionally perform at least one of a filtering (denoising) process and a lens shading compensation process to increase the accuracy of the distance calculation.
The first convolution calculator 6517 of the processing unit 65 convolves the first color sub-frame of the image frame F (e.g., the denoised and compensated first color sub-frame) with each of the plurality of first blur kernels to generate a plurality of blurred first sub-frames. For example, referring to fig. 6, the first convolution calculator 6517 sequentially reads the blue light blur kernels K1 (e.g., N = 1 to 4) from the storage unit 655 and calculates the convolution of the first color sub-frame with each of the blur kernels N = 1 to 4 to generate a plurality of (here, 4) blurred first sub-frames. It can be appreciated that the number of blurred first sub-frames generated equals the number of pre-stored blur kernels K1.
The second convolution calculator 6527 of the processing unit 65 convolves the second color sub-frame of the image frame F (e.g., the denoised and compensated second color sub-frame) with each of the plurality of second blur kernels to generate a plurality of blurred second sub-frames. For example, referring to fig. 6, the second convolution calculator 6527 sequentially reads the infrared light blur kernels K2 (e.g., N = 1 to 4) from the storage unit 655 and calculates the convolution of the second color sub-frame with each of the blur kernels N = 1 to 4 to generate a plurality of blurred second sub-frames. It can be appreciated that the number of blurred second sub-frames generated equals the number of pre-stored blur kernels K2, here for example 4.
The matching calculator 653 of the processing unit 65 then compares the plurality of blurred first sub-frames with the corresponding plurality of blurred second sub-frames to find a group of best matching sub-frames. In an embodiment, comparing the plurality of blurred first sub-frames and the plurality of blurred second sub-frames refers to performing optimal matching on them. For example, the matching calculator 653 subtracts, pixel by pixel, the blurred first sub-frame and the blurred second sub-frame corresponding to the blur kernels N = 1 to obtain a first differential frame, which is stored in the storage unit 655; subtracts the blurred first and second sub-frames of the blur kernels N = 2 to obtain a second differential frame, which is stored in the storage unit 655; subtracts the blurred first and second sub-frames of the blur kernels N = 3 to obtain a third differential frame, which is stored in the storage unit 655; and subtracts the blurred first and second sub-frames of the blur kernels N = 4 to obtain a fourth differential frame, which is stored in the storage unit 655. The matching calculator 653 selects, as the group of best matching sub-frames, the group of blurred first and second sub-frames corresponding to whichever of the first to fourth differential frames has the smallest sum of pixel values.
In this embodiment, the first color sub-frame, the second color sub-frame, the plurality of blue light blur kernels K1, the plurality of infrared light blur kernels K2, the plurality of blurred first sub-frames, the plurality of blurred second sub-frames, and the first to fourth differential frames have the same size.
The matching calculator 653 determines the depth according to the depth information associated with the first blur kernel and/or the second blur kernel corresponding to the group of best matching sub-frames. For example, when the group of best matching sub-frames corresponds to the blur kernel N = 1, whose associated depth information is, as described above, 5 cm, the determined depth D is 5 cm. Similarly, when the group of best matching sub-frames corresponds to one of the blur kernels N = 2 to 4, the associated depth D can be determined.
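The convolve-subtract-select flow of the preceding paragraphs can be sketched as follows, assuming per-distance kernel pairs such as BLUR_KERNELS above; summing an absolute difference is our assumption, since the text specifies only a pixel-by-pixel subtraction and a minimum pixel sum. Applying the same selection block by block, rather than over the whole frame as here, would yield a depth map with a plurality of depth values:

```python
import numpy as np
from scipy.signal import convolve2d

def estimate_depth(gb_sub, ir_sub, kernels):
    """kernels maps a relative object distance (cm) to a dict holding the
    first ('gb') and second ('ir') blur kernels for that distance. Returns
    the best-match distance and the per-distance matching costs."""
    costs = {}
    for dist_cm, k in kernels.items():
        blurred_gb = convolve2d(gb_sub, k["gb"], mode="same", boundary="symm")
        blurred_ir = convolve2d(ir_sub, k["ir"], mode="same", boundary="symm")
        costs[dist_cm] = np.abs(blurred_gb - blurred_ir).sum()  # differential frame sum
    best = min(costs, key=costs.get)   # distance of the best matching sub-frames
    return best, costs
```

For example, estimate_depth(gb_sub, ir_sub, BLUR_KERNELS) would return 5 for a scene at the N = 1 calibration distance, together with the costs for all four distances.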
Furthermore, the optimal matching is not limited to computing subtractions of the plurality of blurred first sub-frames and the plurality of blurred second sub-frames. In other embodiments, the optimal matching may also be performed by calculating a correlation between each blurred first sub-frame and the corresponding blurred second sub-frame, or by other well-known methods, and is not limited to the disclosure in the present description. As mentioned above, "corresponding" herein means that the blur kernels N = 1 to 4 relate to the same relative object distance. The differential frame calculation above is only for illustration and not for limitation; a group of best matching sub-frames is the group of blurred first and second sub-frames with the highest similarity.
In some embodiments, when the similarities (e.g., the sums of pixel values of the differential frames) of two groups of blurred first and second sub-frames are close to each other, the depth may also be calculated by interpolation. For example, when the similarity lies between those of N = 1 and N = 2, the depth D may be selected as 7.5 cm.
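The refinement just described can be sketched as follows, reusing the per-distance costs returned by estimate_depth above; the 10% closeness threshold is an illustrative assumption:

```python
def refine_depth(costs: dict, tol: float = 0.10) -> float:
    """When the two smallest matching costs are nearly equal, return the
    midpoint of their calibrated distances (e.g., 7.5 cm between N = 1 at
    5 cm and N = 2 at 10 cm); otherwise return the best-match distance."""
    (d1, c1), (d2, c2) = sorted(costs.items(), key=lambda item: item[1])[:2]
    if abs(c1 - c2) <= tol * max(c1, c2):
        return (d1 + d2) / 2.0
    return float(d1)
```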
Finally, the processing unit 65 outputs the determined depth D to the external electronic device 9.
Fig. 7 is a flowchart illustrating an operation method of a ranging system according to an embodiment of the present invention, which includes the following steps: receiving light passing through the first aperture stop and the second aperture stop with an image sensor to generate an image frame (step S71); calculating convolutions of a first color sub-frame of the image frame with a plurality of first blur kernels to generate a plurality of blurred first sub-frames (step S72); calculating convolutions of a second color sub-frame of the image frame with a plurality of second blur kernels to generate a plurality of blurred second sub-frames (step S73); optimally matching the plurality of blurred first sub-frames with the plurality of blurred second sub-frames to obtain a group of best matching sub-frames (step S74); and finding a depth from the group of best matching sub-frames (step S75). Steps S72 to S75 are performed by the processing unit 65 using software and/or hardware, for example using the functional blocks of fig. 5. The detailed implementation of this embodiment has been described above and is therefore not repeated here.
As mentioned above, in order to increase the operation accuracy, the operation method of this embodiment further includes performing, with the processing unit, at least one of filtering (denoising) and lens shading compensation on the first color sub-frame and the second color sub-frame.
In addition, before processing the first color sub-frame and the second color sub-frame, the processing unit 65 interpolates the pixel data of the plurality of first color pixels 651 and the pixel data of the plurality of second color pixels 653 to generate the first color sub-frame and the second color sub-frame, respectively. As described above, the processing unit 65 digitizes the grayscale data of the image frame F before performing the interpolation.
It should be noted that the numerical values given in the description of the present invention, such as the size of the pixel array, the number of blur kernels, the number and distance of lenses, and the like, are merely illustrative and are not intended to limit the present invention. In addition, the reading circuit 63 is not limited to sequentially reading each pixel of the pixel array 61 from the first pixel to the last pixel, and may read the pixel data of the first color pixels 651 first and then the pixel data of the second color pixels 653, without any specific limitation.
In addition, in combination with turning the system light source on and off, the processing unit 65 may first calculate a differential image frame between a first image frame acquired while the system light source is on and a second image frame acquired while it is off, and then apply the operation method of the ranging system to the differential image frame. More specifically, the image frame F of the above embodiment can be replaced by the differential image frame, with all other operations unchanged, so they are not described again here.
As described above, various conventional ranging systems have different application limitations. The present invention therefore provides a dual-aperture ranging system (figs. 1 and 6) and an operation method thereof (fig. 7) that can calculate a depth map, i.e., a map comprising a plurality of depth values, from a single image. In addition, since the present invention uses only the sensing results of blue/green pixels and includes no red pixels, interference with infrared light is reduced, improving the detection accuracy.
Although the present invention has been described with reference to the above examples, it should be understood that various changes and modifications can be made by one skilled in the art without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A ranging system, comprising:
a first aperture stop including a sheet of an infrared light cut filter material and having a first aperture opening;
a second aperture stop comprising a sheet of opaque material and having a second aperture opening, wherein the second aperture opening is larger than the first aperture opening;
a pixel array, which is composed of infrared pixels and green pixels, or composed of infrared pixels and blue pixels, and is used for receiving the light passing through the first aperture diaphragm and the second aperture diaphragm; and
a processing unit for
respectively convolving a first color sub-frame of an image frame output by the pixel array with a plurality of first blur kernels measured in advance at different relative object distances to generate a plurality of blurred first sub-frames corresponding to the different relative object distances, wherein the different relative object distances include at least a first relative object distance and a second relative object distance, and the first color is green light or blue light,
performing convolution operations on a second color sub-frame of the image frame with a plurality of second blur kernels measured in advance at the different relative object distances to generate a plurality of blurred second sub-frames corresponding to the different relative object distances, wherein the second color is infrared light,
subtracting the fuzzified first sub-frame and the fuzzified second sub-frame corresponding to the first relative object distance to obtain a first differential frame,
subtracting the blurred first sub-frame and the blurred second sub-frame corresponding to the second relative object distance to obtain a second differential frame, and
selecting a group of the blurred first sub-frame and the blurred second sub-frame corresponding to the minimum sum of pixel values in the first differential frame and the second differential frame as a group of best matching sub-frames,
wherein the first blur kernel is a blue light blur kernel and the second blur kernel is an infrared light blur kernel.
2. The range finding system of claim 1, wherein the infrared light cut filter material has a cut-off wavelength of 550 to 650 nm.
3. The range finding system of claim 1, wherein the processing unit is further configured to perform at least one of filtering and lens shading compensation on the first color sub-frame and the second color sub-frame.
4. The ranging system according to claim 1, wherein the processing unit is further configured to interpolate the image frame to generate the first color sub-frame and the second color sub-frame.
5. The ranging system according to claim 1, wherein the processing unit further comprises a storage unit configured to pre-store the plurality of first blur kernels, the plurality of second blur kernels, and depth information associated with the plurality of first blur kernels and the plurality of second blur kernels.
6. An operation method of a distance measurement system, the distance measurement system comprising a first aperture stop, a second aperture stop, an image sensor and a system light source, the first aperture stop being made of an infrared light cut-off filtering material, the second aperture stop being made of an opaque material, an aperture opening of the second aperture stop being larger than an aperture opening of the first aperture stop, the operation method comprising:
generating a first image frame with the image sensor receiving light through the first aperture stop and the second aperture stop when the system light source is illuminated;
receiving light through the first aperture stop and the second aperture stop with the image sensor when the system light source is off to generate a second image frame;
calculating a differential image frame of the first image frame and the second image frame;
respectively calculating convolutions of a first color sub-frame of the differential image frame with a plurality of first blur kernels corresponding to different relative object distances to generate a plurality of blurred first sub-frames, wherein the first color is green light or blue light;
respectively calculating convolutions of a second color sub-frame of the differential image frame with a plurality of second blur kernels corresponding to the different relative object distances to generate a plurality of blurred second sub-frames, wherein the second color is infrared light;
performing optimal matching on the plurality of blurred first subframes and the plurality of blurred second subframes to obtain a group of optimal matching subframes; and
finding a depth from the set of best matching subframes,
wherein the first blur kernel is a blue light blur kernel and the second blur kernel is an infrared light blur kernel.
7. The method of claim 6, further comprising:
and performing at least one of filtering processing and lens shading compensation processing on the first color sub-frame and the second color sub-frame.
8. The method of claim 6, further comprising:
interpolating the image frame to generate the first color sub-frame and the second color sub-frame.
9. The method of claim 6, wherein the optimal matching is a subtraction or correlation of each of the blurred first sub-frames with a corresponding one of the blurred second sub-frames.
10. The operating method of claim 6, wherein the IR cut filter material has a cut-off wavelength of 550-650 nm.
CN201910292985.2A 2015-10-29 2015-10-29 Double-aperture ranging system and operation method thereof Active CN110220494B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910292985.2A CN110220494B (en) 2015-10-29 2015-10-29 Double-aperture ranging system and operation method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510715859.5A CN106643642B (en) 2015-10-29 2015-10-29 Based on Dual-Aperture range-measurement system
CN201910292985.2A CN110220494B (en) 2015-10-29 2015-10-29 Double-aperture ranging system and operation method thereof

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201510715859.5A Division CN106643642B (en) 2015-10-29 2015-10-29 Based on Dual-Aperture range-measurement system

Publications (2)

Publication Number Publication Date
CN110220494A CN110220494A (en) 2019-09-10
CN110220494B true CN110220494B (en) 2021-11-09

Family

ID=58829682

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201910292985.2A Active CN110220494B (en) 2015-10-29 2015-10-29 Double-aperture ranging system and operation method thereof
CN201510715859.5A Active CN106643642B (en) 2015-10-29 2015-10-29 Based on Dual-Aperture range-measurement system

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201510715859.5A Active CN106643642B (en) 2015-10-29 2015-10-29 Based on Dual-Aperture range-measurement system

Country Status (1)

Country Link
CN (2) CN110220494B (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2463480A (en) * 2008-09-12 2010-03-17 Sharp Kk Camera Having Large Depth of Field
CN102695939B (en) * 2010-11-09 2015-06-17 松下电器产业株式会社 Distance measurement device and distance measurement method
CN103053168B (en) * 2011-02-01 2015-11-25 松下电器产业株式会社 Stereo image shooting device and endoscope
EP2718666A1 (en) * 2011-06-09 2014-04-16 Eastman Kodak Company Coupled multi-wavelength confocal systems for distance measurements
CN103827920B (en) * 2011-09-28 2018-08-14 皇家飞利浦有限公司 It is determined according to the object distance of image
CN102595171B (en) * 2012-02-03 2014-05-14 浙江工商大学 Imaging method and imaging system of dynamic optical fields of multichannel space-time coding apertures
CN103808305B (en) * 2012-11-07 2017-11-07 原相科技股份有限公司 Detecting system
JP6292790B2 (en) * 2013-08-08 2018-03-14 キヤノン株式会社 Distance detection device, imaging device, and distance detection method

Also Published As

Publication number Publication date
CN110220494A (en) 2019-09-10
CN106643642B (en) 2019-05-10
CN106643642A (en) 2017-05-10

Similar Documents

Publication Publication Date Title
US10778915B2 (en) Dual-aperture ranging system
EP3354008B1 (en) Phase detection autofocus noise reduction
EP3354009B1 (en) Phase detection autofocus arithmetic
US9721344B2 (en) Multi-aperture depth map using partial blurring
CN107071234B (en) Lens shadow correction method and device
JP2015164284A (en) Solid-state image sensor, movement information acquisition apparatus and imaging apparatus
CN110275606B (en) Sensing element
US10354413B2 (en) Detection system and picture filtering method thereof
US20190139189A1 (en) Image remosaicing
CN109788216A (en) For the anti-interference method of TOF, device and TOF sensor chip
JP2016063540A (en) Apparatus and method for imaging with reflection suppression
US20200389576A1 (en) Display-based camera apparatus and methods
US12015835B2 (en) Multi-sensor imaging color correction
US10803625B2 (en) Detection system and picturing filtering method thereof
CN117355768A (en) System and method for obtaining color images using single photon avalanche diodes
CN110220494B (en) Double-aperture ranging system and operation method thereof
CN107925719B (en) Imaging device, imaging method, and non-transitory recording medium
CN118119966A (en) System and method for obtaining dark current images
JP7259660B2 (en) Image registration device, image generation system and image registration program
CN113959346A (en) Displacement detection module and mobile device
US7898591B2 (en) Method and apparatus for imaging using sensitivity coefficients
US11831859B2 (en) Passive three-dimensional image sensing based on referential image blurring with spotted reference illumination
US20240054659A1 (en) Object detection in dynamic lighting conditions
US20230032323A1 (en) Image processing method and system
CN117911525A (en) Multi-mode multi-path complementary visual data calibration method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant