CN117528063A - Optical device and method for measuring depth of field of mobile phone camera - Google Patents

Optical device and method for measuring depth of field of mobile phone camera

Info

Publication number
CN117528063A
CN117528063A (application CN202311557830.XA)
Authority
CN
China
Prior art keywords
depth
light
light source
mobile phone
optical device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311557830.XA
Other languages
Chinese (zh)
Inventor
李庆康
周小雄
宋柳良
龚燕英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Trigiants Technology Co ltd
Original Assignee
Trigiants Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Trigiants Technology Co ltd filed Critical Trigiants Technology Co ltd
Priority to CN202311557830.XA
Publication of CN117528063A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras

Abstract

The invention discloses an optical device for measuring the depth of field of a mobile phone camera, comprising an illumination module and an imaging module. The illumination module comprises a first housing; a light source is arranged inside the first housing, a driver is arranged inside the light source, and a diffuser is arranged on the outer wall of the top of the first housing. The imaging module comprises a second housing and a bottom plate; the second housing covers the outside of the bottom plate, an imaging lens is inserted into the second housing, a band-pass filter is arranged on the outer wall of the bottom of the second housing, and an imager is arranged inside the bottom plate. The benefit of the invention is that the light source is a narrow-band source whose wavelength has a low dependence on temperature, and the driver is a Vertical Cavity Surface Emitting Laser (VCSEL) or an Edge Emitting Laser (EEL); the VCSEL has low cost, a small outline and high reliability, and is easy to integrate into the ToF module of mobile phone camera imaging.

Description

Optical device and method for measuring depth of field of mobile phone camera
Technical Field
The invention relates to the technical field of mobile phone cameras, in particular to an optical device and method for measuring depth of field of a mobile phone camera.
Background
A mobile phone camera can perform video recording, photographing, character recognition, panoramic photographing and the like. A panoramic photograph, also called a panorama, generally refers to a photograph taken over the normal effective viewing angle of a person's two eyes, or beyond it, up to a complete 360-degree scene;
the depth of field refers to the range of distances between the front and rear of a subject measured to enable clear images to be obtained at the front of a camera lens or other imager. The distances between the aperture, the lens and the photographed object are important factors influencing the depth of field; after focusing, the distance range before and after the focus is called depth of field, there is a certain length of space in front of the lens (before and after the focus is adjusted), when the object is positioned in the space, its image on the film is just between the two circles of diffusion, the length of the space in which the object is positioned is called depth of field, in other words, the object in the space, its image blur degree on the surface of the film is in the limit of the allowable circle of diffusion, the length of the space is depth of field
Optics play a key role in time-of-flight (ToF) depth-measurement cameras: the optical design determines the complexity, feasibility and performance of the final system. 3D ToF cameras have some unique characteristics and therefore place special requirements on the optics. The depth-measurement optical system architecture presented herein consists of an imaging optics subassembly and a ToF sensor on the receiver, and an illumination module on the transmitter; how to optimize each of these subassemblies to improve sensor and system performance is discussed.
The depth-of-field measurement method of application number CN10248660A comprises the following steps: capturing images at a plurality of focusing scales, the images each containing image areas corresponding to the same image position; selecting from the plurality of image areas the optimal-depth image area; and looking up, in a lookup table, the depth-of-field value corresponding to the focusing scale of the optimal-depth image area.
In the prior art above, the light source typically uses LED lighting tubes, which are too slow, costly, bulky and of low reliability. Therefore, there is a need to design an optical device and method for depth-of-field measurement of a mobile phone camera that solve the above problems.
Disclosure of Invention
The invention aims to provide an optical device and a method for measuring the depth of field of a mobile phone camera so as to solve the defects in the prior art.
In order to achieve the above object, the present invention provides the following technical solutions:
the utility model provides a mobile phone camera depth of field measuring optical device, includes lighting module and imaging module, lighting module includes first casing, and the inside of first casing is provided with the light source, the inside of light source is provided with the driver, and is provided with the diffuser on the outer wall at first casing top, imaging module includes second casing and bottom plate, and the second casing cover is in the outside of bottom plate, imaging lens has been inserted to the inside of second casing, be provided with band-pass filter on the outer wall of second casing bottom, the inside of bottom plate is provided with the imager, and the inside of imager is provided with the microlens array.
Further, the light source is a narrow-band light source whose wavelength has a low dependence on temperature, and the driver is a Vertical Cavity Surface Emitting Laser (VCSEL) or an Edge Emitting Laser (EEL).
Further, because the wavelength of the light affects ToF performance, the 850nm or 940nm wavelength is selected according to the following considerations: sensor quantum efficiency and responsivity (QE measures the ability of a photodetector to convert photons to electrons; R measures the ability of a photodetector to convert optical power to current), human perception, and sunlight.
Further, on sensor quantum efficiency and responsivity: Quantum Efficiency (QE) and responsivity (R) are interrelated. QE measures the ability of a photodetector to convert photons to electrons, and R measures the ability of a photodetector to convert optical power to current: QE = (number of electrons collected)/(number of photons impinging on the photodetector) (%); R = (current from the photodetector)/(optical power on the photodetector) (A/W); R = QE × qλ/(hc), where q is the electron charge, h is the Planck constant, c is the speed of light, and λ is the wavelength;
the QE of a silicon-based sensor is more than 2 times better at 850nm than at 940nm; for example, the ADI CW ToF sensor has a QE of 44% at 850nm and only 27% at 940nm.
Further, on human perception: the human eye can perceive 850nm light, but cannot see 940nm light. On sunlight: when used outdoors, the solar radiation intensity in the range of 920nm to 960nm is reduced by atmospheric absorption and is less than half of that around 850nm; that is, in outdoor applications, operating the ToF system at 940nm provides better resistance to ambient-light interference and better depth-sensing performance.
Further, the radiant intensity of the light source is one of the factors influencing the signal-to-noise ratio of the ToF system. The beam profile of the light source should satisfy the following characteristics: the illumination profile shape within the FOI, the profile width, the optical efficiency (i.e., the enclosed energy within a particular FOV), and the optical power drop outside the FOI. The radiant intensity satisfies I = dΦ/dΩ, where dΦ is the power incident into the solid angle dΩ.
Further, on the illumination profile shape within the FOI: the most common radiant intensity distribution in ToF flood illumination is batwing shaped, with a profile that varies with cos⁻ⁿ(θ) to compensate for the attenuation of the imaging lens and the cos³(θ) reduction in irradiance (E) [W/m²] between the target center and the target edge. The defining formula is: E = dΦ/dA = I(θ)cos(θ)/R(θ)² = I(θ)cos³(θ)/R₀², where E is the irradiance, dA is the surface area illuminated by the optical power dΦ, R(θ) is the distance between the light source and dA, and dΩ = dA·cos(θ)/R(θ)².
Further, on the profile width: the width of the profile is the convolution of the intensity profile of the light source with the collimated-beam response of the diffuser; the wider the input divergence angle into the diffuser, the wider the width and the slower the transition slope. The following two criteria can be used to judge whether such losses are acceptable. Optical efficiency — the enclosed energy within the FOV of the imaging lens, which specifies how much energy the imaging module will receive — is defined as: optical efficiency = (2D integrated optical power within the lens FOV)/(2D integrated optical power of the entire illumination profile) × diffuser transmission efficiency;
the optical power drop outside the FOI is defined as: optical power drop outside FOI = (total integral of the illumination profile − integral of the illumination profile inside the FOI)/(total integral of the illumination profile).
Further, the efficiency and uniformity of light collection onto the pixel array through the microlens array greatly affect overall performance. The light-collection efficiency is proportional to 1/(f/#)², where f/# = (focal length)/(aperture size). The efficiency and uniformity of light collection can be degraded by stray light, and the factors that influence stray light are: halo, the antireflection coating, the number of lenses, the band-pass filter, and the microlens array.
The method for measuring the depth of field of the mobile phone camera uses the above optical device and comprises the following steps: light from the light source inside the illumination module is backscattered by an object in the camera's field of view; the imager and microlens array of the imaging module receive the reflected light, and the phase shift between the transmitted waveform and the received, reflected waveform is measured. The phase shift is obtained by photon-mixing demodulation within the pixel, measuring the correlation between the transmitted and received waveforms at different relative delays; by measuring the phase shift at several modulation frequencies, the depth value of each pixel can be calculated.
In the above technical solution, the optical device and method for measuring the depth of field of a mobile phone camera provided by the invention have the following benefit: (1) the light source is a narrow-band source whose wavelength has a low dependence on temperature, and the driver is a Vertical Cavity Surface Emitting Laser (VCSEL) or an Edge Emitting Laser (EEL); the VCSEL has low cost, a small outline and high reliability, and is easy to integrate into the ToF module of mobile phone camera imaging. Compared with an EEL (which emits light from the side) and an LED (which emits light from the side and the top), the light emitted from a VCSEL is perpendicular to its surface, so the production yield is higher and the manufacturing cost lower.
Drawings
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and a person of ordinary skill in the art may obtain other drawings from them.
Fig. 1 is a schematic structural diagram of an illumination module and an imaging module provided by an embodiment of an optical device and a method for measuring depth of field of a mobile phone camera.
Fig. 2 is a schematic perspective view of an imaging module provided by an embodiment of an optical device and a method for measuring depth of field of a camera of a mobile phone according to the present invention.
Fig. 3 is a schematic diagram of a solar spectrum irradiance structure provided by an embodiment of an optical device and a method for measuring depth of field of a mobile phone camera.
Fig. 4 is a schematic diagram of a relationship structure between irradiance distribution and intensity provided by an embodiment of an optical device and a method for measuring depth of field of a mobile phone camera according to the present invention.
Fig. 5 is a schematic view of an exemplary structure of an illumination section provided by an embodiment of an optical device and a method for measuring depth of field of a camera of a mobile phone according to the present invention.
Fig. 6 is a schematic diagram of an example structure of relative illuminance provided by an embodiment of an optical device and method for measuring depth of field of a mobile phone camera according to the present invention.
Reference numerals illustrate:
1. a lighting module; 2. an imaging module; 3. a light source; 4. an imaging lens; 5. a band-pass filter; 6. an imager; 7. a microlens array; 8. a second housing; 9. a bottom plate; 10. a first housing; 11. a diffuser; 12. a driver.
Detailed Description
In order to make the technical scheme of the present invention better understood by those skilled in the art, the present invention will be further described in detail with reference to the accompanying drawings.
As shown in fig. 1-6, an optical device for measuring depth of field of a mobile phone camera according to an embodiment of the present invention includes an illumination module 1 and an imaging module 2, where the illumination module 1 includes a first housing 10, a light source 3 is disposed inside the first housing 10, a driver 12 is disposed inside the light source 3, a diffuser 11 is disposed on an outer wall of a top of the first housing 10, the imaging module 2 includes a second housing 8 and a bottom plate 9, the second housing 8 covers an outer portion of the bottom plate 9, an imaging lens 4 is inserted inside the second housing 8, a band-pass filter 5 is disposed on an outer wall of a bottom portion of the second housing 8, an imager 6 is disposed inside the bottom plate 9, and a microlens array 7 is disposed inside the imager 6.
The invention provides an optical device for measuring the depth of field of a mobile phone camera, comprising an illumination module 1 and an imaging module 2. The illumination module 1 comprises a first housing 10; a light source 3 is arranged inside the first housing 10, a driver 12 is arranged inside the light source 3, and a diffuser 11 is arranged on the outer wall of the top of the first housing 10. The imaging module 2 comprises a second housing 8 and a bottom plate 9; the second housing 8 covers the outside of the bottom plate 9, an imaging lens 4 is inserted into the second housing 8, a band-pass filter 5 is arranged on the outer wall of the bottom of the second housing 8, an imager 6 is arranged inside the bottom plate 9, and a microlens array 7 is arranged inside the imager 6. The light source 3 is a narrow-band light source whose wavelength has a low dependence on temperature, and the driver 12 is a Vertical Cavity Surface Emitting Laser (VCSEL) or an Edge Emitting Laser (EEL). The VCSEL is low in cost, small in outline and high in reliability, is easy to integrate into the ToF module of mobile phone camera imaging, and is increasingly popular; compared with an EEL (which emits light from the side) and an LED (which emits light from the side and the top), the beam emitted from a VCSEL is perpendicular to its surface, so the production yield is higher and the manufacturing cost lower.
In one embodiment of the present invention, the light source 3 is a narrow-band light source with a low wavelength dependence on temperature, and the driver 12 is a Vertical Cavity Surface Emitting Laser (VCSEL) or an Edge Emitting Laser (EEL).
In another embodiment provided by the present invention, because the wavelength of the light affects ToF performance, the light source 3 uses a wavelength of 850nm or 940nm, selected according to the following considerations: sensor quantum efficiency and responsivity (QE measures the ability of a photodetector to convert photons to electrons; R measures the ability of a photodetector to convert optical power to current), human perception, and sunlight.
In yet another embodiment provided by the present invention, on sensor quantum efficiency and responsivity: Quantum Efficiency (QE) and responsivity (R) are interrelated. QE measures the ability of a photodetector to convert photons to electrons, and R measures the ability of a photodetector to convert optical power to current: QE = (number of electrons collected)/(number of photons impinging on the photodetector) (%); R = (current from the photodetector)/(optical power on the photodetector) (A/W); R = QE × qλ/(hc), where q is the electron charge, h is the Planck constant, c is the speed of light, and λ is the wavelength;
the QE of a silicon-based sensor is more than 2 times better at 850nm than at 940nm; for example, the ADI CW ToF sensor has a QE of 44% at 850nm and only 27% at 940nm. Higher QE and R result in a better signal-to-noise ratio (SNR) for the same illumination optical power, especially when less light returns to the sensor (as is the case for long distances or low-reflectivity objects).
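The relation R = QE × qλ/(hc) can be checked numerically. The following Python sketch is illustrative only (not part of the patent); the QE values are the ones quoted above for 850nm and 940nm:

```python
# Responsivity R [A/W] from quantum efficiency QE, R = QE * q * lambda / (h * c).
Q = 1.602176634e-19   # electron charge q [C]
H = 6.62607015e-34    # Planck constant h [J*s]
C = 2.99792458e8      # speed of light c [m/s]

def responsivity(qe: float, wavelength_m: float) -> float:
    """Responsivity of a photodetector with quantum efficiency `qe` at the given wavelength."""
    return qe * Q * wavelength_m / (H * C)

r_850 = responsivity(0.44, 850e-9)  # QE = 44% at 850nm -> about 0.30 A/W
r_940 = responsivity(0.27, 940e-9)  # QE = 27% at 940nm -> about 0.20 A/W
```

Despite the longer wavelength raising λ/(hc), the much lower QE at 940nm leaves the 850nm responsivity clearly higher, consistent with the 2x QE advantage claimed above.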
In one embodiment provided by the present invention, as shown in fig. 3, human perception: the human eye can perceive 850nm light but cannot see 940nm light. Sunlight: although sunlight is strongest in the visible spectrum, its energy in the NIR region is still large, and sunlight (more generally, ambient light) increases depth noise and shortens the effective range of the ToF camera. In outdoor applications, the solar radiation intensity in the range of 920nm to 960nm is reduced by atmospheric absorption and is less than half of that around 850nm; that is, running the ToF system at 940nm outdoors provides better resistance to ambient-light interference and better depth-sensing performance.
In another embodiment provided by the present invention, the radiant intensity of the light source 3 is one of the factors influencing the signal-to-noise ratio of the ToF system. The light source generates a constant optical power, which is distributed into the three-dimensional space within the FOI produced by the diffusing optical element; as the FOI increases, the energy per unit solid angle (sr), i.e. the radiant intensity [W/sr], decreases (as shown in table 1, which lists several examples of FOIs and their corresponding radiant intensities, normalized to that of a 60° × 45° FOI and calculated as the optical power per rectangular solid angle). The beam profile of the light source 3 should satisfy the following characteristics: the illumination profile shape within the FOI, the profile width, the optical efficiency (i.e., the enclosed energy within a particular FOV), and the optical power drop outside the FOI. The radiant intensity satisfies I = dΦ/dΩ, where dΦ is the power incident into the solid angle dΩ.
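The trade-off between FOI and radiant intensity can be sketched in Python. The solid-angle formula for a rectangular a × b field, Ω = 4·arcsin(sin(a/2)·sin(b/2)), and the wider 90° × 70° FOI are assumptions for illustration, not values from the patent:

```python
import math

# Radiant intensity I = P / Omega for a rectangular field of illumination (FOI),
# using the rectangular-pyramid solid angle Omega = 4 * asin(sin(a/2) * sin(b/2)).
def rect_solid_angle(a_deg: float, b_deg: float) -> float:
    """Solid angle [sr] subtended by a rectangular a x b field (full angles in degrees)."""
    a, b = math.radians(a_deg), math.radians(b_deg)
    return 4 * math.asin(math.sin(a / 2) * math.sin(b / 2))

def radiant_intensity(power_w: float, a_deg: float, b_deg: float) -> float:
    """Radiant intensity [W/sr] when power_w is spread uniformly over the FOI."""
    return power_w / rect_solid_angle(a_deg, b_deg)

base = radiant_intensity(1.0, 60, 45)  # the 60 x 45 deg normalization FOI
wide = radiant_intensity(1.0, 90, 70)  # a hypothetical wider FOI
```

Because the same optical power is spread over a larger solid angle, the wider FOI necessarily has the lower radiant intensity, which is the effect table 1 tabulates.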
In yet another embodiment provided by the present invention, as shown in fig. 4, on the illumination profile shape within the FOI: the most common radiant intensity distribution in ToF flood illumination is batwing shaped, with a profile that varies with cos⁻ⁿ(θ) to compensate for the attenuation of the imaging lens and the cos³(θ) reduction in irradiance (E) [W/m²] between the target center and the target edge. The defining formula is: E = dΦ/dA = I(θ)cos(θ)/R(θ)² = I(θ)cos³(θ)/R₀², where E is the irradiance, dA is the surface area illuminated by the optical power dΦ, R(θ) is the distance between the light source 3 and dA, and dΩ = dA·cos(θ)/R(θ)².
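A short Python sketch (illustrative only) shows why the batwing profile is used: with E = I(θ)·cos³(θ)/R₀², an intensity profile I(θ) = I₀/cosⁿ(θ) with n = 3 makes the irradiance on a flat target independent of the field angle:

```python
import math

# Irradiance E(theta) = I(theta) * cos(theta)**3 / R0**2 on a flat target,
# where I(theta) = I0 / cos(theta)**n is a batwing profile for n > 0.
def irradiance(i0: float, theta_deg: float, r0: float, n: float = 0.0) -> float:
    c = math.cos(math.radians(theta_deg))
    intensity = i0 / c**n          # n = 0: uniform source; n = 3: batwing
    return intensity * c**3 / r0**2

center   = irradiance(1.0, 0, 1.0)        # on-axis irradiance
edge     = irradiance(1.0, 30, 1.0)       # uncompensated: cos^3 falloff at 30 deg
edge_bat = irradiance(1.0, 30, 1.0, n=3)  # batwing-compensated: matches the center
```

The uncompensated edge value drops to cos³(30°) ≈ 0.65 of the center, while the n = 3 batwing profile restores a flat field.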
In another embodiment provided by the present invention, as shown in fig. 5, on the profile width: the width of the profile is the convolution of the intensity profile of the light source with the collimated-beam response of the diffuser; the wider the input divergence angle into the diffuser, the wider the width and the slower the transition slope. The following two criteria can be used to judge whether such losses are acceptable. Optical efficiency — the enclosed energy within the FOV of the imaging lens, which specifies how much energy the imaging module will receive — is defined as: optical efficiency = (2D integrated optical power within the lens FOV)/(2D integrated optical power of the entire illumination profile) × diffuser transmission efficiency;
the optical power drop outside the FOI is defined as: optical power drop outside FOI = (total integral of the illumination profile − integral of the illumination profile inside the FOI)/(total integral of the illumination profile). The optical efficiency can be improved by placing a collimating lens between the light source and the diffuser to reduce the input angle into the diffuser, or by selecting a light source with a smaller divergence angle.
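The two figures of merit defined above reduce to simple ratios once the profile has been integrated. The following Python sketch uses hypothetical integrated-power values (0.92, 0.95, and a 90% diffuser transmission) purely for illustration:

```python
# Optical efficiency and power drop outside the FOI, from pre-integrated powers.
def optical_efficiency(power_in_fov: float, power_total: float,
                       diffuser_transmission: float) -> float:
    """(integrated power inside the lens FOV / total profile power) * diffuser transmission."""
    return power_in_fov / power_total * diffuser_transmission

def power_drop_outside_foi(power_total: float, power_in_foi: float) -> float:
    """Fraction of the illumination profile's power that falls outside the FOI."""
    return (power_total - power_in_foi) / power_total

eff  = optical_efficiency(0.92, 1.0, 0.90)   # hypothetical numbers -> 0.828
drop = power_drop_outside_foi(1.0, 0.95)     # hypothetical numbers -> 0.05
```

A collimating lens before the diffuser, or a source with a smaller divergence angle, raises `power_in_fov` relative to `power_total` and so improves the efficiency, as the paragraph above notes.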
In another embodiment provided by the present invention, as shown in fig. 6, the efficiency and uniformity of light collection onto the pixel array through the microlens array 7 greatly affect overall performance. The light-collection efficiency is proportional to 1/(f/#)², where f/# = (focal length)/(aperture size); an optical system with a small f/# requires trade-offs, since as the aperture size increases, more halo and aberrations occur. Relative Illuminance (RI) and Chief Ray Angle (CRA): RI = (illuminance at a point on the sensor)/(maximum illuminance within the field of view). In a lens system without distortion and halo, the sensor illuminance falls off according to the cos⁴(q) law, where q is the CRA at the sensor plane; the result is an image that darkens toward the sensor boundary, and the falloff in irradiance can be reduced by introducing negative distortion in the lens system. The maximum CRA at the sensor edge should be optimized according to the imager's microlens array specification; a smaller CRA helps reduce the bandwidth of the BPF and thus achieves better resistance to ambient-light interference. The lens system of example 1 in fig. 6 has a large CRA, and the imaging cone gradually narrows (i.e., f/# increases) as the field angle increases, so the corresponding RI drops significantly with field angle, as shown in the corresponding RI diagram; example 2 in fig. 6 shows that RI can be well maintained by minimizing the CRA and keeping f/# uniform across the field of view.
The efficiency and uniformity of light collection can also be degraded by stray light, i.e. any unwanted light detected by the sensor, which can come from sources inside or outside the field of view; ghosts (e.g., lens flare) can be formed by an even number of reflections, and stray light can also come from the opto-mechanical structure and any scattering surface. The ToF system is particularly sensitive to stray light, because the multipath character of stray light leads to different optical path lengths reaching a pixel and thus to inaccurate depth measurement. The factors that influence stray light are: halo (rays cut off at the aperture will typically bounce within the lens system and easily cause stray-light problems); the antireflection coating (coatings on the optical elements reduce the reflectivity of the individual surfaces and thus effectively reduce the effect of lens reflections on the depth calculation; the coating should be carefully designed for the light-source wavelength range and the range of incidence angles on the lens surfaces); the number of lenses (although adding more lenses gives more freedom to achieve the design specification and higher-resolution image quality, it also adds unavoidable back-reflections from the lenses and increases complexity and cost); the band-pass filter (the BPF cuts off ambient light and is critical for the ToF system; the BPF design should be tailored, for optimal performance, to parameters such as the f/# and CRA across the field, light-source parameters such as bandwidth, nominal wavelength tolerance and thermal shift, and a substrate material with low drift of wavelength over incidence angle and temperature); and the microlens array (which directs the light reaching each pixel onto its photosensitive area and should be matched to the sensor geometry to maximize the light absorbed by the pixels).
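The two rules of thumb used in this embodiment — collection scaling as 1/(f/#)² and the cos⁴ illuminance falloff with CRA — can be sketched directly. This is an illustrative calculation, not from the patent; the f/2.0 reference is an assumption:

```python
import math

# Light collection ~ 1/(f/#)^2, and relative illuminance RI = cos(q)^4
# for chief ray angle q at the sensor plane (no distortion, no halo).
def relative_collection(f_number: float, ref_f_number: float = 2.0) -> float:
    """Collection efficiency relative to a hypothetical f/2.0 reference lens."""
    return (ref_f_number / f_number) ** 2

def relative_illuminance(cra_deg: float) -> float:
    """RI at a sensor point whose chief ray angle is cra_deg degrees."""
    return math.cos(math.radians(cra_deg)) ** 4

gain  = relative_collection(1.4)    # opening up to f/1.4 collects ~2x the light
ri_30 = relative_illuminance(30)    # a 30 deg CRA already loses ~44% at the edge
```

This is why fig. 6's example 2 (small CRA, uniform f/# across the field) holds RI nearly flat, while example 1's growing CRA and f/# darken the image toward the sensor boundary.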
A method for measuring the depth of field of a mobile phone camera comprises the following steps: light from the light source 3 inside the illumination module 1 is backscattered by objects in the camera's field of view; the reflected light is received by the imager 6 and the microlens array 7 of the imaging module 2, and the phase shift between the transmitted and received waveforms is measured. The phase shift is obtained by photon-mixing demodulation within the pixel, measuring the correlation between the transmitted and received waveforms at different relative delays; by measuring the phase shift at several modulation frequencies, the depth value of each pixel can be calculated.
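The phase-to-depth step above can be sketched with the textbook continuous-wave ToF relation d = c·φ/(4π·f_mod) — this formula and the 100 MHz modulation frequency are assumptions for illustration, not values stated in the patent:

```python
import math

# CW ToF: depth from the measured phase shift phi at modulation frequency f_mod,
# d = c * phi / (4 * pi * f_mod); the factor 4*pi covers the round trip.
C = 2.99792458e8  # speed of light [m/s]

def depth_from_phase(phase_rad: float, f_mod_hz: float) -> float:
    """Depth [m] corresponding to a wrapped phase shift at one modulation frequency."""
    return C * phase_rad / (4 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz: float) -> float:
    """Maximum depth before the phase wraps: c / (2 * f_mod)."""
    return C / (2 * f_mod_hz)

d = depth_from_phase(math.pi, 100e6)  # half a cycle at 100 MHz -> ~0.75 m
```

Because the measured phase wraps every `unambiguous_range(f_mod)` metres, measuring the phase at several modulation frequencies, as the method describes, disambiguates depths beyond a single frequency's range.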
Working principle: when the optical device for measuring the depth of field of a mobile phone camera is used, light from the light source 3 in the illumination module 1 is backscattered by objects in the camera's field of view; the imager 6 and the microlens array 7 of the imaging module 2 receive the reflected light, and the phase shift between the transmitted waveform and the received, reflected waveform is measured. By measuring the phase shift at several modulation frequencies, the depth value of each pixel can be calculated; the phase shift is obtained by photon-mixing demodulation within the pixel, measuring the correlation between the transmitted and received waveforms at different relative delays. The light source 3 is a narrow-band light source whose wavelength has a low dependence on temperature; the driver 12 is a Vertical Cavity Surface Emitting Laser (VCSEL) or an Edge Emitting Laser (EEL). The VCSEL has low cost, a small outline and high reliability and is increasingly popular in the ToF module of mobile phone camera imaging; compared with an EEL (which emits light from the side) and an LED (which emits light from the side and the top), the beam emitted from a VCSEL is perpendicular to its surface, so the production yield is higher and the manufacturing cost lower.
While certain exemplary embodiments of the present invention have been described above by way of illustration only, it will be apparent to those of ordinary skill in the art that modifications may be made to the described embodiments in various different ways without departing from the spirit and scope of the invention. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive of the scope of the invention, which is defined by the appended claims.

Claims (10)

1. An optical device for measuring the depth of field of a mobile phone camera, comprising an illumination module (1) and an imaging module (2), characterized in that the illumination module (1) comprises a first housing (10); a light source (3) is arranged inside the first housing (10), a driver (12) is arranged inside the light source (3), and a diffuser (11) is arranged on the outer wall of the top of the first housing (10); the imaging module (2) comprises a second housing (8) and a bottom plate (9); the second housing (8) covers the outside of the bottom plate (9), an imaging lens (4) is inserted into the second housing (8), a band-pass filter (5) is arranged on the outer wall of the bottom of the second housing (8), an imager (6) is arranged inside the bottom plate (9), and a microlens array (7) is arranged inside the imager (6).
2. The optical device for measuring the depth of field of a mobile phone camera according to claim 1, characterized in that the light source (3) is a narrow-band light source whose wavelength has a low dependence on temperature, and the driver (12) is a Vertical Cavity Surface Emitting Laser (VCSEL) or an Edge Emitting Laser (EEL).
3. The optical device for measuring the depth of field of a mobile phone camera according to claim 1, characterized in that, because the wavelength of the light affects ToF performance, the light source (3) uses a wavelength of 850nm or 940nm, selected according to the following considerations: sensor quantum efficiency and responsivity (QE measures the ability of a photodetector to convert photons to electrons; R measures the ability of a photodetector to convert optical power to current), human perception, and sunlight.
4. The optical device for measuring the depth of field of a mobile phone camera according to claim 3, characterized in that, on the sensor quantum efficiency and responsivity: Quantum Efficiency (QE) and responsivity (R) are interrelated. QE measures the ability of a photodetector to convert photons to electrons, and R measures the ability of a photodetector to convert optical power to current: QE = (number of electrons collected)/(number of photons impinging on the photodetector) (%); R = (current from the photodetector)/(optical power on the photodetector) (A/W); R = QE × qλ/(hc), where q is the electron charge, h is the Planck constant, c is the speed of light, and λ is the wavelength;
the QE of a silicon-based sensor is more than 2 times better at 850nm than at 940nm; for example, the ADI CW ToF sensor has a QE of 44% at 850nm and only 27% at 940nm.
5. The optical device for measuring the depth of field of a mobile phone camera according to claim 3, characterized in that, regarding human eye perception: the human eye can perceive 850 nm light but cannot see 940 nm light; regarding sunlight: when the device is used outdoors, atmospheric absorption reduces the solar radiation intensity in the 920 nm to 960 nm band, so that the intensity around 940 nm is less than half of that around 850 nm; therefore, operating the ToF system at 940 nm in outdoor applications provides better resistance to ambient-light interference and better depth-sensing performance.
6. The optical device for measuring the depth of field of a mobile phone camera according to claim 1, characterized in that the radiant intensity of the light source (3) is one of the factors affecting the signal-to-noise ratio of the ToF system (Table 1), and the beam profile integrity of the light source (3) should satisfy the following characteristics: the illumination profile shape within the field of illumination (FOI), the profile width, the optical efficiency (i.e., the enclosed energy within a particular FOV), and the optical power drop outside the FOI; the radiant intensity satisfies I = dΦ/dΩ, where dΦ is the optical power incident into the solid angle dΩ.
7. The optical device for measuring the depth of field of a mobile phone camera according to claim 6, characterized in that, regarding the illumination profile shape within the FOI: the most common radiant intensity distribution in ToF flood illumination is the batwing shape, whose profile varies as cos⁻ⁿ(θ) to compensate for the attenuation of the imaging lens; the cos³(θ) reduction factor of the irradiance E [W/m²] between the target center and the target edge is defined by
E = dΦ/dA = I(θ) · cos(θ) / R(θ)² = I(θ) · cos³(θ) / R₀²,
where E is the irradiance, dA is the surface area illuminated by the optical power dΦ, R(θ) is the distance between the light source (3) and dA, and dΩ = dA · cos(θ) / R(θ)².
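The cos³(θ) irradiance falloff and its batwing compensation described in claim 7 can be illustrated with a short sketch (the numbers are hypothetical; a flat target at on-axis distance R₀ is assumed):

```python
import math

def irradiance(intensity_w_sr, theta_deg, r0_m):
    """E = I(theta) * cos^3(theta) / R0^2 for a flat target at on-axis distance R0."""
    th = math.radians(theta_deg)
    return intensity_w_sr * math.cos(th) ** 3 / r0_m ** 2

def batwing_intensity(i0, theta_deg, n=3):
    """Batwing profile I(theta) = I0 / cos^n(theta); with n = 3 the product
    I(theta) * cos^3(theta) is constant, giving uniform target irradiance."""
    return i0 / math.cos(math.radians(theta_deg)) ** n

# Uniform (non-batwing) source: at 30 deg off-axis the irradiance falls to cos^3(30) ~ 65%
e_center = irradiance(1.0, 0.0, 1.0)
e_edge = irradiance(1.0, 30.0, 1.0)

# Batwing-compensated source: edge irradiance matches the center
e_comp = irradiance(batwing_intensity(1.0, 30.0), 30.0, 1.0)
```

This is why the batwing shape is preferred for flood illumination: the cos⁻³(θ) intensity rise exactly cancels the cos³(θ) geometric falloff.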
8. The optical device for measuring the depth of field of a mobile phone camera according to claim 6, characterized in that, regarding the profile width: the width of the profile is the convolution of the intensity profile of the light source with the collimated-beam response of the diffuser; the wider the input divergence angle of the diffuser, the wider the profile and the slower the transition slope; the following two criteria can be used to judge whether such losses are acceptable:
the optical efficiency, i.e., the enclosed energy within the FOV of the imaging lens, which defines how much energy the imaging module will receive, is defined as: optical efficiency = (2D integrated optical power within the lens FOV / 2D integrated optical power of the entire illumination profile) × diffuser transmission efficiency;
the optical power drop outside the FOI is defined as: optical power drop outside FOI = (total integral of the illumination profile − integral of the illumination profile within the FOI) / total integral of the illumination profile.
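The two figures of merit in claim 8 can be sketched on a discretized illumination profile. All values below (profile samples, FOI/FOV half-angles, diffuser transmission) are hypothetical, chosen only to show how the two ratios are computed:

```python
# Hypothetical 1D illumination profile: angle [deg] -> optical power [a.u.]
profile = {-40: 0.1, -30: 0.8, -20: 1.0, -10: 1.0, 0: 1.0,
           10: 1.0, 20: 1.0, 30: 0.8, 40: 0.1}
FOI_HALF_ANGLE = 30    # field of illumination half-angle [deg] (assumed)
LENS_FOV_HALF = 30     # imaging lens half field of view [deg] (assumed)
DIFFUSER_T = 0.9       # diffuser transmission efficiency (assumed)

total = sum(profile.values())
inside_fov = sum(p for ang, p in profile.items() if abs(ang) <= LENS_FOV_HALF)
inside_foi = sum(p for ang, p in profile.items() if abs(ang) <= FOI_HALF_ANGLE)

# Optical efficiency: energy enclosed in the lens FOV, scaled by diffuser loss
optical_efficiency = inside_fov / total * DIFFUSER_T

# Fraction of emitted power wasted outside the FOI
power_drop_outside_foi = (total - inside_foi) / total
```

A wider diffuser divergence widens the profile tails, lowering `optical_efficiency` and raising `power_drop_outside_foi`, which is exactly the trade-off the claim describes.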
9. The optical device for measuring the depth of field of a mobile phone camera according to claim 1, characterized in that the efficiency and uniformity of light collection on the pixel array of the microlens array (7) strongly affect overall performance; the light collection efficiency is proportional to 1/(f/#)², where f/# = (focal length)/(aperture diameter); the efficiency and uniformity of light collection are degraded by stray light, and the factors affecting stray light are: lens flare, the anti-reflection coating, the number of lens elements, the band-pass filter, and the microlens array.
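The 1/(f/#)² scaling of light collection in claim 9 can be made concrete with a small sketch (the example apertures are illustrative, not from the patent):

```python
def f_number(focal_length_mm, aperture_mm):
    """f/# = focal length / aperture diameter."""
    return focal_length_mm / aperture_mm

def relative_collection(fnum, ref_fnum=1.0):
    """Light collection relative to a reference f-number, proportional to 1/(f/#)^2."""
    return (ref_fnum / fnum) ** 2

# Opening the aperture from f/2.0 to f/1.4 roughly doubles the collected light
gain = relative_collection(1.4) / relative_collection(2.0)  # ~2.04
```

This is why ToF imaging lenses favor low f-numbers: halving the f-number quadruples the light reaching the pixel array, directly improving the depth signal-to-noise ratio.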
10. A method for measuring the depth of field of a mobile phone camera, using the optical device for measuring the depth of field of a mobile phone camera according to any one of claims 1-9, and comprising the following steps:
light from the light source (3) in the illumination module (1) is back-scattered by objects in the camera's field of view; the reflected light is received by the imager (6) and the microlens array (7) of the imaging module (2); the phase shift between the transmitted waveform and the received reflected waveform is then measured; by measuring the phase shift at multiple modulation frequencies, the depth value of each pixel can be calculated; the phase shift is obtained by photonic mixing demodulation within the pixel, which measures the correlation between the transmitted and received waveforms at different relative delays, from which the depth value is obtained.
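The demodulation and phase-to-depth steps above can be sketched as follows. The 4-phase correlation convention C_k = A·cos(φ + θ_k) is an assumption for illustration and not taken from the patent; sign conventions vary between sensors:

```python
import math

C_LIGHT = 2.99792458e8  # speed of light [m/s]

def phase_from_correlations(c0, c90, c180, c270):
    """4-phase demodulation, assuming correlation samples C_k = A*cos(phi + theta_k):
    phase = atan2(C270 - C90, C0 - C180), wrapped to [0, 2*pi)."""
    return math.atan2(c270 - c90, c0 - c180) % (2 * math.pi)

def depth_from_phase(phase_rad, f_mod_hz):
    """d = c * phase / (4 * pi * f_mod); unambiguous only within c / (2 * f_mod)."""
    return C_LIGHT * phase_rad / (4 * math.pi * f_mod_hz)

# A pi-radian phase shift at 50 MHz modulation corresponds to ~1.5 m of depth
d = depth_from_phase(math.pi, 50e6)  # ~1.499 m
```

Measuring the wrapped phase at several modulation frequencies, as the claim describes, lets the per-frequency ambiguities be resolved jointly and extends the unambiguous depth range.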
CN202311557830.XA 2023-11-21 2023-11-21 Optical device and method for measuring depth of field of mobile phone camera Pending CN117528063A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311557830.XA CN117528063A (en) 2023-11-21 2023-11-21 Optical device and method for measuring depth of field of mobile phone camera


Publications (1)

Publication Number Publication Date
CN117528063A true CN117528063A (en) 2024-02-06

Family

ID=89758340


Country Status (1)

Country Link
CN (1) CN117528063A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination