CN112600995A - Camera assembly, calibration method thereof and electronic equipment - Google Patents


Publication number
CN112600995A
Authority
CN
China
Prior art keywords: light, gray, module, sensors, camera assembly
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011412014.6A
Other languages
Chinese (zh)
Inventor
陈伟 (Chen Wei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011412014.6A
Publication of CN112600995A

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 — Constructional details
    • H04N23/55 — Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N25/00 — Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60 — Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/61 — Noise processing where the noise originates only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • H04N25/63 — Noise processing applied to dark current

Abstract

The disclosure relates to the technical field of electronic equipment, and in particular to a camera assembly, a calibration method thereof, and electronic equipment. The camera assembly includes a lens module, a light splitting module, and a sensor module; the light splitting module is arranged on the light-emitting side of the lens module and splits the light incident from the lens module into N light beams of different colors; the sensor module includes N grayscale sensors, which correspond one-to-one to the light beams split by the light splitting module, and each grayscale sensor receives the corresponding light beam. The imaging quality of the electronic equipment can thereby be improved.

Description

Camera assembly, calibration method thereof and electronic equipment
Technical Field
The disclosure relates to the technical field of electronic equipment, and in particular to a camera assembly, a calibration method thereof, and electronic equipment.
Background
A camera of an electronic device generally includes a lens for collecting light and a sensor for receiving the light and generating an image signal. Color imaging is currently achieved mainly by providing a color filter array, such as an RGB filter array, in the image sensor. A single pixel unit often includes sub-pixel units of multiple colors arranged in a staggered manner, so a sub-pixel unit reading the information of its target color often suffers interference from the information of other colors. That is, noise signals exist in the image signals acquired by the image sensor, causing noise and color cast in the images.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a camera assembly, a calibration method thereof, and an electronic device, so as to reduce noise signals in signals collected by an image sensor at least to a certain extent.
According to an aspect of the present disclosure, there is provided a camera assembly, including:
a lens module;
the light splitting module is arranged on the light emitting side of the lens module and splits the light rays incident from the lens module into N light beams with different wave bands;
the sensor module comprises N gray sensors, the gray sensors correspond to the light beams split by the light splitting module one by one, and the gray sensors are used for receiving the corresponding light beams;
wherein N is a positive integer greater than or equal to 2.
According to a second aspect of the present disclosure, there is provided a camera assembly calibration method for the camera assembly described above, the method including:
controlling the N grayscale sensors to respectively acquire calibration images;
and adjusting the poses of the grayscale sensors according to the calibration images, so that the calibration images acquired by the N grayscale sensors become the same.
According to a third aspect of the present disclosure, there is provided an electronic apparatus including the camera assembly described above.
In the camera assembly provided by the present disclosure, the light splitting module splits the light incident from the lens module into N light beams of different colors and transmits them to the corresponding grayscale sensors, so that the light entering each grayscale sensor is monochromatic. This prevents the light received by a single pixel unit from being affected by light of other colors, thereby reducing, at least to a certain extent, the noise signals in the signals collected by the image sensor, reducing noise and color cast in the image, and improving the imaging quality of the electronic equipment.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 is a schematic view of a first camera assembly provided in an exemplary embodiment of the present disclosure;
FIG. 2 is a block diagram of a camera assembly provided by an exemplary embodiment of the present disclosure;
FIG. 3 is a schematic view of a second camera assembly provided by an exemplary embodiment of the present disclosure;
FIG. 4 is a schematic view of a third camera assembly provided by an exemplary embodiment of the present disclosure;
FIG. 5 is a schematic view of a fourth camera assembly provided by an exemplary embodiment of the present disclosure;
FIG. 6 is a schematic view of a fifth camera assembly provided by an exemplary embodiment of the present disclosure;
FIG. 7 is a schematic view of a first sensor module provided by an exemplary embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a single photon avalanche diode provided by an exemplary embodiment of the present disclosure;
FIG. 9 is a schematic view of a second sensor module provided by an exemplary embodiment of the present disclosure;
FIG. 10 is a flowchart of a camera assembly calibration method provided by an exemplary embodiment of the present disclosure;
FIG. 11 is a schematic view of an electronic device provided by an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus their detailed description will be omitted.
Although relative terms, such as "upper" and "lower," may be used in this specification to describe one element of a figure relative to another, these terms are used for convenience only, e.g., in accordance with the orientation of the examples described in the figures. It will be appreciated that if the device in a figure were turned upside down, the element described as "upper" would become the "lower" element. When a structure is "on" another structure, it may mean that the structure is integrally formed with the other structure, or that the structure is "directly" disposed on the other structure, or that the structure is "indirectly" disposed on the other structure via a third structure.
The disclosed embodiments first provide a camera assembly 10. As shown in fig. 1, the camera assembly 10 includes a lens module 100, a light splitting module 300, and a sensor module 400. The light splitting module 300 is arranged on the light-emitting side of the lens module 100 and splits the light incident from the lens module 100 into N light beams of different colors. The sensor module 400 includes N grayscale (mono) sensors 410; the grayscale sensors 410 correspond one-to-one to the light beams split by the light splitting module 300, and each grayscale sensor 410 receives the corresponding light beam, where N is a positive integer greater than or equal to 2.
In the camera assembly 10 provided by the present disclosure, the light splitting module 300 splits the light incident from the lens module 100 into N light beams of different wave bands and transmits them to the corresponding grayscale sensors 410, so that the light entering each grayscale sensor 410 is monochromatic. This prevents the light received by a single pixel unit from being affected by light of other colors, thereby reducing, at least to a certain extent, the noise signals in the signals collected by the image sensor, reducing noise and color cast in the image, and improving the imaging quality of the electronic equipment.
Further, as shown in fig. 2, the camera assembly 10 provided in the embodiment of the present disclosure may further include a control module 200, where the control module 200 is connected to the N grayscale sensors 410, and the control module 200 is configured to synthesize a color image according to the grayscale images collected by the N grayscale sensors 410 and the color of the light beam corresponding to each grayscale sensor 410.
As shown in fig. 3, the camera assembly 10 provided in the embodiment of the present disclosure may further include a driving module 500, where the driving module 500 is respectively connected to the N grayscale sensors 410, and the driving module 500 is configured to adjust positions of the grayscale sensors 410 to achieve focusing and anti-shake of the camera assembly 10.
Portions of the camera assembly 10 provided by the embodiments of the present disclosure will be described in detail below:
the lens module 100 may include a plurality of optical lenses, which are sequentially arranged on the light-entering side of the light-splitting module 300. The optical axes of the plurality of optical lenses may be coaxially arranged, and the plurality of optical lenses may include various lens combinations such as a concave lens, a convex lens, a plane mirror, and the like. The plurality of optical lenses may be plastic lenses or glass lenses; or a part of the optical lenses in the plurality of optical lenses are plastic lenses and the other part of the optical lenses are glass lenses. The plurality of optical lenses may be spherical lenses, aspherical lenses, or the like.
For example, the lens module 100 may include a first lens, a second lens, a third lens, and a fourth lens. The first lens has a convex surface facing the light-entering side; the second lens is arranged on the side of the first lens away from the light-entering side and has a concave surface on the side close to the first lens; the third lens is arranged on the side of the second lens away from the first lens, and both of its surfaces are aspheric; the fourth lens is arranged on the side of the third lens away from the second lens, and both of its surfaces are aspheric. The side of the third lens close to the second lens has a concave surface at the optical axis, and the side close to the fourth lens has a convex surface at the optical axis; the side of the fourth lens close to the third lens has a concave surface at the optical axis, and the side away from the third lens has a concave surface at the optical axis.
The first lens has a convex surface facing the light entrance side at the optical axis and has positive power. The second lens has a concave surface at the optical axis on a side close to the first lens, and has a negative power. The third lens has a concave surface facing the second lens side near the optical axis and has negative optical power. The fourth lens element has a concave surface facing the image side near the optical axis and has negative refractive power, and the image side surface of the fourth lens element is formed as an aspherical surface having a pole at a position other than the optical axis.
The first lens has positive power and is shaped such that a convex surface faces the object side in the vicinity of the optical axis; spherical aberration, curvature of field, and distortion can therefore be corrected well. The second lens has negative power and is shaped such that a concave surface faces the first lens side in the vicinity of the optical axis, giving it a meniscus shape; spherical aberration, curvature of field, and distortion can therefore be corrected well. The third lens has positive power and is shaped such that a concave surface faces the second lens side and a convex surface faces the image side in the vicinity of the optical axis; the incident angle of the light rays onto the third lens thus takes an appropriate value, and chromatic aberration, curvature of field, and distortion can be corrected well. The fourth lens has negative power and is shaped such that the convex surface faces the third lens side near the optical axis and the concave surface faces the sensor module 400; chromatic aberration, astigmatism, curvature of field, and distortion can therefore be corrected well. The object-side surface and the image-side surface of the fourth lens are formed as aspherical surfaces having poles at positions other than the optical axis, so curvature of field and distortion are better corrected, and the incident angle of the light onto the lens assembly can be appropriately controlled.
In addition, the combination of the plurality of optical lenses may be any one of 4P (four plastic lenses), 4G (four glass lenses), 3P+1G, 2P+2G, and 1P+3G. Of course, in practical applications, the number of optical lenses in the lens module 100 provided in the embodiments of the disclosure may also be three, five, or six; the disclosure is not limited in this respect.
In an exemplary embodiment of the disclosure, the light beam entering the lens module 100 may be a light beam generated by converging a composite light, where the composite light refers to a light beam composed of lights with different wavelength ranges, and the composite light includes white light, natural light, and the like.
The light splitting module 300 is disposed on the light emitting side of the lens module 100, and the light splitting module 300 splits the light incident from the lens module 100 into N light beams with different colors. For example, the light splitting module 300 may split incident natural light into RGB three colors of light, and transmit the three colors of light to the corresponding grayscale sensors 410.
As shown in fig. 4 and 5, the light splitting module 300 may include a reflector 310 and N filter films 320, where the N filter films 320 and the N gray-scale sensors 410 are in one-to-one correspondence and respectively disposed on light-entering sides of the corresponding gray-scale sensors 410; the reflector 310 has N preset deflection states, and the reflector 310 transmits the light incident from the lens module 100 to a filter 320 when in each preset deflection state.
Each filter film 320 may be of one color; the filter film 320 transmits light of a predetermined color and filters out light of other colors. For example, the red filter film 320 transmits red light and filters out light other than red light; the green filter film 320 transmits green light and filters out light other than green light; and the blue filter film 320 transmits blue light and filters out light other than blue light.
The reflecting surface of the reflecting mirror 310 may intersect the optical axis of the lens module 100; for example, in the preset deflection states, the included angle between the reflecting surface of the reflecting mirror 310 and the optical axis of the lens module 100 may be 45 degrees. When the incident light is divided into the three RGB colors, the reflector 310 may have three preset deflection states.
Each filter film 320 may correspond to one deflection state of the mirror 310, and the filter film 320 may be perpendicular to the light beam reflected by the mirror 310 in the corresponding deflection state. The filter films 320 may be distributed along the same circumference, for example uniformly, with the rotation axis of the reflector 310 at the center of the circle. When the light splitting module 300 includes three filter films 320, the three filter films 320 are distributed along the same circumference, and the central angle between two adjacent filter films 320 may be 120 degrees.
The light splitting module 300 further includes a driving motor (not shown) connected to the mirror 310 for driving the mirror 310 to switch among the N preset deflection states. The driving motor may be coupled to the mirror 310 via a transmission, and the driving motor and gear train can drive the mirror 310 to rotate in any direction. For example, a universal joint may be disposed at the back of the reflector 310, and the driving motor may include a first motor, a second motor, and a third motor: the first motor cooperates with a limiting device to drive the reflector 310 into the first preset deflection state, the second motor cooperates with a limiting device to drive it into the second preset deflection state, and the third motor cooperates with a limiting device to drive it into the third preset deflection state.
The N filter films 320 and the mirror 310 have the same rotation axis, each filter film 320 corresponds to one gray-scale sensor 410, the gray-scale sensor 410 is disposed on the light-emitting side of the filter film 320, and the light-entering surface of the gray-scale sensor 410 can be parallel to the filter film 320.
It should be noted that when the light splitting module 300 includes the mirror 310 and N filter films 320, the light splitting module described in the embodiments of the present disclosure splits the incident light into N beams in time; this is referred to as temporal (time-division) beam splitting. The acquisition of one frame can be divided into N time segments, with the mirror held in one preset deflection state during each time segment.
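As a sketch of this time-division scheme, the loop below steps a mirror driver through its N preset deflection states within one frame time. All names, the callback interface, and the timing values are illustrative assumptions, not part of the disclosure:

```python
# Time-division beam splitting: one frame is acquired as N sequential
# grayscale exposures, with the mirror held in one preset deflection
# state per time segment. All names and timings here are illustrative.

FRAME_TIME_S = 1 / 30                  # hypothetical 30 fps frame budget
CHANNELS = ["red", "green", "blue"]    # N = 3 preset deflection states

def capture_frame(set_mirror_state, expose_sensor):
    """set_mirror_state(i) drives the mirror to the i-th deflection state;
    expose_sensor(i, t) exposes the i-th grayscale sensor for t seconds
    and returns its grayscale image."""
    segment = FRAME_TIME_S / len(CHANNELS)   # equal split of the frame
    frame = {}
    for i, color in enumerate(CHANNELS):
        set_mirror_state(i)        # mirror now reflects toward filter i
        frame[color] = expose_sensor(i, segment)
    return frame

# Dummy stand-ins to exercise the loop without hardware:
visited_states = []
frame = capture_frame(
    set_mirror_state=visited_states.append,
    expose_sensor=lambda i, t: [[i]],   # placeholder 1x1 "image"
)
```

With real hardware, `set_mirror_state` would wrap the motor/limiting-device control and `expose_sensor` the sensor readout; the structure of the loop is what matters here.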
Alternatively, as shown in fig. 6, the light splitting module 300 includes N light splitting prisms 331 and N-1 light splitting films 332, where the N light splitting prisms 331 are sequentially disposed on the light-emitting side of the lens module 100. A light splitting film 332 is arranged between each two adjacent prisms of the N light splitting prisms 331, and the N-1 light splitting films 332 filter the incident light beam entering the light splitting module so as to split it into N-1 reflected light beams and one transmitted light beam.
The colors of the N-1 light splitting films 332 are different. Illustratively, the light beam entering the first-stage light splitting prism 331 strikes the light splitting film 332 of the first color, which reflects the first-color light and transmits the beam excluding the first-color light; that beam strikes the light splitting film 332 of the second color, which reflects the second-color light and transmits the beam excluding the first-color and second-color light; and so on, until the beam excluding the first N-2 reflected colors strikes the (N-1)-th light splitting film 332, which reflects the (N-1)-th-color light and transmits light of one remaining color. The first-color reflected light, the second-color reflected light, …, the (N-1)-th-color reflected light, and the transmitted light correspond one-to-one to the N colors.
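The cascade just described can be modeled as repeated splitting of the spectrum: each film removes one color from the transmitted beam and passes the remainder to the next stage. A toy sketch, in which the color names and the dict-of-intensities beam representation are illustrative assumptions:

```python
def cascade_split(beam, film_colors):
    """beam: dict mapping color name -> intensity of the incident light.
    film_colors: the N-1 colors reflected by the successive films.
    Returns (reflected_beams, transmitted_beam): each film reflects its
    own color out of the beam and transmits the rest onward."""
    remaining = dict(beam)   # copy so the incident beam is not mutated
    reflected = []
    for color in film_colors:
        # The film of this color reflects that component...
        reflected.append({color: remaining.pop(color, 0.0)})
        # ...and what is left travels on to the next light splitting film.
    return reflected, remaining

white = {"red": 1.0, "green": 1.0, "blue": 1.0}
reflected, transmitted = cascade_split(white, ["red", "green"])
# reflected[0] carries red, reflected[1] carries green,
# and the final transmitted beam carries the remaining color (blue).
```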
The N light splitting prisms 331 are arranged in order along the irradiation direction of the light beam, starting from the beginning of that direction, yielding a first light splitting prism 331, a second light splitting prism 331, …, and an N-th light splitting prism 331 arranged in sequence. The light-sensing surface of the i-th grayscale sensor 410 of the N grayscale sensors 410 covers the light-emitting surface of the i-th light splitting prism 331.
The N light splitting prisms 331 are arranged in sequence along the irradiation direction of the light beam, and the beam passes through at least two surfaces of each of the N light splitting prisms 331; the surface at which the beam is split is the light splitting surface. For the first N-1 light splitting prisms 331, the light-entering surface and the light-emitting surface are distinct from the light splitting surface; the light splitting surface of each of the first N-1 light splitting prisms 331 is a surface through which the transmitted beam does not pass, while the light splitting surface of the N-th light splitting prism 331 is its light-emitting surface.
For the first N-1 light splitting prisms 331, the light splitting film 332 of the corresponding color is disposed on the light-emitting surface of the light splitting prism 331. The light reflected by the light splitting film 332 strikes the light-entering surface, and the first N-1 light splitting prisms 331 are configured, with respect to the direction of the light, the light-entering surface, and the light splitting surface, so that the light reflected by the light splitting film 332 is totally reflected at the light-entering surface. To achieve total reflection, an air gap may be provided between adjacent light splitting prisms 331. Of course, in practical applications, another medium of low refractive index may instead fill the space between adjacent light splitting prisms 331, or materials of different refractive indices may be selected for the light splitting prisms 331, with the refractive index increasing from the first light splitting prism 331 to the N-th light splitting prism 331.
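The air gap matters because total internal reflection at the light-entering surface occurs only beyond the critical angle given by Snell's law, θc = arcsin(n_gap / n_prism): a larger index contrast gives a smaller critical angle, so rays are totally reflected over a wider range of incidence angles. A quick numeric check with illustrative refractive indices (the specific index values are assumptions, not from the disclosure):

```python
import math

def critical_angle_deg(n_prism, n_gap):
    """Critical angle (degrees) for total internal reflection when light
    travels from a medium of index n_prism toward one of index n_gap.
    Beyond this incidence angle, all light is reflected back."""
    if n_gap >= n_prism:
        raise ValueError("no total internal reflection: n_gap >= n_prism")
    return math.degrees(math.asin(n_gap / n_prism))

# Illustrative values: a BK7-like glass prism (n ~ 1.52) against an air gap.
theta_air = critical_angle_deg(1.52, 1.00)      # about 41 degrees
# With a hypothetical low-index filler instead of air, the critical angle
# grows, so rays must strike the surface more steeply to be reflected:
theta_filler = critical_angle_deg(1.52, 1.38)   # about 65 degrees
```

This is why an air gap (or the lowest-index medium available) between adjacent prisms makes the total-reflection condition easiest to satisfy.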
For example, when the color reference is red-green-blue, N is 3. A white light beam is directed at the light splitting prisms 331, with a first, a second, and a third light splitting prism arranged in sequence along the irradiation direction of the beam; the light-emitting surface of the first light splitting prism is covered by a first grayscale sensor, that of the second by a second grayscale sensor, and that of the third by a third grayscale sensor. A light splitting film of a first color is arranged on the surface of the first light splitting prism adjacent to the second, or on the surface of the second light splitting prism adjacent to the first; a light splitting film of a second color is arranged on the surface of the second light splitting prism adjacent to the third, or on the surface of the third light splitting prism adjacent to the second. An air gap of preset width exists between the first light splitting prism and the second light splitting prism.
The two-color light splitting films 332 disposed on adjacent surfaces of the three light splitting prisms 331 include: a blue dichroic light splitting film for reflecting blue light, a red dichroic light splitting film for reflecting red light, and a green dichroic light splitting film for reflecting green light. The filter wavelengths of the blue, red, and green dichroic light splitting films may be set based on the respective wavelength ranges of blue light (from approximately 400 nm), red light, and green light.
It can be understood that, in the embodiments of the present disclosure, a long-wavelength-pass dichroic light splitting film and a short-wavelength-pass dichroic light splitting film may be used instead of the red dichroic light splitting film and the blue dichroic light splitting film, which is not limited by the embodiments of the present disclosure; the long-wavelength-pass film allows light of longer wavelengths to pass through, while the short-wavelength-pass film allows light of shorter wavelengths to pass through.
The sensor module 400 includes N grayscale sensors 410, the grayscale sensors 410 correspond to the light beams output by the light splitting module 300 one by one, and the grayscale sensors 410 are disposed on the light-emitting sides of the corresponding light splitting films 332. A photosensitive array may be included in the grayscale sensor 410 for converting light signals into electrical signals. The light sensing units in the light sensing array respond to the light signals to generate electric signals, the intensity of the electric signals is positively correlated with the light intensity, and therefore the gray scale of the corresponding pixels can be determined through the sensing current generated by the light sensing units.
The distribution of the light sensing arrays in the N grayscale sensors 410 may be aligned, that is, the light sensing unit in each grayscale sensor 410 in the N grayscale sensors 410 corresponds to each other, and the signals detected by the corresponding light sensing units in the N grayscale sensors 410 are overlapped to form a signal of one pixel unit. Of course, the color of the corresponding splitting film 332 is also considered in the superposition, which is the superposition of N colors and gray scales.
In practical applications, since each light splitting film 332 of a different color handles light of a known color, the grayscale sensors 410 can be numbered accordingly. For example, the red light splitting film 332 corresponds to the first grayscale sensor 410, the green light splitting film 332 to the second, and the blue light splitting film 332 to the third. The grayscale signals collected by the three grayscale sensors 410 can then be superimposed to form the final image; during superimposition, the corresponding pixels are color-synthesized according to the numbers of the grayscale sensors 410 to form a color image.
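The numbered-superposition step amounts to stacking the N pixel-aligned grayscale images as color channels of one image. A minimal sketch in plain Python; the channel names and the tuple-per-pixel image layout are illustrative assumptions, not the disclosed data format:

```python
def synthesize_color(gray_images, channel_order=("red", "green", "blue")):
    """Combine N pixel-aligned grayscale images into one color image.

    gray_images: dict mapping channel name -> 2-D list of gray values,
    one per grayscale sensor, where corresponding light sensing units
    line up across sensors (the one-to-one correspondence above).
    Returns the image as a 2-D list of (r, g, b) tuples."""
    channels = [gray_images[name] for name in channel_order]
    h, w = len(channels[0]), len(channels[0][0])
    for c in channels:
        if len(c) != h or any(len(row) != w for row in c):
            raise ValueError("grayscale images must be pixel-aligned")
    return [[tuple(c[y][x] for c in channels) for x in range(w)]
            for y in range(h)]

# Tiny 2x2 example: each sensor contributes one channel of every pixel.
gray = {
    "red":   [[255, 0], [0, 0]],
    "green": [[0, 255], [0, 0]],
    "blue":  [[0, 0], [255, 0]],
}
rgb = synthesize_color(gray)   # rgb[0][0] is a pure-red pixel
```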
Each of the grayscale sensors 410 may be connected to a circuit board on which a corresponding circuit is disposed. The circuitry on the circuit board may be connected to circuitry on a motherboard of the electronic device, such as a graphics processor, via a flexible circuit board. The driving module 500 may be connected to the gray sensor 410, for example, to a circuit board on which the gray sensor 410 is mounted.
The driving module 500 may include N motors, each connected to one grayscale sensor 410. Focusing and anti-shake of the camera assembly 10 can be achieved by the motors driving the corresponding grayscale sensors 410. During shooting, the motors may first be controlled to drive the plurality of grayscale sensors 410 through a joint calibration, after which focusing and anti-shake are performed.
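The joint calibration could, for instance, estimate each sensor's residual offset from a shared calibration image and let the motors cancel it. The brute-force integer-shift search below is an illustrative sketch under that assumption, not the algorithm disclosed for fig. 10:

```python
def estimate_offset(reference, image, max_shift=2):
    """Find the integer (dy, dx) shift of `image` relative to `reference`
    that minimizes the mean absolute difference over the overlap region.
    Both inputs are equally sized 2-D lists of gray values; the returned
    shift is what a motor would need to cancel to align the sensor."""
    h, w = len(reference), len(reference[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = n = 0
            for y in range(max(0, -dy), min(h, h - dy)):
                for x in range(max(0, -dx), min(w, w - dx)):
                    err += abs(reference[y][x] - image[y + dy][x + dx])
                    n += 1
            if n and err / n < best_err:
                best_err, best = err / n, (dy, dx)
    return best
```

Real calibration would use a richer similarity measure and sub-pixel refinement; the point is only that per-sensor pose corrections can be derived by comparing the N calibration images against a common reference.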
The driving module 500 may include a focusing anti-shake motor, and the focusing anti-shake motor may be connected to the lens module 100 for driving the lens module 100 to focus and prevent shake. The driving of the lens module 100 may be to drive the lens module 100 to move integrally, or to drive one or more optical lenses in the lens module 100 to move.
Under weak light conditions, an ordinary photodiode operating in the reverse-bias region cannot respond when the light intensity is too low, and the image sensor therefore images poorly, or not at all, under low illumination.
To address the above problem, the grayscale sensor 410 may be a single photon avalanche diode sensor: the grayscale sensor 410 includes a plurality of single photon avalanche diodes 41 configured to receive photons transmitted by the corresponding filter film 320 and generate a sensing signal.
The single photon avalanche diode 41 can respond to a single photon by generating a current: during operation, a single photon arriving in the active region is enough to trigger a saturated, large current signal. The grayscale sensor 410 may be a back-illuminated or a front-illuminated grayscale sensor.
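Since every avalanche is a binary single-photon event, such a sensor recovers gray levels by counting avalanche pulses over an exposure. The sketch below is an illustrative model rather than the disclosed readout: it treats photon arrivals as Poisson-distributed counts (dead time ignored) and shows that the accumulated count tracks relative intensity:

```python
import math
import random

def spad_counts(mean_photons, exposures, rng):
    """Simulate one SPAD pixel over many exposures. Photon arrivals are
    modeled as Poisson with the given mean per exposure; each photon in
    the active region triggers one saturated avalanche pulse, so the
    pulse count equals the photon count (dead time assumed zero)."""
    counts = []
    limit = math.exp(-mean_photons)
    for _ in range(exposures):
        # Knuth's method: multiply uniforms until falling below e^-mean.
        k, p = 0, 1.0
        while p > limit:
            k += 1
            p *= rng.random()
        counts.append(k - 1)
    return counts

rng = random.Random(42)
dim = spad_counts(2.0, 5000, rng)      # dim pixel: mean 2 photons/exposure
bright = spad_counts(8.0, 5000, rng)   # bright pixel: mean 8
ratio = sum(bright) / sum(dim)         # close to 4: counts track intensity
```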
As shown in fig. 7, the grayscale sensor 410 may be a back-illuminated grayscale sensor, and the single photon avalanche diode 41 includes a substrate 401, an avalanche layer 402, and a cathode layer 403. An anode region 4011 and a first accommodating portion 4012 are provided on the substrate 401, the first accommodating portion 4012 being located on the side of the anode region 4011 away from the light splitting module 300; the avalanche layer 402 is provided in the first accommodating portion 4012 of the substrate 401; and the cathode layer 403 is provided on the avalanche layer 402, on the side of the avalanche layer 402 remote from the anode region 4011.
As shown in fig. 8, the single photon avalanche diodes 41 can be separated by a guard ring 42. The guard ring 42 may be an insulating protective layer, which both insulates and isolates the single photon avalanche diodes 41 and physically protects them.
The guard rings 42 may be closed ring structures and one or more single photon avalanche diodes 41 may be disposed within one of the guard rings 42. When a plurality of single photon avalanche diodes 41 are provided in one guard ring 42, the plurality of single photon avalanche diodes 41 may be isolated by an isolation trench, for example, the plurality of single photon avalanche diodes 41 may be isolated by a Shallow trench isolation 43 (STI).
The single photon avalanche diode 41 with an n+/p-well structure provided in the embodiments of the present disclosure is only an exemplary illustration; the single photon avalanche diode 41 in the embodiments of the present disclosure may also be an avalanche photodiode with another n+/p-well structure, and the embodiments of the present disclosure are not limited thereto.
When the plurality of single photon avalanche diodes 41 are isolated by shallow trench isolation 43, as shown in fig. 6, the single photon avalanche diode 41 may further include a cathode diffusion layer 404 disposed between the avalanche layer 402 and the cathode layer 403. By adding the cathode diffusion layer 404 between the cathode layer 403 and the avalanche layer 402, the avalanche layer 402 is moved from the surface of the cathode layer 403 into a region away from that surface, which keeps the avalanche region away from the shallow trench isolation 43. A large number of trap levels at the Si-SiO2 interface of the shallow trench isolation 43 can trap carriers, and because the electric field of the avalanche layer 402 is strong, trapped carriers that are very close to the avalanche layer 402 can easily enter it and initiate avalanche ionization. This causes false breakdown of the device and ultimately an excessive dark count rate (DCR); the cathode diffusion layer 404 mitigates this problem.
For example, the first accommodating portion 4012 has a first opening 4013 on a side away from the anode region 4011 (the opening is located on a surface of the substrate 401). A stepped hole, which may be a stepped square hole or a stepped circular hole, is provided on the substrate 401. The avalanche layer 402 may be disposed at the bottom of the step hole, where the step hole is a blind hole, and the bottom of the step hole refers to an end of the step hole far away from the first opening 4013. The cathode diffusion layer 404 is disposed on a side of the avalanche layer 402 away from the bottom of the step hole, and a side of the cathode diffusion layer 404 away from the avalanche layer 402 may be exposed to the first opening 4013 of the step hole. The cathode layer 403 is embedded in the cathode diffusion layer 404, and the cathode layer 403 is exposed to the surface of the cathode diffusion layer 404 remote from the avalanche layer 402. In the surface where the avalanche layer 402 and the cathode diffusion layer 404 contact each other, the area of the contact surface of the cathode diffusion layer 404 is larger than the area of the contact surface of the avalanche layer 402. The face of the stepped hole in the substrate 401 where the first opening 4013 is located may extend to be flush with the surface of the cathode layer 403 remote from the avalanche layer 402. The side of the substrate 401 where the first opening 4013 of the stepped hole is located may extend to the bottom end of the shallow trench isolation 43. The bottom end of the shallow trench isolation 43 refers to the end of the substrate 401 where the shallow trench isolation 43 is embedded. The top end face of the shallow trench isolation 43 is flush with the top end face of the cathode diffusion layer 404.
The depth of the shallow trench isolation 43 is greater than the depth of the cathode layer 403, and the depth of the shallow trench isolation 43 is less than the depth of the cathode diffusion layer 404. Here, the depth refers to a distance of each device in a direction from the cathode layer 403 to the avalanche layer 402. The depth of the shallow trench isolation 43 may be 1 to 3 microns.
The shallow trench isolation 43 may be formed by depositing, patterning, and etching silicon through a silicon nitride mask and filling the trench with a deposited oxide. In the process of forming the shallow trench isolation 43, a silicon nitride layer may be deposited on the semiconductor substrate 401 first, and then the silicon nitride layer may be patterned to form a hard mask; then, the substrate 401 is etched to form a trench between the adjacent cathode diffusion layers 404; finally, the trench is filled with an oxide to form a device shallow trench isolation 43. As an example, the shallow trench isolation 43 may have a trapezoidal cross-sectional shape and the filled oxide may be silicon dioxide.
The cathode layer 403 and the cathode diffusion layer 404 are doped with a first type of dopant, and the avalanche layer 402 and the substrate 401 are doped with a second type of dopant. Illustratively, the cathode layer 403 may be a heavily n-doped semiconductor layer (e.g., a heavily n-doped silicon layer). The cathode diffusion layer 404 may be an n-doped semiconductor layer (such as n-type silicon) with a doping concentration less than that of the cathode layer 403. The avalanche layer 402 may be a heavily p-doped semiconductor layer (such as a heavily p-doped silicon layer). The substrate 401 may be a p-doped semiconductor layer (such as p-type silicon) with a doping concentration less than that of the avalanche layer 402.
In the embodiments of the present disclosure, an n+/p-well pn junction design is adopted. During avalanche breakdown of the n+/p-well, ionization is dominated by electrons, and since electron mobility is about 3 times hole mobility, electrons ionize more easily than holes. This improves the sensitivity of the image sensor, i.e., the photon detection efficiency is higher. A p-type substrate 401 is adopted, as is usual in CMOS processes, for two reasons: first, integrated circuits tend to mainly adopt NMOS transistors, which conduct by electrons, and under the same conditions electron mobility is about 3 times the hole mobility in a PMOS transistor; second, the p-type substrate 401 can directly serve as the body of an NMOS transistor, and the p-type silicon substrate 401 can be directly grounded, which reduces the bias voltage of the image sensor during operation and stably reduces the noise signal.
In back-illuminated single photon avalanche diodes 41 using n +/p-well technology, the avalanche region is mainly created by electron ionization in the p-well. The electron ionization probability is about 3 times higher than the hole ionization probability. The n +/p-well in the back-illuminated image sensor adopts electron avalanche ionization, the ionization rate is high, and the photon detection efficiency PDE is high.
When the gray-scale sensor is a back-illuminated image sensor, a signal acquisition circuit is arranged on one side of the single photon avalanche diode 41 array, which is far away from the light inlet side. The signal acquisition circuit is used for outputting the electric signals in the single-photon avalanche diode 41 array. For example, the electrical signals may be output row by row or column by column in a scanning manner.
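The scanning readout described above can be sketched as follows. This is an illustrative sketch only; the function name and the list-of-lists frame representation are assumptions, not the patent's circuit design.

```python
def read_out(frame, by="row"):
    """Sketch of a scanning readout: the acquisition circuit outputs the
    electrical signals of the SPAD array row by row (or column by column)
    as a single serial stream."""
    rows, cols = len(frame), len(frame[0])
    if by == "row":
        return [frame[r][c] for r in range(rows) for c in range(cols)]
    # Column-by-column scan: walk down each column before moving right.
    return [frame[r][c] for c in range(cols) for r in range(rows)]

frame = [[1, 2, 3],
         [4, 5, 6]]
row_stream = read_out(frame, by="row")      # [1, 2, 3, 4, 5, 6]
col_stream = read_out(frame, by="column")   # [1, 4, 2, 5, 3, 6]
```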
Alternatively, as shown in fig. 9, the gray sensor 410 may be a front-illuminated gray sensor. The single photon avalanche diode 41 may include a substrate 401; a cathode layer 403 arranged on the side of the substrate 401 close to the light splitting module 300, the cathode layer 403 being provided with a second accommodating portion; an avalanche layer 402 embedded in the second accommodating portion of the cathode layer 403, on the side far from the substrate 401; and an anode layer provided on the side of the avalanche layer 402 far from the substrate 401.
The plurality of single photon avalanche diodes 41 may be separated by a guard ring 42, the guard ring 42 may be an insulating protective layer, and the guard ring 42 may insulate and isolate the plurality of single photon avalanche diodes 41 on the one hand and protect the single photon avalanche diodes 41 on the other hand.
The guard rings 42 may be closed ring structures and one or more single photon avalanche diodes 41 may be disposed within one of the guard rings 42. When a plurality of single photon avalanche diodes 41 are provided in one guard ring 42, the plurality of single photon avalanche diodes 41 may be isolated by an isolation trench, for example, the plurality of single photon avalanche diodes 41 may be isolated by a Shallow trench isolation 43 (STI).
When a plurality of single photon avalanche diodes 41 are included in one guard ring 42, the avalanche layers 402 in any two adjacent single photon avalanche diodes 41 in the plurality of single photon avalanche diodes 41 in the same guard ring 42 are isolated by shallow trench isolation 43, and the depth of the shallow trench isolation 43 is greater than the depth of the anode layer and less than the depth of the avalanche layers 402.
The cathode layer 403 comprises a first type of dopant, the avalanche layer 402, the anode layer and the substrate 401 comprise a second type of dopant, and the doping concentration of the avalanche layer 402 is less than the doping concentration of the anode layer. Illustratively, the cathode layer 403 may be a heavily n-doped semiconductor layer, the cathode layer 403 forming an n-well. The anode layer may be a heavily p-doped semiconductor layer and the avalanche layer 402 may be a p-doped semiconductor, the avalanche layer 402 having a lower doping concentration than the anode layer.
The single photon avalanche diode 41 with a p+/n-well structure provided in the embodiments of the present disclosure is only an exemplary illustration; the embodiments of the present disclosure may also use avalanche photodiodes with other p+/n-well structures, and the embodiments of the present disclosure are not limited thereto.
When the gray sensor 410 is a front-illuminated image sensor, a signal acquisition circuit is provided on the light-entering side of the array of single photon avalanche diodes 41. The signal acquisition circuit is used for outputting the electric signals in the single-photon avalanche diode 41 array. For example, the electrical signals may be output row by row or column by column in a scanning manner.
The present disclosure provides a camera assembly in which the light splitting module splits the light incident from the lens module into N beams of different wavebands and transmits them to the corresponding grayscale sensors, so that the light entering each grayscale sensor is monochromatic. This prevents the light received by a single pixel unit from being affected by light of other colors, thereby reducing noise signals in the signals collected by the image sensor to some extent, reducing noise points and color cast in the image, and improving the imaging quality of the electronic device.
Moreover, the grayscale sensor 410 may be a single photon avalanche diode grayscale sensor 410, which can image under weak light conditions. Because the light is split into multiple beams by the light splitting module 300 and a plurality of single photon avalanche diode grayscale sensors 410 are used for imaging, the pixel density of the image can be improved, alleviating the low pixel density that single photon avalanche diode sensors otherwise exhibit.
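The fusion step performed by the control module (synthesizing a color image from the N grayscale images and the color of the beam corresponding to each sensor, per claim 8) can be sketched as follows. This is an illustrative sketch only: the function name, the N = 3 R/G/B band assignment, and the list-of-lists frames are our assumptions, not the patent's implementation.

```python
def fuse_color(gray_frames):
    """gray_frames maps a band name ('R', 'G', 'B') to the 2-D grayscale
    frame captured by the sensor behind that band's beam. Because the
    splitter sends each band to its own full-resolution sensor, fusion is
    a plain per-pixel stack -- no Bayer demosaicing (interpolation) is
    needed, which is one source of the claimed noise reduction."""
    r, g, b = gray_frames["R"], gray_frames["G"], gray_frames["B"]
    h, w = len(r), len(r[0])
    return [[(r[y][x], g[y][x], b[y][x]) for x in range(w)] for y in range(h)]

frames = {
    "R": [[10, 20], [30, 40]],
    "G": [[50, 60], [70, 80]],
    "B": [[90, 100], [110, 120]],
}
rgb = fuse_color(frames)
# rgb[0][0] == (10, 50, 90)
```

This per-pixel stack is only valid once the N sensors are aligned, which is exactly what the calibration method below ensures.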
The exemplary embodiment of the present disclosure also provides a calibration method for a camera assembly, which is used for the camera assembly, as shown in fig. 10, and the method includes:
step S110, controlling N gray level sensors to respectively acquire calibration images;
and step S120, adjusting the pose of the gray sensor according to the calibration image so as to enable the calibration images obtained by the N gray sensors to be the same.
According to the calibration method of the camera assembly provided by the embodiment of the disclosure, the poses of the gray sensors 410 are adjusted through the calibration images, so that the calibration images acquired by the N gray sensors 410 are the same, the alignment of the plurality of gray sensors 410 is realized, and the images acquired by the plurality of gray sensors 410 can be aligned during fusion.
In step S110, the N grayscale sensors 410 may be controlled to respectively acquire calibration images.
When the N grayscale sensors 410 acquire the calibration image, the position of the camera assembly 10 may be fixed, for example, the camera assembly 10 may be fixed on the calibration table. A calibration plate is provided in front of the camera assembly 10, and a calibration pattern, such as a checkerboard calibration pattern, is provided on the calibration plate. The N grayscale sensors 410 are controlled to respectively acquire images (calibration images) of the calibration patterns on the calibration plate. The calibration image may be acquired by a plurality of gray-scale sensors 410 simultaneously or by a plurality of gray-scale sensors 410 in a time-sharing manner.
In step S120, the poses of the grayscale sensors 410 may be adjusted according to the calibration images, so that the calibration images obtained by the N grayscale sensors 410 are the same.
Adjusting the pose of the grayscale sensor 410 according to the calibration image so that the calibration images acquired by the N grayscale sensors 410 are the same may include: adjusting the distance and the offset angle between each grayscale sensor 410 and the lens module 100 according to the calibration image, so that the calibration images obtained by the N grayscale sensors 410 are the same.
The plurality of grayscale sensors 410 respectively acquire the calibration images, feature points (e.g., corner points of the checkerboard pattern) are extracted from the calibration images, and the coordinates of each feature point are calculated. Since the light collected by every grayscale sensor 410 is incident through the same lens module 100, the coordinate system of the lens module 100 can be used as a world coordinate system. The coordinates of each feature point in each grayscale sensor 410 are calculated in that sensor's own coordinate system, and from these coordinates the distance and offset angle by which each grayscale sensor 410 needs to be adjusted can be computed. The distance and offset angle may be a movement distance and offset angle with respect to the world coordinate system.
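The adjustment computation described above can be sketched for the in-plane part of the pose. This is an illustrative sketch under stated assumptions: the patent does not specify the estimation method, the matched corner lists are assumed to come from the checkerboard detection step, and the reference sensor is chosen arbitrarily. The sketch uses the closed-form 2-D rigid (Kabsch-style) alignment between two matched point sets.

```python
import math

def alignment_offset(ref_pts, pts):
    """Given matched feature points (e.g. checkerboard corners) seen by a
    reference sensor (ref_pts) and by the sensor being calibrated (pts),
    estimate the 2-D rigid transform -- offset angle plus translation --
    that maps pts onto ref_pts."""
    n = len(pts)
    cxr = sum(p[0] for p in ref_pts) / n
    cyr = sum(p[1] for p in ref_pts) / n
    cx = sum(p[0] for p in pts) / n
    cy = sum(p[1] for p in pts) / n
    # Accumulate cross- and dot-products of the centred coordinates.
    s_cross = s_dot = 0.0
    for (xr, yr), (x, y) in zip(ref_pts, pts):
        ax, ay = x - cx, y - cy
        bx, by = xr - cxr, yr - cyr
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    angle = math.atan2(s_cross, s_dot)     # offset angle (radians)
    ca, sa = math.cos(angle), math.sin(angle)
    tx = cxr - (ca * cx - sa * cy)         # translation applied after rotation
    ty = cyr - (sa * cx + ca * cy)
    return angle, (tx, ty)

# A sensor whose image is shifted by (-2, +1) with no rotation should need
# a (+2, -1) translation and a zero offset angle to match the reference.
ref = [(0, 0), (1, 0), (0, 1), (1, 1)]
shifted = [(x - 2, y + 1) for x, y in ref]
angle, (tx, ty) = alignment_offset(ref, shifted)
```

In practice the distance-to-lens adjustment (along the optical axis) would come from the scale of the checkerboard pattern rather than from this in-plane transform; that part is omitted here.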
The exemplary embodiments of the present disclosure also provide an electronic apparatus, as shown in fig. 11, which includes the camera assembly 10 described above.
The camera assembly includes: the light splitting module is arranged on the light emitting side of the lens module and divides the light rays incident from the lens module into N light beams with different colors; the sensor module comprises N gray level sensors, the gray level sensors correspond to the light beams split by the light splitting module one by one, and the gray level sensors are used for receiving the corresponding light beams.
The present disclosure provides an electronic device in which the light splitting module splits the light incident from the lens module into N beams of different wavebands and transmits them to the corresponding grayscale sensors, so that the light entering each grayscale sensor is monochromatic. This prevents the light received by a single pixel unit from being affected by light of other colors, thereby reducing noise signals in the signals collected by the image sensor to some extent, reducing noise points and color cast in the image, and improving the imaging quality of the electronic device.
The electronic device in the embodiment of the present disclosure may be an electronic device with a camera module, such as a mobile phone, a tablet computer, a wearable device, a camera, or a video camera. The following description takes an electronic device as a mobile phone as an example:
The electronic device may further include a middle frame 20, a main board 30, a display screen 70, a battery 40, a rear cover 50, and the like, where the display screen 70, the middle frame 20, and the rear cover 50 form a receiving space for receiving other electronic components or functional modules of the electronic device. Meanwhile, the display screen 70 forms the display surface of the electronic device for displaying information such as images and text. The display screen 70 may be a liquid crystal display (LCD) or an organic light-emitting diode (OLED) display.
A glass cover plate may be provided on the display screen 70. Wherein the glass cover plate may cover the display screen 70 to protect the display screen 70 from being scratched or damaged by water.
The display screen 70 may include a display area as well as a non-display area. Wherein the display area performs the display function of the display screen 70 for displaying information such as images, text, etc. The non-display area does not display information. The non-display area can be used for arranging functional modules such as a camera, a receiver, a proximity sensor and the like. In some embodiments, the non-display area may include at least one area located at an upper portion and a lower portion of the display area.
The display screen 70 may be a full-face screen. At this time, the display screen 70 may display information in full screen, so that the electronic device has a larger screen occupation ratio. The display screen 70 includes only display areas and no non-display areas.
The middle frame 20 may be a hollow frame structure. The material of the middle frame 20 may include metal or plastic. The main board 30 is mounted inside the receiving space. For example, the main board 30 may be mounted on the middle frame 20 and be received in the receiving space together with the middle frame 20. The main board 30 is provided with a grounding point to realize grounding of the main board 30.
One or more of the functional modules such as a motor, a microphone, a speaker, a receiver, an earphone interface, a universal serial bus interface (USB interface), a proximity sensor, an ambient light sensor, a gyroscope, and a processor may be integrated on the main board 30. Meanwhile, the display screen 70 may be electrically connected to the main board 30.
The sensor module may include a depth sensor, a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like. The processor may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural network processor (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The main board 30 is also provided with a display control circuit. The display control circuit outputs an electrical signal to the display screen 70 to control the display screen 70 to display information. The light emitting control unit and the color change control unit may be provided on the main board.
The battery 40 is mounted inside the receiving space. For example, the battery 40 may be mounted on the middle frame 20 and be received in the receiving space together with the middle frame 20. The battery 40 may be electrically connected to the motherboard 30 to enable the battery 40 to power the electronic device. The main board 30 may be provided with a power management circuit. The power management circuit is used to distribute the voltage provided by the battery 40 to the various electronic components in the electronic device.
The rear cover 50 serves to form the outer contour of the electronic device. The rear cover 50 may be integrally formed. During forming of the rear cover 50, a rear camera hole, a fingerprint identification module mounting hole, and the like can be formed in the rear cover 50. The camera assembly 10 may be arranged on the main board and the middle frame, and the camera assembly 10 receives light through the rear camera hole. Of course, in practical applications, the camera assembly 10 may also serve as a front camera, and the embodiments of the present disclosure are not limited thereto.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (12)

1. A camera assembly, characterized in that the camera assembly comprises:
a lens module;
the light splitting module is arranged on the light emitting side of the lens module and splits the light rays incident from the lens module into N light beams with different colors;
the sensor module comprises N gray sensors, the gray sensors correspond to the light beams split by the light splitting module one by one, and the gray sensors are correspondingly arranged on paths of the split light beams of the light splitting module so as to receive the corresponding light beams;
wherein N is a positive integer greater than or equal to 2.
2. A camera assembly according to claim 1, wherein said beam splitting module comprises:
the N optical filters correspond to the N gray-scale sensors one by one and are respectively arranged on the light inlet sides of the corresponding gray-scale sensors;
the reflector is provided with N preset deflection states, and transmits the light rays incident from the lens module to the optical filter when the reflector is positioned in each preset deflection state.
3. A camera assembly according to claim 2, wherein said beam splitting module further comprises:
and the driving motor is connected with the reflecting mirror and used for driving the reflecting mirror to switch between N preset deflection states.
4. A camera assembly according to claim 1, wherein said beam splitting module comprises:
the N light splitting prisms are sequentially arranged on the light emitting side of the lens module;
the light splitting module comprises N-1 light splitting films, wherein one light splitting film is arranged between two adjacent prisms in the N light splitting prisms, and the N-1 light splitting films are used for filtering incident light beams entering the light splitting module so as to divide the incident light beams into N-1 reflected light beams and a bundle of transmitted light.
5. A camera assembly according to claim 1, wherein the grayscale sensor includes:
and the plurality of single photon avalanche diodes are used for receiving the photons transmitted by the corresponding optical filters and generating sensing signals.
6. A camera assembly according to claim 5, wherein said single photon avalanche diode comprises:
a substrate provided with an anode region, wherein a first accommodating portion is arranged on the substrate, and the first accommodating portion is located on a side of the anode region away from the light splitting module;
an avalanche layer provided in the first accommodating portion of the substrate;
the cathode layer is arranged on the avalanche layer, and the cathode layer is positioned on one side of the avalanche layer, which is far away from the anode region.
7. A camera assembly according to claim 5, wherein said single photon avalanche diode comprises:
a substrate;
the cathode layer is arranged on one side, close to the light splitting module, of the substrate;
the avalanche layer is embedded on one side of the cathode layer far away from the substrate;
and the anode layer is arranged on one side of the avalanche layer far away from the substrate.
8. A camera assembly according to claim 1, wherein said camera assembly further comprises:
and the control module is respectively connected with the N gray sensors and is used for synthesizing a color image according to the gray images acquired by the N gray sensors and the color of the light beam corresponding to each gray sensor.
9. A camera assembly according to claim 1, wherein said camera assembly further comprises:
and the driving module is respectively connected with the N gray level sensors and is used for adjusting the positions of the gray level sensors so as to realize focusing and anti-shaking of the camera assembly.
10. A camera assembly calibration method for the camera assembly according to any one of claims 1 to 9, the method comprising:
controlling the N gray level sensors to respectively acquire calibration images;
and adjusting the pose of the gray sensor according to the calibration image so as to enable the calibration images acquired by the N gray sensors to be the same.
11. The camera assembly calibration method according to claim 10, wherein adjusting the pose of the gray sensor according to the calibration image so that the calibration images obtained by the N gray sensors are the same comprises:
and adjusting the distance and the offset angle between each gray sensor and the lens module according to the calibration image so as to enable the calibration images obtained by the N gray sensors to be the same.
12. An electronic device, characterized in that the electronic device comprises a camera assembly according to any of claims 1-9.
CN202011412014.6A 2020-12-04 2020-12-04 Camera assembly, calibration method thereof and electronic equipment Pending CN112600995A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011412014.6A CN112600995A (en) 2020-12-04 2020-12-04 Camera assembly, calibration method thereof and electronic equipment


Publications (1)

Publication Number Publication Date
CN112600995A true CN112600995A (en) 2021-04-02

Family

ID=75188363

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011412014.6A Pending CN112600995A (en) 2020-12-04 2020-12-04 Camera assembly, calibration method thereof and electronic equipment

Country Status (1)

Country Link
CN (1) CN112600995A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117395524A (en) * 2023-12-11 2024-01-12 荣耀终端有限公司 Image sensor, camera module, electronic equipment and display device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110242457A1 (en) * 2010-04-01 2011-10-06 Industrial Technology Research Institute Composite color separation system
CN204028004U (en) * 2014-07-25 2014-12-17 高秀敏 A kind of substance detecting apparatus based on Raman filtering
CN105973465A (en) * 2015-03-12 2016-09-28 精工爱普生株式会社 Spectrometry device and image forming apparatus
CN107809576A (en) * 2017-12-14 2018-03-16 信利光电股份有限公司 A kind of multi-cam module
CN208369689U (en) * 2018-04-13 2019-01-11 甘肃智呈网络科技有限公司 A kind of light splitting photographic device
CN211152041U (en) * 2020-02-25 2020-07-31 RealMe重庆移动通信有限公司 Electronic equipment and camera assembly thereof
CN111769126A (en) * 2020-06-16 2020-10-13 Oppo广东移动通信有限公司 Photosensitive pixel module, image sensor and electronic device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210402