WO2021077374A1 - Image sensor, imaging device and mobile platform - Google Patents

Image sensor, imaging device and mobile platform

Info

Publication number
WO2021077374A1
WO2021077374A1 (PCT/CN2019/113097; CN2019113097W)
Authority
WO
WIPO (PCT)
Prior art keywords
pixel unit
photosensitive element
image sensor
pixel
tube
Prior art date
Application number
PCT/CN2019/113097
Other languages
English (en)
French (fr)
Inventor
徐泽
肖�琳
王彪
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2019/113097 (WO2021077374A1)
Priority to CN201980032071.0A (CN112119624A)
Publication of WO2021077374A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/14603Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • H04N25/46Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by combining or binning pixels
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/709Circuitry for control of the power supply

Definitions

  • This application relates to the field of camera technology, and in particular to an image sensor, imaging device and mobile platform for phase detection and autofocus.
  • In the related art, in order for a camera to obtain a clear image, the camera has a focusing function.
  • One of the focusing techniques is phase detection auto focus (PDAF).
  • In this technique, some pixels of the image sensor are used as focusing pixels. These focusing pixels include left half-shielded pixels and right half-shielded pixels.
  • In a specific implementation, a half-shielded pixel is formed by fabricating a light-blocking metal shield above the photosensitive surface of the pixel. However, part of the light striking the metal shield is reflected; after reflection or refraction at surrounding interfaces, it enters the surrounding normally photosensitive pixels, disturbs their signals, and degrades the imaging quality of the normal pixels around the focusing pixels.
  • the application provides an image sensor, an imaging device and a mobile platform.
  • The image sensor of the embodiments of the present application includes a plurality of pixel units, each pixel unit including a photosensitive element located within the pixel unit. The plurality of pixel units includes a first pixel unit, a second pixel unit, and a third pixel unit.
  • The first pixel unit is used for imaging, and the second pixel unit and the third pixel unit are used for focusing. The size of the photosensitive element of the first pixel unit is larger than the size of the photosensitive element of the second pixel unit and larger than the size of the photosensitive element of the third pixel unit. The photosensitive element of the second pixel unit is located on the right side of the second pixel unit, and the photosensitive element of the third pixel unit is located on the left side of the third pixel unit.
  • By reducing the size of the photosensitive elements of the second and third pixel units used for focusing, the additional metal light-shielding layer can be omitted, which reduces the interference of additional reflected light on the surrounding first pixel units used for imaging, so that the image quality formed by the imaging pixel units around the focusing pixel units remains stable and consistent.
  • the imaging device of the embodiment of the present application includes the image sensor of the above-mentioned embodiment.
  • In the imaging device, by reducing the size of the photosensitive elements of the second and third pixel units used for focusing, the additional metal light-shielding layer can be omitted, which reduces the interference of additional reflected light on the surrounding pixel units used for imaging, so that the image quality formed by the imaging pixel units around the focusing pixel units remains stable and consistent.
  • the mobile platform of the embodiment of the present application includes the imaging device of the above embodiment.
  • In the mobile platform, by reducing the size of the photosensitive elements of the second and third pixel units used for focusing, the additional metal light-shielding layer can be omitted, which reduces the interference of additional reflected light on the surrounding pixel units used for imaging, so that the image quality formed by the imaging pixel units around the focusing pixel units remains stable and consistent.
  • FIG. 1 is a schematic diagram of a pixel structure of an image sensor in the related art
  • FIG. 2 is a schematic diagram of pixel arrangement of an image sensor in the related art
  • FIG. 3 is a schematic diagram of the response results of the left half-covered pixel row and the right half-covered pixel row of the related art image sensor to light at adjacent positions;
  • FIG. 4 is a schematic diagram of the structure of the left half-shielded pixel of the related art image sensor
  • FIG. 5 is a schematic diagram of a pixel unit structure of an image sensor according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of the arrangement of pixel units of an image sensor according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the response results of the second pixel unit row and the third pixel unit row of the image sensor according to the embodiment of the present application to light at adjacent positions;
  • FIG. 8 is a schematic diagram of a circuit of a pixel unit according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of the principle of light sensing by an image sensor according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of modules of an imaging device according to an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of an imaging device according to an embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features.
  • Thus, features defined with "first" and "second" may explicitly or implicitly include one or more of such features.
  • "A plurality of" means two or more, unless otherwise specifically defined.
  • The terms "mounted", "connected to", and "connected" should be understood in a broad sense, unless otherwise clearly specified and limited.
  • A connection may be a fixed connection or a detachable connection.
  • It may be a mechanical connection or an electrical connection.
  • It may be a direct connection or an indirect connection through an intermediate medium, and it may be internal communication between two elements or an interaction between two elements.
  • The specific meanings of the above terms in this application can be understood according to the specific circumstances.
  • A first feature being "on" or "under" a second feature may include the first and second features being in direct contact, or being in contact not directly but through another feature between them.
  • A first feature being "on", "above", or "over" a second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature.
  • A first feature being "under", "below", or "beneath" a second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
  • In the related-art image sensor, the pixel array includes not only normal pixels 102 for normal light sensing but also pixels for focusing: left half-shielded pixels (Left shield pixel) 104 and right half-shielded pixels (Right shield pixel) 106, which are formed by fabricating a light-blocking metal shield 108 above the photosensitive surface of the pixel.
  • An arrangement of these three types of pixels in the entire pixel array is shown in FIG. 2.
  • When imaging with this sensor, the pixel signals of all the left half-shielded pixels 104 are extracted separately into one row, and the pixel signals of all the right half-shielded pixels 106 are extracted separately into another row.
  • When focus is accurate, because the left and right half-shielded pixels are physically adjacent, the image brightness distributions of the left half-shielded pixel row 101 and the right half-shielded pixel row 103 essentially overlap completely.
  • When focus is not accurate, because the metal shield 108 blocks part of the light, the responses of the left half-shielded pixel row 101 and the right half-shielded pixel row 103 to light at adjacent positions differ noticeably; the height of the rectangle below each pixel in the figure represents the collected signal intensity. The image brightness distributions of the two rows are therefore offset from each other, and this offset corresponds one-to-one to the amount the lens must move to reach focus.
  • the embodiment of the present application proposes an image sensor 10.
  • the image sensor 10 of the embodiment of the present application includes a plurality of pixel units 12, and each pixel unit 12 includes a photosensitive element 122 located in the pixel unit 12.
  • the plurality of pixel units 12 includes a first pixel unit 121 for imaging, a second pixel unit 123 for focusing and a third pixel unit 125 for focusing.
  • the size of the photosensitive element 122 of the first pixel unit 121 is larger than the size of the photosensitive element 122 of the second pixel unit 123 and the size of the photosensitive element 122 of the third pixel unit 125.
  • the photosensitive element 122 of the second pixel unit 123 is located on the right side of the second pixel unit 123
  • the photosensitive element 122 of the third pixel unit 125 is located on the left side of the third pixel unit 125.
  • In the image sensor 10 of the embodiment of the present application, by reducing the size of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 used for focusing, the additional metal light-shielding layer can be omitted, which reduces the interference of additional reflected light on the surrounding first pixel units 121 used for imaging, so that the image quality formed by the imaging pixel units 12 around the focusing pixel units 12 remains stable and consistent.
  • In the illustrated embodiment, the planar shape of the photosensitive element 122 of the pixel unit 12 is approximately square, and the pixel units are arranged in rows and columns to form a pixel array.
  • In other embodiments, the planar shape of the photosensitive element 122 of the pixel unit 12 may be another polygon or another shape, and the arrangement is not limited to a row-and-column arrangement; no specific limitation is made here.
  • the image sensor 10 further includes a lens layer 14, a filter layer 16 and a buffer layer 18.
  • the lens layer 14 includes a plurality of lenses 142, and each lens 142 corresponds to one pixel unit 12 and is located above the pixel unit 12.
  • the lens 142 may be a microlens, and the lens 142 is used to condense incident light.
  • the filter layer 16 is located between the pixel unit 12 and the lens layer 14, and the filter layer 16 is used to filter out stray light in the incident light.
  • In one example, the filter layer 16 may be a color filter layer, so that the incident light (visible light) is split into light of different colors (such as R, B, G, W, etc.) after passing through the color filter layer and then enters the photosensitive element 122.
  • the filter layer 16 can also be an infrared filter layer or other filter layers, which can be selected according to the function of the image sensor 10.
  • the buffer layer 18 is located between the pixel unit 12 and the filter layer 16. The buffer layer 18 can be used to protect the photodiode, adjust the light path entering the pixel unit, and also can maintain the flatness of the filter layer 16.
  • the lens layer 14, the filter layer 16, and the buffer layer 18 are sequentially stacked from top to bottom to form an optical stack.
  • the image sensor 10 includes a silicon device layer.
  • the silicon device layer includes a silicon layer 124 and a photosensitive element 122 disposed on the silicon layer 124.
  • the silicon device layer includes a plurality of pixel units 12, and the pixel unit 12 includes a silicon layer 124 and a photosensitive element 122.
  • the photosensitive element 122 may be a photodiode (PD).
  • Each pixel unit 12 may include one or more photodiodes.
  • The pixel array includes three pixel structures. In the first pixel unit 121 (Normal pixel), the photosensitive element 122 is relatively large; incident light converged by the lens 142 enters the photosensitive element 122.
  • The photons are fully absorbed to generate photo-generated electrons, which are retained in the photosensitive element 122.
  • In the second pixel unit 123 (Right PD pixel), the photosensitive element 122 is smaller than the photosensitive element 122 of the first pixel unit 121.
  • After the incident light is converged by the lens 142 and enters the silicon device layer, only the electrons generated by the light that enters the photosensitive element 122 are retained in the photosensitive element 122; photo-generated charges outside the photosensitive element 122 are not collected by it and therefore are not treated as signal.
  • In the third pixel unit 125 (Left PD pixel), the principle is similar to that of the second pixel unit 123: only electrons excited by photons that enter its photosensitive element 122 are collected and used as a signal for subsequent processing.
  • the arrangement of the three pixel structures in the entire pixel array is shown in FIG. 6.
  • When imaging with the image sensor 10, the pixel signals of all the second pixel units 123 are extracted separately into one row, and the pixel signals of all the third pixel units 125 are extracted separately into another row.
  • When focus is accurate, because the second pixel units 123 and the third pixel units 125 are physically adjacent, the image brightness distributions of the second pixel unit row 127 and the third pixel unit row 129 essentially overlap completely.
  • When focus is not accurate, light falling on the silicon layer outside the photosensitive element 122 generates charges that are not collected, that is, they are not used to characterize the signal intensity, so the responses of the second pixel unit row 127 and the third pixel unit row 129 to light at adjacent positions differ noticeably.
  • The height of the rectangle below each pixel in the figure represents the collected signal intensity, so the image brightness distributions of the second pixel unit row 127 and the third pixel unit row 129 are offset from each other, and this offset corresponds one-to-one to the amount the lens must move to reach focus. From the brightness-distribution offset between the second pixel unit row 127 and the third pixel unit row 129, the distance the lens must currently move for accurate focusing can therefore be calculated, and the motor can be controlled to move the lens to achieve accurate focusing. There is no need to move the lens repeatedly and recompute many times, so the focusing speed can be greatly improved.
  • Since no metal shielding layer is needed, the adverse effects of reflected or scattered light on the photosensitive elements 122 of the first pixel unit 121, the second pixel unit 123, and the third pixel unit 125 are reduced or eliminated, so that the back-end processing circuit processes the output signals of the photosensitive elements 122 (for both imaging and focusing) more accurately.
  • The aforementioned "physically adjacent" refers to adjacency as defined at design time; it does not necessarily mean the units are immediately next to each other in position on the image sensor 10, and there may be one or more pixel units 12 between a second pixel unit 123 and a third pixel unit 125.
  • the image sensor 10 of the embodiment of the present application not only achieves fast focusing, but also prevents the reflected light from interfering with the image of the surrounding first pixel unit 121.
  • The image sensor 10 of the embodiment of the present application can be widely used in fields such as consumer electronics, security monitoring, industrial automation, artificial intelligence, and the Internet of Things for the collection and organization of image data, providing an information source for subsequent processing and applications.
  • the second pixel unit 123 and the third pixel unit 125 are arranged alternately and adjacently.
  • the second pixel unit 123 and the third pixel unit 125 are physically arranged close together.
  • The second pixel units 123 and the third pixel units 125 being arranged adjacently and alternately means that there is at least one third pixel unit 125 in the vicinity of each second pixel unit 123 and at least one second pixel unit 123 in the vicinity of each third pixel unit 125.
  • The second pixel units 123 and the third pixel units 125 may be arranged in the same row or in different rows; the arrangement is not specifically limited.
  • For example, in a 3×3 array of pixel units 12, if the middle pixel unit is the second pixel unit 123, then at least one of the other eight pixel units (the upper, lower, left, right, upper-left, upper-right, lower-left, and lower-right pixel units) is the third pixel unit 125.
  • the size of the photosensitive element 122 of the second pixel unit 123 is the same as the size of the photosensitive element 122 of the third pixel unit 125. In this way, when the second pixel unit 123 and the third pixel unit 125 are in the same position, the signal intensity collected by the photosensitive element 122 of the second pixel unit 123 and the photosensitive element 122 of the third pixel unit 125 are the same.
  • the size of the photosensitive element 122 of the second pixel unit 123 is half of the size of the photosensitive element 122 of the first pixel unit 121. In this way, the size of the photosensitive element 122 of the second pixel unit 123 for focusing and the size of the photosensitive element 122 of the third pixel unit 125 are both smaller than the size of the photosensitive element 122 of the first pixel unit 121 for imaging.
  • In one embodiment, the planar shape of the photosensitive element 122 is basically rectangular; the length of the photosensitive element 122 of the second pixel unit 123 and the length of the photosensitive element 122 of the third pixel unit 125 are both equal to half the length of the photosensitive element 122 of the first pixel unit 121, and their widths are equal to the width of the photosensitive element 122 of the first pixel unit 121.
  • In other embodiments, the widths of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 may instead both be equal to half the width of the photosensitive element 122 of the first pixel unit 121, with their lengths equal to the length of the photosensitive element 122 of the first pixel unit 121.
  • It is also possible that the area of the photosensitive surface 1222 of the photosensitive element 122 of the second pixel unit 123 and the area of the photosensitive surface 1222 of the photosensitive element 122 of the third pixel unit 125 are simply both smaller than the area of the photosensitive surface 1222 of the photosensitive element 122 of the first pixel unit 121.
  • The pixel unit 12 has a central axis A.
  • The photosensitive element 122 of the second pixel unit 123 is located on the right side of the central axis A of the second pixel unit 123, or more than half of the photosensitive element 122 of the second pixel unit 123 is located on the right side of the central axis A of the second pixel unit 123.
  • In other words, the photosensitive element 122 of the second pixel unit 123 is located toward the right side of the silicon layer 124 of the second pixel unit 123.
  • When the size of the photosensitive element 122 of the second pixel unit 123 is smaller than half the size of the photosensitive element 122 of the first pixel unit 121, the photosensitive element 122 of the second pixel unit 123 is located on the right side of the central axis A of the second pixel unit 123.
  • When the size of the photosensitive element 122 of the second pixel unit 123 is equal to half the size of the photosensitive element 122 of the first pixel unit 121, the photosensitive element 122 of the second pixel unit 123 is located on the right side of the central axis A of the second pixel unit 123, with the left edge of the photosensitive element 122 coinciding with the central axis A.
  • When the size of the photosensitive element 122 of the second pixel unit 123 is greater than half the size of the photosensitive element 122 of the first pixel unit 121, more than half of the photosensitive element 122 of the second pixel unit 123 is located on the right side of the central axis A of the second pixel unit 123.
  • The pixel unit 12 has a central axis A.
  • The photosensitive element 122 of the third pixel unit 125 is located on the left side of the central axis A of the third pixel unit 125, or more than half of the photosensitive element 122 of the third pixel unit 125 is located on the left side of the central axis A of the third pixel unit 125.
  • In other words, the photosensitive element 122 of the third pixel unit 125 is located toward the left side of the silicon layer 124 of the third pixel unit 125.
  • When the size of the photosensitive element 122 of the third pixel unit 125 is smaller than half the size of the photosensitive element 122 of the first pixel unit 121, the photosensitive element 122 of the third pixel unit 125 is located on the left side of the central axis A of the third pixel unit 125.
  • When the size of the photosensitive element 122 of the third pixel unit 125 is equal to half the size of the photosensitive element 122 of the first pixel unit 121, the photosensitive element 122 of the third pixel unit 125 is located on the left side of the central axis A of the third pixel unit 125, with the right edge of the photosensitive element 122 coinciding with the central axis A.
  • When the size of the photosensitive element 122 of the third pixel unit 125 is greater than half the size of the photosensitive element 122 of the first pixel unit 121, more than half of the photosensitive element 122 of the third pixel unit 125 is located on the left side of the central axis A of the third pixel unit 125.
  • the sum of the photosensitive area of the photosensitive element 122 of the second pixel unit 123 and the photosensitive area of the photosensitive element 122 of the third pixel unit 125 accounts for less than 6% of the total photosensitive area of the pixel unit 12.
  • the photosensitive area of the pixel unit 12 for imaging is sufficiently large, thereby ensuring the imaging quality of the image sensor 10.
  • the sum of the photosensitive area of the photosensitive element 122 of the second pixel unit 123 and the photosensitive area of the photosensitive element 122 of the third pixel unit 125 accounts for a percentage of the total photosensitive area of the pixel unit 12 equal to 5%. It should be noted that, in the present application, the photosensitive area of the photosensitive element 122 is proportional to the size, and the larger the size of the photosensitive element 122, the larger the photosensitive area.
  • the pixel unit 12 includes a switch component 126.
  • the switch assembly 126 is connected to the photosensitive element 122.
  • the switch assembly 126 is configured to selectively empty or output the charge of the photosensitive element 122.
  • The switch assembly 126 includes a transfer transistor TX, a reset transistor RST, a source-follower transistor SF, and a select transistor SEL.
  • The transfer transistor TX, the reset transistor RST, the source-follower transistor SF, and the select transistor SEL are all triodes.
  • The emitter of the transfer transistor TX is connected to the photodiode, and the collector of the transfer transistor TX is connected to the emitter of the reset transistor RST.
  • The base of the source-follower transistor SF is connected to the collector of the transfer transistor TX and the emitter of the reset transistor RST, and the collector of the source-follower transistor SF is connected to the emitter of the select transistor SEL.
  • The collector of the select transistor SEL is used to connect the signal line 1262.
  • the photosensitive element 122 is a photodiode PD.
  • The floating diffusion region FD is the node to which the transfer transistor TX, the reset transistor RST, and the source-follower transistor SF are electrically connected.
  • The source-follower transistor SF is a voltage follower whose function is to translate the voltage at the FD node to the PXD node, so that the voltage at the PXD node has a corresponding relationship with the voltage at the FD node.
  • the signal line 1262 is connected to the back-end processing circuit to transmit the output signal of the image sensor 10 to the back-end processing circuit for processing.
  • the difference between the first pixel unit 121, the second pixel unit 123, and the third pixel unit 125 lies in the size of the photosensitive element 122.
  • the image sensor 10 includes a reset phase, an exposure phase, a reference voltage readout phase, and a signal voltage readout phase that are performed in time sequence.
  • the switch assembly 126 is configured to empty the charge of the photosensitive element 122 during the reset phase.
  • the switch assembly 126 is configured to cause the photosensitive element 122 to generate electric charges and accumulate in the photosensitive element 122 during the exposure stage.
  • the switch assembly 126 is configured to output the reference voltage during the reference voltage readout phase.
  • the switch assembly 126 is configured to output the charge of the photosensitive element 122 to form a signal voltage during the signal voltage readout phase.
  • Reset phase: the reset transistor RST and the transfer transistor TX are set to a high potential, the select transistor SEL is set to a low potential, and the charge in the photodiode PD is completely reset and cleared.
  • Exposure phase: the transfer transistor TX is at a low potential, the reset transistor RST remains at a high potential, and the select transistor SEL remains at a low potential; due to illumination, photo-generated electrons are produced in the photodiode PD and accumulate there.
  • Reference voltage readout phase: the select transistor SEL is set to a high potential and the reset transistor RST is set to a low potential.
  • The voltage Vref is obtained at the source terminal of the select transistor SEL and is quantized and read out by the subsequent analog-to-digital conversion circuit.
  • Signal voltage readout phase: the select transistor SEL remains at a high potential, the reset transistor RST remains at a low potential, and the transfer transistor TX is turned on (high potential); the electrons in the photodiode PD enter the floating diffusion region FD, lowering the potential of the floating diffusion region FD.
  • The transfer transistor TX is then turned off (low potential), and the voltage Vsig is obtained at the source terminal of the select transistor SEL and is quantized and read out by the subsequent analog-to-digital conversion circuit.
  • Vref-Vsig is the final corresponding image signal.
  • Vref is the voltage collected at the PXD node shown in FIG. 8 when the reference voltage is read
  • Vsig is the voltage collected at the PXD node when the signal voltage is read.
  • the photosensitive element 122 is N-type doped.
  • the photosensitive element 122 is manufactured by ion implantation (standard semiconductor process).
  • the photosensitive element 122 is an N-type doped photodiode.
  • the imaging device 100 of the embodiment of the present application includes the image sensor 10 of the above-mentioned embodiment.
  • By reducing the size of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 used for focusing, the additional metal light-shielding layer can be omitted, which reduces the interference of additional reflected light on the surrounding first pixel units 121 used for imaging, so that the image quality formed by the imaging pixel units 12 around the focusing pixel units 12 remains stable and consistent.
  • the imaging device 100 includes a processing circuit 20, and the processing circuit 20 is connected to the image sensor 10.
  • the processing circuit 20 is used to determine the imaging information of the image sensor 10 according to the output signal of the first pixel unit 121 and used to determine the focus information according to the output signals of the second pixel unit 123 and the third pixel unit 125.
  • The imaging device 100 can use the first pixel unit 121 for imaging, and can also calculate, from the brightness-distribution offset between the second pixel unit row 127 and the third pixel unit row 129, the distance the lens of the imaging device 100 must currently move for accurate focusing, and control the motor to move the lens to achieve accurate focusing, so that the focusing speed is greatly improved.
  • the imaging device 100 further includes a lens module 30 and a circuit board 40, and the lens module 30 includes a lens barrel 32, a lens holder 34 and a lens 36.
  • the image sensor 10 and the processing circuit 20 are arranged on the circuit board 40, the lens holder 34 is arranged on the circuit board 40, the lens barrel 32 is connected to the lens holder 34, and the lens 36 is arranged in the lens barrel 32.
  • the number of lenses 36 may be one or two or more than two.
  • the imaging device 100 may further include a driving module 50 (such as a motor).
  • the driving module 50 is used to drive the lens barrel 32 and/or the lens 36 to move back and forth along the optical axis OP of the lens module 30, thereby moving the lens to achieve accurate focusing.
  • the mobile platform of the embodiment of the present application includes the imaging device 100 of the above-mentioned embodiment.
  • the mobile platform may be an unmanned aerial vehicle, an unmanned vehicle, a mobile robot, or other mobile platforms equipped with the imaging device 100.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

An image sensor (10), an imaging device (100), and a mobile platform. The image sensor (10) includes a plurality of pixel units (12), each pixel unit (12) including a photosensitive element (122) located within the pixel unit (12). The plurality of pixel units (12) includes a first pixel unit (121) for imaging and a second pixel unit (123) and a third pixel unit (125) for focusing. The size of the photosensitive element (122) of the first pixel unit (121) is larger than the size of the photosensitive element (122) of the second pixel unit (123) and larger than the size of the photosensitive element (122) of the third pixel unit (125). The photosensitive element (122) of the second pixel unit (123) is located on the right side of the second pixel unit (123), and the photosensitive element (122) of the third pixel unit (125) is located on the left side of the third pixel unit (125).

Description

Image sensor, imaging device and mobile platform
Technical Field
This application relates to the field of camera technology, and in particular to an image sensor, an imaging device, and a mobile platform for phase detection auto focus.
Background
In the related art, in order for a camera to obtain a clear image, the camera has a focusing function. One focusing technique is phase detection auto focus (PDAF). In this technique, some pixels of the image sensor are used as focusing pixels, which include left half-shielded pixels and right half-shielded pixels. In a specific implementation, a half-shielded pixel is formed by fabricating a light-blocking metal shield above the photosensitive surface of the pixel. However, when light strikes the shield, part of it is reflected; after reflection or refraction at surrounding interfaces, the reflected light enters the surrounding normally photosensitive pixels and disturbs their signals, thereby degrading the imaging quality of the normally photosensitive pixels around the focusing pixels.
Summary
The present application provides an image sensor, an imaging device, and a mobile platform.
The image sensor of the embodiments of the present application includes a plurality of pixel units, each pixel unit including a photosensitive element located within the pixel unit. The plurality of pixel units includes a first pixel unit, a second pixel unit, and a third pixel unit. The first pixel unit is used for imaging, and the second pixel unit and the third pixel unit are used for focusing. The size of the photosensitive element of the first pixel unit is larger than the size of the photosensitive element of the second pixel unit and larger than the size of the photosensitive element of the third pixel unit. The photosensitive element of the second pixel unit is located on the right side of the second pixel unit, and the photosensitive element of the third pixel unit is located on the left side of the third pixel unit.
In the image sensor of the embodiments of the present application, by reducing the size of the photosensitive elements of the second and third pixel units used for focusing, the additional metal light-shielding layer can be omitted. This reduces the interference of additional reflected light on the surrounding first pixel units used for imaging, so that the image quality formed by the imaging pixel units around the focusing pixel units remains stable and consistent.
The imaging device of the embodiments of the present application includes the image sensor of the above embodiments.
In the imaging device of the embodiments of the present application, by reducing the size of the photosensitive elements of the second and third pixel units used for focusing, the additional metal light-shielding layer can be omitted. This reduces the interference of additional reflected light on the surrounding first pixel units used for imaging, so that the image quality formed by the imaging pixel units around the focusing pixel units remains stable and consistent.
The mobile platform of the embodiments of the present application includes the imaging device of the above embodiments.
In the mobile platform of the embodiments of the present application, by reducing the size of the photosensitive elements of the second and third pixel units used for focusing, the additional metal light-shielding layer can be omitted. This reduces the interference of additional reflected light on the surrounding first pixel units used for imaging, so that the image quality formed by the imaging pixel units around the focusing pixel units remains stable and consistent.
Additional aspects and advantages of the present application will be given in part in the following description, will become apparent in part from the following description, or will be learned through practice of the present application.
Brief Description of the Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic diagram of a pixel structure of an image sensor in the related art;
FIG. 2 is a schematic diagram of the pixel arrangement of an image sensor in the related art;
FIG. 3 is a schematic diagram of the responses of the left half-shielded pixel row and the right half-shielded pixel row of a related-art image sensor to light at adjacent positions;
FIG. 4 is a schematic diagram of the structure of a left half-shielded pixel of a related-art image sensor;
FIG. 5 is a schematic diagram of the pixel unit structure of an image sensor according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the pixel unit arrangement of an image sensor according to an embodiment of the present application;
FIG. 7 is a schematic diagram of the responses of the second pixel unit row and the third pixel unit row of an image sensor according to an embodiment of the present application to light at adjacent positions;
FIG. 8 is a schematic circuit diagram of a pixel unit according to an embodiment of the present application;
FIG. 9 is a schematic diagram of the light-sensing principle of an image sensor according to an embodiment of the present application;
FIG. 10 is a schematic block diagram of an imaging device according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of an imaging device according to an embodiment of the present application.
Detailed Description of the Embodiments
Embodiments of the present application are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which identical or similar reference numerals throughout denote identical or similar elements or elements having identical or similar functions. The embodiments described below with reference to the drawings are exemplary and are intended only to explain the present application; they are not to be construed as limiting the present application.
In the description of the present application, it should be understood that orientation or positional relationships indicated by the terms "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like are based on the orientations or positional relationships shown in the drawings, are only for the convenience of describing the present application and simplifying the description, and do not indicate or imply that the referred device or element must have a particular orientation or be constructed and operated in a particular orientation; they therefore cannot be understood as limiting the present application. In addition, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, features defined with "first" and "second" may explicitly or implicitly include one or more of such features. In the description of the present application, "a plurality of" means two or more, unless otherwise specifically defined.
In the description of the present application, it should be noted that, unless otherwise explicitly specified and limited, the terms "mounted", "connected to", and "connected" should be understood in a broad sense; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection or an indirect connection through an intermediate medium; and it may be internal communication between two elements or an interaction between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present application can be understood according to the specific circumstances.
In the present application, unless otherwise explicitly specified and limited, a first feature being "on" or "under" a second feature may include the first and second features being in direct contact, or the first and second features being in contact not directly but through another feature between them. Moreover, a first feature being "on", "above", or "over" a second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature. A first feature being "under", "below", or "beneath" a second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
The disclosure herein provides many different embodiments or examples for implementing different structures of the present application. To simplify the disclosure of the present application, the components and arrangements of specific examples are described herein. They are, of course, merely examples and are not intended to limit the present application. In addition, the present application may repeat reference numerals and/or reference letters in different examples; such repetition is for the purposes of simplification and clarity and does not in itself indicate a relationship between the various embodiments and/or arrangements discussed. Furthermore, the present application provides examples of various specific processes and materials, but those of ordinary skill in the art will recognize the applicability of other processes and/or the use of other materials.
Referring to FIG. 1, in a related-art image sensor, in addition to normal pixels 102 for normal light sensing, the pixel array also includes pixels for focusing: left half-shielded pixels (Left shield pixel) 104 and right half-shielded pixels (Right shield pixel) 106, which are formed by fabricating a light-blocking metal shield 108 above the photosensitive surface of the pixel. One arrangement of these three kinds of pixels in the whole pixel array is shown in FIG. 2. When imaging with this image sensor, the pixel signals of all the left half-shielded pixels 104 are extracted separately into one row, and the pixel signals of all the right half-shielded pixels 106 are extracted separately into another row. When focus is accurate, because the left half-shielded pixels 104 and the right half-shielded pixels 106 are physically adjacent, the image brightness distributions of the left half-shielded pixel row 101 and the right half-shielded pixel row 103 essentially overlap completely. When focus is not accurate, as shown in FIG. 3, because the metal shield 108 blocks part of the light, the responses of the left half-shielded pixel row 101 and the right half-shielded pixel row 103 to light at adjacent positions differ noticeably; the height of the rectangle below each pixel in the figure represents the collected signal intensity. The image brightness distributions of the left half-shielded pixel row 101 and the right half-shielded pixel row 103 are therefore offset from each other, and this offset corresponds one-to-one to the amount the lens must move to reach focus. Accordingly, from the brightness-distribution offset between the left half-shielded pixel row 101 and the right half-shielded pixel row 103, the distance the lens must currently move for accurate focusing can be calculated, and the motor can be controlled to move the lens to achieve accurate focusing, without moving the lens repeatedly and computing many times, so the focusing speed can be greatly improved. This focusing method may be called phase detection auto focus.
However, as can be seen from FIG. 4, when light strikes the light-blocking metal shield 108, part of the light is reflected; after reflection or refraction at surrounding interfaces, the reflected light enters the surrounding normally photosensitive pixels 102, so that the pixel signals of these normally photosensitive pixels 102 are disturbed, thereby degrading the imaging quality of the normally photosensitive pixels 102 around the phase detection auto focus pixels.
Therefore, an embodiment of the present application proposes an image sensor 10. Referring to FIG. 5, the image sensor 10 of the embodiment of the present application includes a plurality of pixel units 12, and each pixel unit 12 includes a photosensitive element 122 located within the pixel unit 12. The plurality of pixel units 12 includes a first pixel unit 121 for imaging, and further includes a second pixel unit 123 for focusing and a third pixel unit 125 for focusing. The size of the photosensitive element 122 of the first pixel unit 121 is larger than the size of the photosensitive element 122 of the second pixel unit 123 and larger than the size of the photosensitive element 122 of the third pixel unit 125. The photosensitive element 122 of the second pixel unit 123 is located on the right side of the second pixel unit 123, and the photosensitive element 122 of the third pixel unit 125 is located on the left side of the third pixel unit 125.
In the image sensor 10 of the embodiment of the present application, by reducing the size of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 used for focusing, the additional metal light-shielding layer can be omitted. This reduces the interference of additional reflected light on the surrounding first pixel units 121 used for imaging, so that the image quality formed by the imaging pixel units 12 around the focusing pixel units 12 remains stable and consistent.
In the illustrated embodiment, the planar shape of the photosensitive element 122 of the pixel unit 12 is substantially square, and the pixel units are arranged in rows and columns to form a pixel array. In other embodiments, the planar shape of the photosensitive element 122 of the pixel unit 12 may be another polygon or another shape, and the arrangement is not limited to a row-and-column arrangement; no specific limitation is made here.
Specifically, the image sensor 10 further includes a lens layer 14, a filter layer 16, and a buffer layer 18. The lens layer 14 includes a plurality of lenses 142, each lens 142 corresponding to one pixel unit 12 and located above the pixel unit 12. The lens 142 may be a micro lens, and the lens 142 is used to converge incident light. The filter layer 16 is located between the pixel unit 12 and the lens layer 14 and is used to filter out stray light from the incident light. In one example, the filter layer 16 may be a color filter layer, so that the incident light (visible light) is split into light of different colors (such as R, B, G, W, etc.) after passing through the color filter layer and then enters the photosensitive element 122. In other examples, the filter layer 16 may also be an infrared filter layer or another filter layer, selected according to the function of the image sensor 10. The buffer layer 18 is located between the pixel unit 12 and the filter layer 16. The buffer layer 18 can be used to protect the photodiode and adjust the optical path entering the pixel unit, and can also help maintain the flatness of the filter layer 16. In FIG. 5, the lens layer 14, the filter layer 16, and the buffer layer 18 are stacked in sequence from top to bottom to form an optical stack. The image sensor 10 includes a silicon device layer, which includes a silicon layer 124 and a photosensitive element 122 provided in the silicon layer 124; that is, the silicon device layer includes the plurality of pixel units 12, and each pixel unit 12 includes a silicon layer 124 and a photosensitive element 122. The photosensitive element 122 may be a photodiode (PD). Each pixel unit 12 may include one or more photodiodes.
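As an informal illustration only (not part of the original disclosure), the following Python sketch models the pixel-unit structure just described: each unit carries a lens, a filter, a buffer, and a photodiode whose size and lateral position differ between the three unit types. The field names, the normalized pixel pitch of 1.0, and the half-width photodiodes are assumptions used for illustration.

```python
# Simplified, assumed model of the pixel-unit stack described above.
from dataclasses import dataclass

@dataclass
class Photodiode:
    width: float      # fraction of the pixel pitch occupied by the photodiode
    x_offset: float   # left edge of the photodiode within the pixel (0.0 .. 1.0)

@dataclass
class PixelUnit:
    kind: str                     # "normal", "right_pd" or "left_pd"
    lens: str = "micro lens"      # lens 142
    filter_type: str = "color"    # filter layer 16 (could also be "infrared")
    has_buffer: bool = True       # buffer layer 18
    pd: Photodiode = None         # photosensitive element 122 in silicon layer 124

def normal_pixel() -> PixelUnit:
    # First pixel unit 121: full-size photodiode, used for imaging.
    return PixelUnit("normal", pd=Photodiode(width=1.0, x_offset=0.0))

def right_pd_pixel() -> PixelUnit:
    # Second pixel unit 123: half-size photodiode on the right of the central axis.
    return PixelUnit("right_pd", pd=Photodiode(width=0.5, x_offset=0.5))

def left_pd_pixel() -> PixelUnit:
    # Third pixel unit 125: half-size photodiode on the left of the central axis.
    return PixelUnit("left_pd", pd=Photodiode(width=0.5, x_offset=0.0))

print(right_pd_pixel().pd)   # prints: Photodiode(width=0.5, x_offset=0.5)
```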
It can be understood that, in the present application, the pixel array contains three pixel structures. In the first pixel unit 121 (Normal pixel), the photosensitive element 122 is relatively large; incident light converged by the lens 142 enters the photosensitive element 122, the photons are fully absorbed to generate photo-generated electrons, and these electrons are retained in the photosensitive element 122. In the second pixel unit 123 (Right PD pixel), the photosensitive element 122 is smaller than the photosensitive element 122 of the first pixel unit 121; after the incident light is converged by the lens 142 and enters the silicon device layer, only the electrons generated by the light that enters the photosensitive element 122 are retained in the photosensitive element 122, while photo-generated charges outside the photosensitive element 122 are not collected by it and therefore are not treated as signal. In the third pixel unit 125 (Left PD pixel), the principle is similar to that of the second pixel unit 123: only electrons excited by photons that enter its photosensitive element 122 are collected and used as a signal for subsequent processing.
In one example, an arrangement of the three pixel structures in the whole pixel array is shown in FIG. 6. When imaging with the image sensor 10, the pixel signals of all the second pixel units 123 are extracted separately into one row, and the pixel signals of all the third pixel units 125 are extracted separately into another row. When focus is accurate, because the second pixel units 123 and the third pixel units 125 are physically adjacent, the image brightness distributions of the second pixel unit row 127 and the third pixel unit row 129 essentially overlap completely. When focus is not accurate, as shown in FIG. 7, light falling on the silicon layer 124 outside the photosensitive element 122 generates charges that are not collected and thus are not used to characterize the signal intensity, so the responses of the second pixel unit row 127 and the third pixel unit row 129 to light at adjacent positions differ noticeably; the height of the rectangle below each pixel in the figure represents the collected signal intensity. The image brightness distributions of the second pixel unit row 127 and the third pixel unit row 129 are therefore offset from each other, and this offset corresponds one-to-one to the amount the lens must move to reach focus. Accordingly, from the brightness-distribution offset between the second pixel unit row 127 and the third pixel unit row 129, the distance the lens must currently move for accurate focusing can be calculated, and the motor can be controlled to move the lens to achieve accurate focusing, without moving the lens repeatedly and computing many times, so the focusing speed can be greatly improved. Moreover, since no metal shielding layer needs to be provided, the adverse effects of reflected or scattered light on the photosensitive elements 122 of the first pixel unit 121, the second pixel unit 123, and the third pixel unit 125 are reduced or eliminated, so that the back-end processing circuit processes the output signals of the photosensitive elements 122 (for both imaging and focusing) more accurately.
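As an informal illustration of the phase-detection computation described above (not part of the original disclosure), the sketch below estimates the brightness-distribution offset between the second pixel unit row and the third pixel unit row by cross-correlation and maps it to a lens displacement. The cross-correlation method, the function names, and the linear gain `k_um_per_pixel` are assumptions for illustration only; the patent states a one-to-one correspondence but does not specify the computation.

```python
# Illustrative only: estimate the row offset and map it to a lens move.
import numpy as np

def estimate_row_offset(right_row: np.ndarray, left_row: np.ndarray) -> int:
    """Return the shift (in pixels) that best aligns the two focus rows."""
    right = right_row - right_row.mean()
    left = left_row - left_row.mean()
    corr = np.correlate(right, left, mode="full")     # full cross-correlation
    lags = np.arange(-len(left) + 1, len(right))      # lag for each correlation value
    return int(lags[np.argmax(corr)])                 # lag at the correlation peak

def lens_move_distance(offset_px: int, k_um_per_pixel: float = 2.0) -> float:
    """Hypothetical one-to-one mapping from row offset to lens travel (micrometres)."""
    return k_um_per_pixel * offset_px

# Example: in-focus rows overlap (offset 0); defocused rows are shifted copies.
left_row = np.array([0, 1, 4, 9, 4, 1, 0, 0, 0, 0], dtype=float)
right_row = np.roll(left_row, 2)                      # simulate a 2-pixel shift
offset = estimate_row_offset(right_row, left_row)
print(offset, lens_move_distance(offset))             # prints: 2 4.0
```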
In addition, "physically adjacent" as used above refers to adjacency as defined at design time; it does not necessarily mean that the units are immediately next to each other in position on the image sensor 10, and there may be one or more pixel units 12 between a second pixel unit 123 and a third pixel unit 125.
In summary, the image sensor 10 of the embodiment of the present application both achieves fast focusing and prevents reflected light from disturbing the images of the surrounding first pixel units 121. The image sensor 10 of the embodiment of the present application can be widely used in fields such as consumer electronics, security monitoring, industrial automation, artificial intelligence, and the Internet of Things for the collection and organization of image data, providing an information source for subsequent processing and applications.
In some embodiments, the second pixel units 123 and the third pixel units 125 are arranged alternately and adjacently.
In this way, one manner in which the second pixel units 123 and the third pixel units 125 are physically adjacent is achieved. Specifically, the second pixel units 123 and the third pixel units 125 being arranged alternately and adjacently means that there is at least one third pixel unit 125 in the vicinity of each second pixel unit 123 and at least one second pixel unit 123 in the vicinity of each third pixel unit 125; one or more pixel units 12 may lie between a second pixel unit 123 and a third pixel unit 125. The second pixel units 123 and the third pixel units 125 may be arranged in the same row or in different rows, and the arrangement is not specifically limited. For example, in a 3×3 array of pixel units 12, if the middle pixel unit is the second pixel unit 123, then at least one of the other eight pixel units (the upper, lower, left, right, upper-left, upper-right, lower-left, and lower-right pixel units) is a third pixel unit 125.
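The adjacency condition described in this paragraph can be pictured with a small, purely hypothetical layout; the 6×6 pattern below and the helper functions are illustrative assumptions and do not reproduce the arrangement of FIG. 6.

```python
# Illustrative check of the adjacency rule: each focusing unit of one type has a
# focusing unit of the other type among its (up to eight) neighbours.
N, R, L = "N", "R", "L"   # Normal pixel, Right PD pixel, Left PD pixel

layout = [
    [N, N, N, N, N, N],
    [N, R, L, N, N, N],
    [N, N, N, N, N, N],
    [N, N, N, N, R, L],
    [N, N, N, N, N, N],
    [N, N, N, N, N, N],
]

def neighbours(grid, r, c):
    """Yield the pixel types surrounding grid[r][c]."""
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) != (0, 0) and 0 <= r + dr < len(grid) and 0 <= c + dc < len(grid[0]):
                yield grid[r + dr][c + dc]

def adjacency_ok(grid):
    """Every R must have an L neighbour and every L must have an R neighbour."""
    partner = {R: L, L: R}
    return all(
        partner[grid[r][c]] in neighbours(grid, r, c)
        for r in range(len(grid)) for c in range(len(grid[0]))
        if grid[r][c] in partner
    )

print(adjacency_ok(layout))   # prints: True for this example layout
```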
In some embodiments, the size of the photosensitive element 122 of the second pixel unit 123 is the same as the size of the photosensitive element 122 of the third pixel unit 125. In this way, when the second pixel unit 123 and the third pixel unit 125 are at the same position, the signal intensities collected by the photosensitive element 122 of the second pixel unit 123 and the photosensitive element 122 of the third pixel unit 125 are the same.
Further, the size of the photosensitive element 122 of the second pixel unit 123 is half the size of the photosensitive element 122 of the first pixel unit 121. In this way, the size of the photosensitive element 122 of the second pixel unit 123 used for focusing and the size of the photosensitive element 122 of the third pixel unit 125 are both smaller than the size of the photosensitive element 122 of the first pixel unit 121 used for imaging.
In one embodiment, referring to FIG. 6, the planar shape of the photosensitive element 122 is substantially rectangular; the length of the photosensitive element 122 of the second pixel unit 123 and the length of the photosensitive element 122 of the third pixel unit 125 are both equal to half the length of the photosensitive element 122 of the first pixel unit 121, and their widths are equal to the width of the photosensitive element 122 of the first pixel unit 121. In other embodiments, the widths of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 may instead both be equal to half the width of the photosensitive element 122 of the first pixel unit 121, with their lengths equal to the length of the photosensitive element 122 of the first pixel unit 121. It is also possible simply that the area of the photosensitive surface 1222 of the photosensitive element 122 of the second pixel unit 123 and the area of the photosensitive surface 1222 of the photosensitive element 122 of the third pixel unit 125 are both smaller than the area of the photosensitive surface 1222 of the photosensitive element 122 of the first pixel unit 121.
In some embodiments, referring to FIG. 5, the pixel unit 12 has a central axis A. The photosensitive element 122 of the second pixel unit 123 is located on the right side of the central axis A of the second pixel unit 123, or more than half of the photosensitive element 122 of the second pixel unit 123 is located on the right side of the central axis A of the second pixel unit 123.
It can be understood that the photosensitive element 122 of the second pixel unit 123 is located toward the right side of the silicon layer 124 of the second pixel unit 123. When the size of the photosensitive element 122 of the second pixel unit 123 is smaller than half the size of the photosensitive element 122 of the first pixel unit 121, the photosensitive element 122 of the second pixel unit 123 is located on the right side of the central axis A of the second pixel unit 123. When the size of the photosensitive element 122 of the second pixel unit 123 is equal to half the size of the photosensitive element 122 of the first pixel unit 121, the photosensitive element 122 of the second pixel unit 123 is located on the right side of the central axis A of the second pixel unit 123 and the left edge of the photosensitive element 122 coincides with the central axis A. When the size of the photosensitive element 122 of the second pixel unit 123 is greater than half the size of the photosensitive element 122 of the first pixel unit 121, more than half of the photosensitive element 122 of the second pixel unit 123 is located on the right side of the central axis A of the second pixel unit 123.
In some embodiments, the pixel unit 12 has a central axis A. The photosensitive element 122 of the third pixel unit 125 is located on the left side of the central axis A of the third pixel unit 125, or more than half of the photosensitive element 122 of the third pixel unit 125 is located on the left side of the central axis A of the third pixel unit 125.
It can be understood that the photosensitive element 122 of the third pixel unit 125 is located toward the left side of the silicon layer 124 of the third pixel unit 125. When the size of the photosensitive element 122 of the third pixel unit 125 is smaller than half the size of the photosensitive element 122 of the first pixel unit 121, the photosensitive element 122 of the third pixel unit 125 is located on the left side of the central axis A of the third pixel unit 125. When the size of the photosensitive element 122 of the third pixel unit 125 is equal to half the size of the photosensitive element 122 of the first pixel unit 121, the photosensitive element 122 of the third pixel unit 125 is located on the left side of the central axis A of the third pixel unit 125 and the right edge of the photosensitive element 122 coincides with the central axis A. When the size of the photosensitive element 122 of the third pixel unit 125 is greater than half the size of the photosensitive element 122 of the first pixel unit 121, more than half of the photosensitive element 122 of the third pixel unit 125 is located on the left side of the central axis A of the third pixel unit 125.
In some embodiments, the sum of the photosensitive area of the photosensitive element 122 of the second pixel unit 123 and the photosensitive area of the photosensitive element 122 of the third pixel unit 125 accounts for less than 6% of the total photosensitive area of the pixel units 12.
In this way, the photosensitive area of the pixel units 12 used for imaging is kept sufficiently large, thereby ensuring the imaging quality of the image sensor 10. In one embodiment, the sum of the photosensitive area of the photosensitive element 122 of the second pixel unit 123 and the photosensitive area of the photosensitive element 122 of the third pixel unit 125 accounts for 5% of the total photosensitive area of the pixel units 12. It should be noted that, in the present application, the photosensitive area of the photosensitive element 122 is proportional to its size: the larger the photosensitive element 122, the larger its photosensitive area.
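A quick worked example (an assumption-laden illustration, not data from the patent) shows how such a percentage might be checked. The array size, the number of focusing pairs, and the reading of "total photosensitive area" as the sum of all photodiode areas are assumptions.

```python
# Illustrative arithmetic for the "< 6%" focusing-area share.
total_units = 20 * 20                   # hypothetical pixel array
focus_pairs = 20                        # hypothetical number of (second, third) unit pairs
normal_pd_area = 1.0                    # first-pixel-unit photodiode area (arbitrary unit)
focus_pd_area = 0.5 * normal_pd_area    # half-size photodiodes, as in the embodiment

focus_area = 2 * focus_pairs * focus_pd_area
normal_area = (total_units - 2 * focus_pairs) * normal_pd_area
share = focus_area / (focus_area + normal_area)
print(f"{share:.2%}")                   # prints: 5.26%, below the 6% bound
```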
Referring to FIG. 8, in some embodiments, the pixel unit 12 includes a switch assembly 126. The switch assembly 126 is connected to the photosensitive element 122 and is configured to selectively empty or output the charge of the photosensitive element 122.
Specifically, the switch assembly 126 includes a transfer transistor TX, a reset transistor RST, a source-follower transistor SF, and a select transistor SEL. The transfer transistor TX, the reset transistor RST, the source-follower transistor SF, and the select transistor SEL are all triodes. The emitter of the transfer transistor TX is connected to the photodiode, and the collector of the transfer transistor TX is connected to the emitter of the reset transistor RST. The base of the source-follower transistor SF is connected to the collector of the transfer transistor TX and the emitter of the reset transistor RST, and the collector of the source-follower transistor SF is connected to the emitter of the select transistor SEL. The collector of the select transistor SEL is used to connect a signal line 1262. The photosensitive element 122 is a photodiode PD.
In the example of FIG. 8, the floating diffusion region FD is the node to which the transfer transistor TX, the reset transistor RST, and the source-follower transistor SF are electrically connected. The source-follower transistor SF is a voltage follower whose function is to translate the voltage at the FD node to the PXD node, so that the voltage at the PXD node has a corresponding relationship with the voltage at the FD node. The signal line 1262 is connected to a back-end processing circuit to transmit the output signal of the image sensor 10 to the back-end processing circuit for processing.
In the present application, the difference among the first pixel unit 121, the second pixel unit 123, and the third pixel unit 125 lies in the size of the photosensitive element 122.
Referring to FIG. 9, in some embodiments, the image sensor 10 operates in a reset phase, an exposure phase, a reference voltage readout phase, and a signal voltage readout phase performed in time sequence. The switch assembly 126 is configured to empty the charge of the photosensitive element 122 during the reset phase. The switch assembly 126 is configured to cause the photosensitive element 122 to generate charge that accumulates in the photosensitive element 122 during the exposure phase. The switch assembly 126 is configured to output the reference voltage during the reference voltage readout phase. The switch assembly 126 is configured to output the charge of the photosensitive element 122 to form a signal voltage during the signal voltage readout phase.
Specifically, in the reset phase, the reset transistor RST and the transfer transistor TX are set to a high potential, the select transistor SEL is set to a low potential, and the charge in the photodiode PD is completely reset and cleared. In the exposure phase, the transfer transistor TX is at a low potential, the reset transistor RST remains at a high potential, and the select transistor SEL remains at a low potential; due to illumination, photo-generated electrons are produced in the photodiode PD and accumulate there. In the reference voltage readout phase, the select transistor SEL is set to a high potential and the reset transistor RST is set to a low potential; the voltage Vref is obtained at the source terminal of the select transistor SEL and is quantized and read out by the subsequent analog-to-digital conversion circuit. In the signal voltage readout phase, the select transistor SEL remains at a high potential, the reset transistor RST remains at a low potential, and the transfer transistor TX is turned on (high potential); the electrons in the photodiode PD enter the floating diffusion region FD, lowering the potential of the floating diffusion region FD. The transfer transistor TX is then turned off (low potential), and the voltage Vsig is obtained at the source terminal of the select transistor SEL and is quantized and read out by the subsequent analog-to-digital conversion circuit. Finally, when the back-end processing circuit performs processing, Vref - Vsig is the final corresponding image signal, where Vref is the voltage collected at the PXD node shown in FIG. 8 during the reference voltage readout and Vsig is the voltage collected at the PXD node during the signal voltage readout.
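The four phases and the difference Vref - Vsig can be summarized with the following simplified Python sketch. It is an illustrative model only; the reset level, the conversion gain, and the electron count are arbitrary assumed numbers rather than values from the patent.

```python
# Illustrative model of the reset / exposure / reference readout / signal readout
# sequence and the resulting image signal Vref - Vsig.
from dataclasses import dataclass

@dataclass
class PixelState:
    pd_electrons: float = 0.0   # charge accumulated in the photodiode PD
    v_fd: float = 3.0           # floating-diffusion potential after reset (assumed volts)

CONVERSION_GAIN = 1e-4          # assumed volts per electron at the FD node

def reset(px: PixelState) -> None:
    """Reset phase: RST and TX high, SEL low; PD charge cleared."""
    px.pd_electrons = 0.0
    px.v_fd = 3.0

def expose(px: PixelState, photo_electrons: float) -> None:
    """Exposure phase: TX low; photo-generated electrons accumulate in the PD."""
    px.pd_electrons += photo_electrons

def read_reference(px: PixelState) -> float:
    """Reference readout: SEL high, RST low; FD level before charge transfer."""
    return px.v_fd              # Vref, as seen at the PXD node via the source follower

def read_signal(px: PixelState) -> float:
    """Signal readout: TX pulsed high; transferred charge lowers the FD potential."""
    px.v_fd -= CONVERSION_GAIN * px.pd_electrons
    px.pd_electrons = 0.0
    return px.v_fd              # Vsig

px = PixelState()
reset(px)
expose(px, photo_electrons=5000)   # a smaller photodiode would collect fewer electrons
vref = read_reference(px)
vsig = read_signal(px)
print(vref - vsig)                 # prints: 0.5, the image signal Vref - Vsig
```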
In some embodiments, the photosensitive element 122 is N-type doped.
It can be understood that the photosensitive element 122 is fabricated by ion implantation (a standard semiconductor process). In one embodiment, the photosensitive element 122 is an N-type doped photodiode.
Referring to FIG. 10 and FIG. 11, the imaging device 100 of the embodiment of the present application includes the image sensor 10 of the above embodiments.
In the imaging device 100 of the embodiment of the present application, by reducing the size of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 used for focusing, the additional metal light-shielding layer can be omitted. This reduces the interference of additional reflected light on the surrounding first pixel units 121 used for imaging, so that the image quality formed by the imaging pixel units 12 around the focusing pixel units 12 remains stable and consistent.
In some embodiments, the imaging device 100 includes a processing circuit 20 connected to the image sensor 10. The processing circuit 20 is configured to determine the imaging information of the image sensor 10 according to the output signal of the first pixel unit 121, and to determine the focusing information according to the output signals of the second pixel unit 123 and the third pixel unit 125.
In this way, the imaging device 100 can both use the first pixel unit 121 for imaging and, from the brightness-distribution offset between the second pixel unit row 127 and the third pixel unit row 129, calculate the distance the lens of the imaging device 100 must currently move for accurate focusing and control the motor to move the lens to achieve accurate focusing, so that the focusing speed is greatly improved.
Further, in the embodiment of FIG. 11, the imaging device 100 further includes a lens module 30 and a circuit board 40. The lens module 30 includes a lens barrel 32, a lens holder 34, and a lens 36. The image sensor 10 and the processing circuit 20 are arranged on the circuit board 40, the lens holder 34 is arranged on the circuit board 40, the lens barrel 32 is connected to the lens holder 34, and the lens 36 is arranged in the lens barrel 32. The number of lenses 36 may be one, two, or more than two. The imaging device 100 may further include a driving module 50 (such as a motor). The driving module 50 is used to drive the lens barrel 32 and/or the lens 36 to move back and forth along the optical axis OP of the lens module 30, thereby moving the lens to achieve accurate focusing.
The mobile platform of the embodiment of the present application includes the imaging device 100 of the above embodiments.
In the mobile platform of the embodiment of the present application, by reducing the size of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 used for focusing, the additional metal light-shielding layer can be omitted. This reduces the interference of additional reflected light on the surrounding first pixel units 121 used for imaging, so that the image quality formed by the imaging pixel units 12 around the focusing pixel units 12 remains stable and consistent.
It can be understood that the mobile platform may be an unmanned aerial vehicle, an unmanned vehicle, a mobile robot, or another mobile platform equipped with the imaging device 100.
In the description of this specification, descriptions referring to the terms "one embodiment", "some embodiments", "exemplary embodiment", "example", "specific example", "some examples", and the like mean that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic references to the above terms do not necessarily refer to the same embodiment or example. Moreover, the described specific features, structures, materials, or characteristics may be combined in a suitable manner in any one or more embodiments or examples.
Although the embodiments of the present application have been shown and described, those of ordinary skill in the art will understand that various changes, modifications, substitutions, and variations may be made to these embodiments without departing from the principles and spirit of the present application, the scope of which is defined by the claims and their equivalents.

Claims (17)

  1. An image sensor, characterized by comprising a plurality of pixel units, each of the pixel units comprising a photosensitive element located within the pixel unit, wherein the plurality of pixel units comprises a first pixel unit, a second pixel unit, and a third pixel unit, the first pixel unit is used for imaging, the second pixel unit and the third pixel unit are used for focusing, the size of the photosensitive element of the first pixel unit is larger than the size of the photosensitive element of the second pixel unit and larger than the size of the photosensitive element of the third pixel unit, the photosensitive element of the second pixel unit is located on the right side of the second pixel unit, and the photosensitive element of the third pixel unit is located on the left side of the third pixel unit.
  2. The image sensor according to claim 1, wherein the second pixel units and the third pixel units are arranged alternately and adjacently.
  3. The image sensor according to claim 1, wherein the size of the photosensitive element of the second pixel unit is the same as the size of the photosensitive element of the third pixel unit.
  4. The image sensor according to claim 3, wherein the size of the photosensitive element of the second pixel unit is half the size of the photosensitive element of the first pixel unit.
  5. The image sensor according to claim 1, wherein the pixel unit has a central axis,
    the photosensitive element of the second pixel unit is located on the right side of the central axis of the second pixel unit, or more than half of the photosensitive element of the second pixel unit is located on the right side of the central axis of the second pixel unit.
  6. The image sensor according to claim 1, wherein the pixel unit has a central axis,
    the photosensitive element of the third pixel unit is located on the left side of the central axis of the third pixel unit, or more than half of the photosensitive element of the third pixel unit is located on the left side of the central axis of the third pixel unit.
  7. The image sensor according to claim 1, wherein the image sensor comprises a lens layer, the lens layer comprises a plurality of lenses, and each lens corresponds to one of the pixel units and is located above the pixel unit.
  8. The image sensor according to claim 7, wherein the image sensor comprises a filter layer located between the pixel unit and the lens layer.
  9. The image sensor according to claim 8, wherein the image sensor comprises a buffer layer located between the pixel unit and the filter layer.
  10. The image sensor according to claim 1, wherein the sum of the photosensitive area of the photosensitive element of the second pixel unit and the photosensitive area of the photosensitive element of the third pixel unit accounts for less than 6% of the total photosensitive area of the pixel units.
  11. The image sensor according to claim 1, wherein the pixel unit comprises a switch assembly, the switch assembly is connected to the photosensitive element, and the switch assembly is configured to selectively empty or output the charge of the photosensitive element.
  12. The image sensor according to claim 11, wherein the image sensor operates in a reset phase, an exposure phase, a reference voltage readout phase, and a signal voltage readout phase performed in time sequence,
    the switch assembly is configured to empty the charge of the photosensitive element during the reset phase;
    the switch assembly is configured to cause the photosensitive element to generate charge that accumulates in the photosensitive element during the exposure phase;
    the switch assembly is configured to output a reference voltage during the reference voltage readout phase;
    the switch assembly is configured to output the charge of the photosensitive element to form a signal voltage during the signal voltage readout phase.
  13. The image sensor according to claim 11, wherein the switch assembly comprises a transfer transistor, a reset transistor, a source-follower transistor, and a select transistor, the transfer transistor, the reset transistor, the source-follower transistor, and the select transistor all being triodes; the emitter of the transfer transistor is connected to the photodiode, the collector of the transfer transistor is connected to the emitter of the reset transistor, the base of the source-follower transistor is connected to the collector of the transfer transistor and the emitter of the reset transistor, the collector of the source-follower transistor is connected to the emitter of the select transistor, and the collector of the select transistor is connected to a signal line.
  14. The image sensor according to claim 1, wherein the photosensitive element is N-type doped.
  15. An imaging device, characterized by comprising the image sensor according to any one of claims 1 to 14.
  16. The imaging device according to claim 15, wherein the imaging device comprises a processing circuit, the processing circuit is connected to the image sensor, and the processing circuit is configured to determine imaging information of the image sensor according to an output signal of the first pixel unit and to determine focusing information according to output signals of the second pixel unit and the third pixel unit.
  17. A mobile platform, characterized by comprising the imaging device according to claim 15 or 16.
PCT/CN2019/113097 2019-10-24 2019-10-24 Image sensor, imaging device and mobile platform WO2021077374A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2019/113097 WO2021077374A1 (zh) 2019-10-24 2019-10-24 Image sensor, imaging device and mobile platform
CN201980032071.0A CN112119624A (zh) 2019-10-24 2019-10-24 Image sensor, imaging device and mobile platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/113097 WO2021077374A1 (zh) 2019-10-24 2019-10-24 Image sensor, imaging device and mobile platform

Publications (1)

Publication Number Publication Date
WO2021077374A1 (zh)

Family

ID=73799349

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/113097 WO2021077374A1 (zh) 2019-10-24 2019-10-24 图像传感器、成像装置及移动平台

Country Status (2)

Country Link
CN (1) CN112119624A (zh)
WO (1) WO2021077374A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113992856A (zh) * 2021-11-30 2022-01-28 维沃移动通信有限公司 Image sensor, camera module and electronic device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102695007A (zh) * 2012-05-15 2012-09-26 格科微电子(上海)有限公司 Image sensor and driving method thereof
CN104519327A (zh) * 2013-10-07 2015-04-15 联咏科技股份有限公司 Image sensor and image acquisition system
CN204761572U (zh) * 2014-05-01 2015-11-11 半导体元件工业有限责任公司 Imaging device
CN105338263A (zh) * 2014-08-04 2016-02-17 Lg伊诺特有限公司 Image sensor and image pickup apparatus including the same
CN107040724A (zh) * 2017-04-28 2017-08-11 广东欧珀移动通信有限公司 Dual-core focusing image sensor, focusing control method therefor, and imaging device
US20170347046A1 (en) * 2014-12-18 2017-11-30 Lg Innotek Co., Ltd. Image sensor, image acquiring device comprising same, and portable terminal comprising the device
CN109167941A (zh) * 2018-11-09 2019-01-08 德淮半导体有限公司 Image sensor and manufacturing method thereof
CN110062144A (zh) * 2019-05-14 2019-07-26 德淮半导体有限公司 Phase-focusing image sensor, method of forming the same, and operating method thereof

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9451152B2 (en) * 2013-03-14 2016-09-20 Apple Inc. Image sensor with in-pixel depth sensing
US10070081B2 (en) * 2017-02-03 2018-09-04 SmartSens Technology (U.S.), Inc. Stacked image sensor pixel cell with dynamic range enhancement and selectable shutter modes and in-pixel CDS
CN108281438A (zh) * 2018-01-18 2018-07-13 德淮半导体有限公司 Image sensor and method of forming the same

Also Published As

Publication number Publication date
CN112119624A (zh) 2020-12-22

Similar Documents

Publication Publication Date Title
US10074682B2 (en) Phase difference detection in pixels
CN106068563B (zh) 固态成像装置、固态成像装置的制造方法和电子设备
US9883128B2 (en) Imaging systems with high dynamic range and phase detection pixels
US10033949B2 (en) Imaging systems with high dynamic range and phase detection pixels
CN109981939B (zh) 成像系统
US10325947B2 (en) Global shutter image sensors with light guide and light shield structures
US20120194696A1 (en) Solid-state image sensor and camera
KR102372745B1 (ko) 이미지 센서 및 이를 구비하는 전자장치
JP2011216896A (ja) マイクロレンズ付き固体イメージセンサ及び非テレセントリック撮像レンズを備えた光学系
JP2009158800A (ja) 固体撮像素子及びこれを用いた撮像装置
US20170339355A1 (en) Imaging systems with global shutter phase detection pixels
CN104517982A (zh) 固态摄像装置、固态摄像装置的制造方法以及电子设备
KR102128467B1 (ko) 이미지 센서 및 이미지 센서를 포함하는 영상 촬영 장치
JP2007155930A (ja) 固体撮像素子及びこれを用いた撮像装置
WO2021077374A1 (zh) 图像传感器、成像装置及移动平台
KR20080068373A (ko) 주광선 손실을 보상하는 마이크로렌즈 어레이 및 이를포함하는 이미지센서 조립체
US20220028910A1 (en) Image sensor having two-colored color filters sharing one photodiode
US20220293659A1 (en) Image sensing device
US11671723B2 (en) Image sensing device including light shielding pattern
JP6507712B2 (ja) 光電変換素子、画像読取装置及び画像形成装置
JP6653482B2 (ja) 撮像装置、およびそれに用いられる固体撮像装置
US20080173791A1 (en) Image sensor with three sets of microlenses
US11889217B2 (en) Image sensor including auto-focus pixels
US20220102413A1 (en) Image sensing device
JP2004191630A (ja) 撮像装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19949599

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19949599

Country of ref document: EP

Kind code of ref document: A1