WO2020015560A1 - Image sensor and mobile terminal - Google Patents

Image sensor and mobile terminal

Info

Publication number
WO2020015560A1
WO2020015560A1 (PCT/CN2019/095366)
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
sub
preset
green
image sensor
Prior art date
Application number
PCT/CN2019/095366
Other languages
English (en)
French (fr)
Inventor
王丹妹
周华昭
朱盼盼
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 filed Critical 维沃移动通信有限公司
Priority to EP19837363.1A priority Critical patent/EP3826288A4/en
Priority to JP2021502977A priority patent/JP2021530875A/ja
Publication of WO2020015560A1 publication Critical patent/WO2020015560A1/zh
Priority to US17/152,368 priority patent/US11463642B2/en

Classifications

    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/14625Optical elements or arrangements associated with the device
    • H01L27/14627Microlenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/1462Coatings
    • H01L27/14621Colour filter arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/672Focus control based on electronic image sensor signals based on the phase difference signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/131Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/135Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/703SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/704Pixels specially adapted for focusing, e.g. phase difference pixel sets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/703SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/705Pixels for depth measurement, e.g. RGBZ
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/79Arrangements of circuitry being divided between different or multiple substrates, chips or circuit boards, e.g. stacked image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/33Transforming infrared radiation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/71Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/76Addressed sensors, e.g. MOS or CMOS sensors

Definitions

  • the present disclosure relates to the field of image processing technologies, and in particular, to an image sensor and a mobile terminal.
  • in the related art, the most common pixel array arrangement for image sensors based on complementary metal oxide semiconductor (CMOS) technology is the red (R), green (G), blue (B) Bayer pattern, which cannot detect object distance and only receives natural light.
  • the full-pixel dual-core focusing (2PD) arrangement, as a phase detection autofocus (PDAF) solution, can additionally detect the distance of the photographed object and complete focusing more quickly, but it too only receives natural light.
  • therefore, the pixel array arrangements of image sensors in the related art suffer from limited shooting scenes and slow focusing, which degrades the user's shooting experience.
  • the embodiments of the present disclosure provide an image sensor and a mobile terminal, to solve the problems of the related-art pixel array arrangement: limited shooting scenes, slow focusing, and a degraded user shooting experience.
  • an image sensor including:
  • a pixel array including a preset number of pixel units arranged in a predetermined manner, the pixel unit including a first pixel and a second pixel adjacent to the first pixel; the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel, and the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and a preset sub-pixel;
  • the first pixel and the second pixel are both full-pixel dual-core focus pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels;
  • the preset sub-pixel receives an infrared light band and one of a red light band, a green light band, and a blue light band, or the preset sub-pixel receives an infrared light band, a red light band, a green light band, and a blue light band.
  • an embodiment of the present disclosure further provides a mobile terminal including an imaging system, where the imaging system includes: the above image sensor; a lens module;
  • a driving module configured to drive the lens module to move;
  • a filtering module arranged between the lens module and the image sensor;
  • an image data processing module connected to the image sensor; and
  • a display module connected to the image data processing module.
  • the technical solution of the present disclosure improves the RGB pixel array arrangement of a 2PD image sensor into a pixel array arrangement combining RGB and preset pixels, so that distance detection can be implemented in the 2PD form to ensure fast focusing;
  • by providing preset pixels that receive different light bands, the amount of incoming light can be increased, the photoelectric conversion efficiency can be improved, and the dark-state photographing effect can be guaranteed, meeting users' needs.
  • FIG. 1a shows a schematic diagram of an RGB arrangement in the related art;
  • FIG. 1b shows a cross-sectional view of a pixel;
  • FIG. 1c shows a pixel array arrangement of 2PD;
  • FIG. 1d shows a cross-sectional view of a 2PD pixel;
  • FIG. 2a shows a first schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 2b shows a second schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 2c shows a third schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 3a shows a fourth schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 3b shows a fifth schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 4a shows a sixth schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 4b shows a seventh schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 5a shows an eighth schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 5b shows a ninth schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 6a shows a tenth schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 6b shows an eleventh schematic diagram of a pixel unit provided by an embodiment of the present disclosure;
  • FIG. 7 shows a cross-sectional view of pixels provided by an embodiment of the present disclosure;
  • FIG. 8 shows a schematic diagram of a mobile terminal provided by an embodiment of the present disclosure;
  • FIG. 9 shows a schematic diagram of an imaging system provided by an embodiment of the present disclosure;
  • FIG. 10 shows a schematic diagram of a hardware structure of a mobile terminal provided by an embodiment of the present disclosure.
  • An embodiment of the present disclosure provides an image sensor including a pixel array including a preset number of pixel units arranged in a predetermined manner, as shown in FIGS. 2a to 2c, 3a to 3b, and 4a to 4b.
  • the pixel unit includes a first pixel and a second pixel adjacent to the first pixel; the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel, and the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and a preset sub-pixel;
  • the first pixel and the second pixel are both full-pixel dual-core focus pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels;
  • the preset sub-pixel receives an infrared light band and one of a red light band, a green light band, and a blue light band, or the preset sub-pixel receives an infrared light band, a red light band, a green light band, and a blue light band.
  • the pixel array included in the image sensor provided in the embodiment of the present disclosure includes a preset number of pixel units, where the preset number of pixel units are arranged in a predetermined manner.
  • the preset number of pixel units each include a first pixel and a second pixel.
  • the first pixel is different from the second pixel.
  • the first pixel includes a red sub-pixel (R), a green sub-pixel (G), and a blue sub-pixel (B);
  • the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, and further includes a green sub-pixel and a preset sub-pixel (D).
  • the first pixel and the second pixel in the embodiment of the present disclosure are both full-pixel dual-core focus (2PD) pixels; by using 2PD pixels, the object distance can be detected and focusing can be completed more quickly.
  • both the first pixel and the second pixel are 2PD pixels, that is, the sub-pixels in the first pixel and the second pixel are 2PD sub-pixels.
  • each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels.
  • the red, green, and blue sub-pixels in the first pixel are arranged in a certain manner, and the first pixel includes a red sub-pixel, a blue sub-pixel, and two green sub-pixels.
  • the two green sub-pixels are referred to herein as a first green sub-pixel and a second green sub-pixel, respectively, where the first green sub-pixel is the same as the second green sub-pixel.
  • the red sub-pixel is adjacent to the first green sub-pixel, the second green sub-pixel is located below the red sub-pixel, the blue sub-pixel is located below the first green sub-pixel, and the second green sub-pixel is adjacent to the blue sub-pixel.
  • the second pixel is obtained by replacing a sub-pixel on the basis of the first pixel.
  • the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, and further includes a green sub-pixel and a preset sub-pixel.
  • that is, the second pixel may include a red sub-pixel, a green sub-pixel, and a preset sub-pixel, in which case the preset sub-pixel replaces the blue sub-pixel of the first pixel; it may include a green sub-pixel, a blue sub-pixel, and a preset sub-pixel, in which case the preset sub-pixel replaces the red sub-pixel of the first pixel; or it may include a green sub-pixel, a red sub-pixel, a blue sub-pixel, and a preset sub-pixel.
  • the position of the preset sub-pixel in the second pixel may be the same as the position of a certain sub-pixel in the first pixel, or the same as the positions of two adjacent 1/2 sub-pixels of different colors in the first pixel.
  • alternatively, a 1/2 preset sub-pixel in the second pixel may occupy the same position as any 1/2 sub-pixel in the first pixel; in that case, the 1/2 preset sub-pixels of two adjacent second pixels together form one preset sub-pixel.
  • for example, the 1/2 preset sub-pixel in one second pixel occupies the position of the 1/2 red sub-pixel in the first pixel, and the 1/2 preset sub-pixel in the other second pixel occupies the position of the 1/2 green sub-pixel in the first pixel, so that a complete preset sub-pixel is formed by combining the two second pixels.
  • the preset sub-pixel in the embodiments of the present disclosure can receive an infrared light band and one of a red light band, a green light band, and a blue light band; that is, in addition to receiving the infrared light band, it may receive the red light band, the green light band, or the blue light band. Alternatively, the preset sub-pixel receives the infrared light band, the red light band, the green light band, and the blue light band, that is, it receives all three color bands in addition to the infrared band. The specific light bands received by the preset sub-pixel can be set according to actual needs.
  • in this way, distance detection can be implemented in the 2PD form to ensure fast focusing;
  • by providing preset pixels that receive different light bands, the amount of incoming light can be increased, the photoelectric conversion efficiency can be improved, and the dark-state photographing effect can be guaranteed, meeting users' needs.
  • the position of the preset sub-pixel in the second pixel is the same as the position of the red sub-pixel, the green sub-pixel, or the blue sub-pixel in the first pixel; or
  • the position of the preset sub-pixel in the second pixel is the same as the position of a first combined sub-pixel in the first pixel, or the same as the position of a second combined sub-pixel in the first pixel;
  • the first combined sub-pixel is a combination of a 1/2 red sub-pixel and a 1/2 green sub-pixel adjacent to each other; the second combined sub-pixel is a combination of a 1/2 green sub-pixel and a 1/2 blue sub-pixel adjacent to each other.
  • when the position of the preset sub-pixel in the second pixel is the same as the position of the red sub-pixel in the first pixel, the second pixel includes a blue sub-pixel, two green sub-pixels, and a preset sub-pixel; that is, on the basis of the first pixel, the red sub-pixel is replaced with a preset sub-pixel.
  • when the position of the preset sub-pixel in the second pixel is the same as the position of the blue sub-pixel in the first pixel, the second pixel includes a red sub-pixel, two green sub-pixels, and a preset sub-pixel; that is, on the basis of the first pixel, the blue sub-pixel is replaced with a preset sub-pixel.
  • when the position of the preset sub-pixel in the second pixel is the same as the position of one of the green sub-pixels in the first pixel, the second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and a preset sub-pixel; that is, on the basis of the first pixel, one of the green sub-pixels is replaced with a preset sub-pixel.
  • when the position of the preset sub-pixel in the second pixel is the same as the position of the first combined sub-pixel in the first pixel, the second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and a preset sub-pixel; in this case, on the basis of the first pixel, the adjacent 1/2 red sub-pixel and 1/2 green sub-pixel of the 2PD sub-pixels are taken as the preset sub-pixel, that is, the position of the preset sub-pixel in the second pixel is the same as the positions of the adjacent 1/2 green sub-pixel and 1/2 red sub-pixel in the first pixel.
  • when the position of the preset sub-pixel in the second pixel is the same as the position of the second combined sub-pixel in the first pixel, the second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and a preset sub-pixel; in this case, on the basis of the first pixel, the adjacent 1/2 blue sub-pixel and 1/2 green sub-pixel of the 2PD sub-pixels are taken as the preset sub-pixel, that is, the position of the preset sub-pixel in the second pixel is the same as the positions of the adjacent 1/2 green sub-pixel and 1/2 blue sub-pixel in the first pixel.
  • the pixel unit includes a second pixel and at least one first pixel.
  • the pixel unit includes a second pixel and at least one first pixel, and the number of pixels in the pixel unit is at least two.
  • when the number of pixels in the pixel unit is two, the pixel unit includes one first pixel and one second pixel.
  • for example, as shown in FIG. 5a, the pixel unit includes one first pixel and one second pixel, where the second pixel includes a red sub-pixel, a green sub-pixel, and a preset sub-pixel, and the proportion of the preset sub-pixel in the pixel unit is 1/8.
  • when the number of pixels in the pixel unit is three, the pixel unit includes two first pixels and one second pixel.
  • for example, as shown in FIG. 5b, the pixel unit includes two first pixels and one second pixel, where the second pixel may include a blue sub-pixel, a green sub-pixel, and a preset sub-pixel, and the proportion of the preset sub-pixel in the pixel unit is 1/12.
  • when the number of pixels in the pixel unit is four, the pixel unit includes three first pixels and one second pixel.
  • for example, as shown in FIG. 3a, the pixel unit includes three first pixels and one second pixel, where the second pixel includes a blue sub-pixel, a green sub-pixel, a red sub-pixel, and a preset sub-pixel; on the basis of the first pixel, the 1/2 red sub-pixel and the 1/2 green sub-pixel of the 2PD sub-pixels are taken as the preset sub-pixel, and the proportion of the preset sub-pixel in the pixel unit is 1/16.
  • the pixel array may take a 1/8-density RGB+D pixel unit, a 1/12-density RGB+D pixel unit, or a 1/16-density RGB+D pixel unit as a unit array, and the unit array is then arranged periodically to form the pixel array.
  • the pixel array can also be in other forms, which will not be enumerated here.
  • the above placement manners of the preset sub-pixel are only examples; other placement manners are also possible and are not described one by one in the embodiments of the present disclosure.
  • the position of the preset sub-pixel in the pixel unit (that is, the position of the second pixel) is not limited in the embodiments of the present disclosure.
  • the proportion of the preset sub-pixel in the pixel unit is 1/(4n), where n is an integer greater than or equal to 2, and the size of the pixel array to which the preset sub-pixel applies is not limited.
  • as shown in FIG. 4a and FIG. 4b, the position of the 1/2 preset sub-pixel in the second pixel is the same as the position of the 1/2 red sub-pixel, the 1/2 green sub-pixel, or the 1/2 blue sub-pixel in the first pixel, and the 1/2 preset sub-pixels of two adjacent second pixels form one preset sub-pixel.
  • the second pixel may include only 1/2 preset sub-pixels, and a complete preset sub-pixel may be obtained by combining two adjacent second pixels.
  • the position of the 1/2 preset sub-pixel in the second pixel may be the same as the position of the 1/2 red sub-pixel, the 1/2 green sub-pixel, or the 1/2 blue sub-pixel in the first pixel.
  • when the position of the 1/2 preset sub-pixel in one second pixel is the same as the position of the 1/2 red sub-pixel in the first pixel, the position of the 1/2 preset sub-pixel in the other second pixel is the same as the position of the 1/2 green sub-pixel in the first pixel.
  • when the position of the 1/2 preset sub-pixel in one second pixel is the same as the position of the 1/2 green sub-pixel in the first pixel, the position of the 1/2 preset sub-pixel in the other second pixel is the same as the position of the 1/2 blue sub-pixel or the 1/2 red sub-pixel in the first pixel.
  • the number of the second pixels is two, and the number of the first pixels is greater than or equal to zero.
  • the number of pixels in the pixel unit is at least two, so the pixel unit includes two second pixels and a number of first pixels greater than or equal to zero.
  • when the number of pixels in the pixel unit is two, the pixel unit includes two second pixels.
  • for example, as shown in FIG. 6a, the pixel unit includes two second pixels: one second pixel includes one red sub-pixel, two green sub-pixels, a 1/2 blue sub-pixel, and a 1/2 preset sub-pixel, where the position of the 1/2 preset sub-pixel in that second pixel is the same as the position of the 1/2 blue sub-pixel in the first pixel;
  • the other second pixel includes one red sub-pixel, one green sub-pixel, one blue sub-pixel, a 1/2 green sub-pixel, and a 1/2 preset sub-pixel, where the position of the 1/2 preset sub-pixel is the same as the position of the 1/2 green sub-pixel in the first pixel.
  • the proportion of the preset sub-pixel in the pixel unit is 1/8.
  • when the number of pixels in the pixel unit is three, the pixel unit includes two second pixels and one first pixel. As shown in FIG. 6b, when the pixel unit includes two second pixels and one first pixel, one second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, a 1/2 green sub-pixel, and a 1/2 preset sub-pixel, where the position of the 1/2 preset sub-pixel in that second pixel may be the same as the position of the 1/2 green sub-pixel in the first pixel; the other second pixel includes two green sub-pixels, one blue sub-pixel, a 1/2 red sub-pixel, and a 1/2 preset sub-pixel, where the position of the 1/2 preset sub-pixel is the same as the position of the 1/2 red sub-pixel in the first pixel. The proportion of the preset sub-pixel in the pixel unit is 1/12.
  • when the number of pixels in the pixel unit is four, the pixel unit includes two second pixels and two first pixels.
  • for example, as shown in FIG. 4b, the pixel unit includes two second pixels and two first pixels: one second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, a 1/2 green sub-pixel, and a 1/2 preset sub-pixel, where the position of the 1/2 preset sub-pixel in that second pixel is the same as the position of the 1/2 green sub-pixel in the first pixel;
  • the other second pixel includes a blue sub-pixel, two green sub-pixels, a 1/2 red sub-pixel, and a 1/2 preset sub-pixel, where the position of the 1/2 preset sub-pixel is the same as the position of the 1/2 red sub-pixel in the first pixel.
  • the proportion of the preset sub-pixel in the pixel unit is 1/16.
  • the above are just a few corresponding implementations, and can be modified on this basis, which will not be explained one by one here.
  • the density of the preset sub-pixel in the pixel unit is 1/(4n), where n is an integer greater than or equal to 2, and the size of the pixel array to which the preset sub-pixel applies is not limited.
  • the preset sub-pixel is used to receive the blue light band and the infrared light band.
  • the preset sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a first color filter, and a micromirror, which are sequentially stacked.
  • the first color filter is composed of a blue filter and an infrared filter.
  • the semiconductor layer, the metal layer, the photodiode, the first color filter, and the micromirror included in the preset sub-pixel are arranged in order from bottom to top.
  • the semiconductor layer here may be a silicon substrate, but is not limited thereto.
  • the first color filter is an array of filter units, including a blue filter and an infrared filter. At this time, the preset sub-pixel can receive the blue light band and the infrared light band.
  • the preset sub-pixel is used to receive the green light band and the infrared light band;
  • the preset sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a second color filter, and a micromirror, which are sequentially stacked and arranged.
  • the second color filter is composed of a green filter and an infrared filter.
  • the second color filter is an array of filter units, including a green filter and an infrared filter.
  • the preset sub-pixel can receive the green light band and the infrared light band.
  • the preset sub-pixel is used to receive the red light band and the infrared light band;
  • the preset sub-pixel includes: a semiconductor layer, a metal layer, a photodiode, a third color filter, and a micromirror, which are sequentially stacked
  • the third color filter is composed of a red filter and an infrared filter.
  • the third color filter is an array of filter units, including a red filter and an infrared filter.
  • the preset sub-pixel can receive the red light band and the infrared light band.
  • the preset sub-pixel is used for receiving a red light band, a green light band, a blue light band, and an infrared light band;
  • the preset sub-pixel includes: a semiconductor layer, a metal layer, a photodiode, a fourth color filter, and a micromirror, which are sequentially stacked.
  • the fourth color filter consists of a red filter, a green filter, a blue filter, and an infrared filter.
  • the fourth color filter is an array of filter units, including a red filter, a green filter, a blue filter, and an infrared filter.
  • the preset sub-pixel can receive the red light band, the green light band, the blue light band, and the infrared light band.
  • under the same illumination, the wider the wavelength bandwidth that a single channel can pass, the more brightness the image sensor can obtain, so this can be used to improve the imaging effect in the dark state.
  • the red sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a red filter, and a micromirror, which are sequentially stacked;
  • the green sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a green filter, and a micromirror, which are sequentially stacked;
  • the blue sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a blue filter, and a micromirror, which are sequentially stacked.
  • the semiconductor layer, metal layer, photodiode, red filter, and micromirror included in the red sub-pixel are arranged in order from bottom to top.
  • the semiconductor layer, metal layer, photodiode, green filter, and micromirror included in the corresponding green sub-pixel are arranged in order from bottom to top.
  • the semiconductor layer, the metal layer, the photodiode, the blue filter, and the micromirror included in the blue sub-pixel are arranged in order from bottom to top.
  • the semiconductor layer here may be a silicon substrate, but is not limited thereto.
  • the structure of the red sub-pixel and the preset sub-pixel can be seen in FIG. 7, where the D filter in FIG. 7 may be the first, second, third, or fourth color filter;
  • by replacing the red filter with a green filter or a blue filter, the structure of the green sub-pixel or the blue sub-pixel is obtained.
  • the red, green, and blue sub-pixels are used to obtain the color information of the pixels of the composite image and block the entry of infrared rays; for example, only visible light with a wavelength of 380 to 700 nm is allowed to enter, so a complete and realistic color image can be generated directly under high illuminance.
  • the infrared wavelength is 750 to 1100 nm; since the preset sub-pixel can receive the infrared band, it can improve the imaging effect in the dark state and enable an infrared ranging function.
  • the image sensor is a complementary metal oxide semiconductor CMOS image sensor, a charge-coupled element CCD image sensor, or a quantum thin film image sensor.
  • the pixel array arrangement of the present disclosure is not limited to a particular image sensor type: it can be a CMOS-based image sensor, a charge-coupled device (CCD)-based image sensor, or a quantum-film-based image sensor.
  • the image sensor can of course be other types of image sensors.
  • the image sensor in the embodiment of the present disclosure can be applied to any electronic product including a camera module.
  • in this way, by improving the RGB pixel array arrangement of the 2PD image sensor into a pixel array arrangement combining RGB and preset pixels, distance detection can be implemented in the 2PD form to ensure fast focusing;
  • by providing preset pixels that receive different light bands, the amount of incoming light can be increased, the photoelectric conversion efficiency can be improved, and the dark-state photographing effect can be guaranteed, meeting users' needs.
  • as shown in FIG. 8 and FIG. 9, the mobile terminal 1 includes an imaging system 2, and the imaging system 2 includes the above-mentioned image sensor 21; the mobile terminal 1 further includes: a lens module 22; a driving module 23 for driving the lens module 22 to move; a filtering module 24 provided between the lens module 22 and the image sensor 21; an image data processing module 25 connected to the image sensor 21; and a display module 26 connected to the image data processing module 25.
  • the mobile terminal 1 includes an imaging system 2.
  • the imaging system 2 includes the image sensor 21 described above.
  • the imaging system 2 further includes a lens module 22 for focusing light.
  • the lens module 22 is connected to the driving module 23.
  • the driving module 23 is configured to adjust the position of the lens module 22 according to the distance of the object to be photographed.
  • a filter module 24 is provided between the lens module 22 and the image sensor 21, where light is focused by the lens module 22, and after passing through the filter module 24, it can be focused on the pixel array of the image sensor 21.
  • the image sensor 21 is connected to an image data processing module 25, and the image data processing module 25 is connected to a display module 26; after light is focused on the pixel array of the image sensor 21 and the image sensor 21 performs photoelectric conversion, the data is transmitted to the image data processing module 25, and after the image data processing module 25 processes the data, the result is presented on the display module 26 in the form of a picture.
  • the 2PD pixels in the image sensor 21 can be used to obtain the phase difference, so as to obtain the distance between the object and the imaging surface, thereby achieving fast focusing.
  • the filter module 24 in the embodiment of the present disclosure can pass a light wavelength of 380 nm to 1100 nm. At this time, after the light is focused by the lens module 22, it can be filtered by the filtering module 24, wherein the filtering module 24 can be used to pass natural light and infrared light, and can be used to ensure the imaging effect of the imaging system 2.
  • in this way, by improving the RGB pixel array arrangement of the 2PD image sensor into a pixel array arrangement combining RGB and preset pixels, distance detection can be implemented in the 2PD form to ensure fast focusing;
  • by providing preset pixels that receive different light bands, the amount of incoming light can be increased, the photoelectric conversion efficiency can be improved, and the dark-state photographing effect can be guaranteed, meeting users' needs.
  • FIG. 10 is a schematic diagram of a hardware structure of a mobile terminal that implements various embodiments of the present disclosure.
  • the mobile terminal 1000 includes, but is not limited to, a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and a power source 1011.
  • the mobile terminal 1000 further includes an imaging system.
  • the imaging system includes an image sensor and a lens module; a driving module for driving the lens module to move; a filtering module provided between the lens module and the image sensor; and an image connected to the image sensor A data processing module; and a display module connected to the image data processing module.
  • the filter module can pass the light wavelength of 380nm to 1100nm.
  • the image sensor includes a pixel array including a preset number of pixel units arranged in a predetermined manner.
  • the pixel unit includes a first pixel and a second pixel adjacent to the first pixel position.
  • the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel, and the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and a preset sub-pixel;
  • the first pixel and the second pixel are all full-pixel dual-core focus pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels;
  • the preset sub-pixel receives an infrared light band and one of a red light band, a green light band, and a blue light band, or the preset sub-pixel receives an infrared light band, a red light band, a green light band, and a blue light band.
  • the position of the preset sub-pixel in the second pixel is the same as that of the red, green, or blue sub-pixel in the first pixel;
  • the position of the preset sub-pixel in the second pixel is the same as the position of the first combined sub-pixel in the first pixel, or the position of the second combined sub-pixel in the first pixel;
  • the first combined sub-pixel is a combination of a 1/2 red sub-pixel and a 1/2 green sub-pixel adjacent to each other; the second combined sub-pixel is a combination of a 1/2 green sub-pixel and a 1/2 blue sub-pixel adjacent to each other.
  • the pixel unit includes a second pixel and at least one first pixel.
  • the position of the 1/2 preset sub-pixel in the second pixel is the same as the position of the 1/2 red sub-pixel, the 1/2 green sub-pixel, or the 1/2 blue sub-pixel in the first pixel, and the 1/2 preset sub-pixels of two adjacent second pixels constitute a preset sub-pixel.
  • the number of the second pixels is two, and the number of the first pixels is greater than or equal to zero.
  • the preset sub-pixels are used to receive the blue light band and the infrared light band;
  • the preset sub-pixel includes: a semiconductor layer, a metal layer, a photodiode, a first color filter, and a micromirror, which are sequentially stacked, and the first color filter is composed of a blue filter and an infrared filter.
  • the preset sub-pixels are used to receive green light band and infrared light band;
  • the preset sub-pixel includes: a semiconductor layer, a metal layer, a photodiode, a second color filter, and a micromirror, which are sequentially stacked, and the second color filter is composed of a green filter and an infrared filter.
  • the preset sub-pixels are used to receive the red light band and the infrared light band;
  • the preset sub-pixels include: a semiconductor layer, a metal layer, a photodiode, a third color filter, and a micromirror, which are sequentially stacked, and the third color filter is composed of a red filter and an infrared filter.
  • the preset sub-pixels are used to receive the red light band, the green light band, the blue light band, and the infrared light band;
  • the preset sub-pixel includes: a semiconductor layer, a metal layer, a photodiode, a fourth color filter, and a micromirror, which are sequentially stacked, and the fourth color filter is composed of a red filter, a green filter, a blue filter, and an infrared filter.
  • the red sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a red filter, and a micromirror, which are sequentially stacked;
  • the green sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a green filter, and a micromirror, which are sequentially stacked;
  • the blue sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a blue filter, and a micromirror, which are sequentially stacked.
  • the image sensor is a complementary metal oxide semiconductor CMOS image sensor, a charge-coupled element CCD image sensor, or a quantum thin film image sensor.
  • the above mobile terminal improves the RGB pixel array arrangement of the 2PD image sensor into a pixel array arrangement combining RGB and preset pixels, so that distance detection can be implemented in the 2PD form to ensure fast focusing;
  • by providing preset pixels that receive different light bands, the amount of incoming light can be increased, the photoelectric conversion efficiency can be improved, and the dark-state photographing effect can be guaranteed, meeting users' needs.
  • the structure of the mobile terminal shown in FIG. 10 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than shown in the figure, combine some components, or use a different component arrangement.
  • the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a car terminal, a wearable device, and a pedometer.
  • the radio frequency unit 1001 may be used for receiving and sending signals during the process of receiving and sending information or during a call; specifically, downlink data from a base station is received and then delivered to the processor 1010 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 1001 can also communicate with a network and other devices through a wireless communication system.
  • the mobile terminal provides users with wireless broadband Internet access through the network module 1002, such as helping users to send and receive email, browse web pages, and access streaming media.
  • the audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002 or stored in the memory 1009 into audio signals and output them as sound. Also, the audio output unit 1003 may also provide audio output (for example, a call signal receiving sound, a message receiving sound, etc.) related to a specific function performed by the mobile terminal 1000.
  • the audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 1004 is used to receive audio or video signals.
  • the input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042.
  • the graphics processor 10041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 1006, and the display unit here is the above-mentioned display module.
  • the image frames processed by the graphics processor 10041 may be stored in the memory 1009 (or other storage medium) or transmitted via the radio frequency unit 1001 or the network module 1002.
  • the graphics processor 10041 is the above-mentioned image data processing module.
  • the microphone 10042 can receive sound, and can process such sound into audio data.
  • the processed audio data can be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 1001 in the case of a telephone call mode and output.
  • the mobile terminal 1000 further includes at least one sensor 1005, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 10061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 10061 and/or the backlight when the mobile terminal 1000 is moved close to the ear.
  • as a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in various directions (generally three axes), and can detect the magnitude and direction of gravity when stationary, which can be used to identify the attitude of the mobile terminal (such as switching between landscape and portrait modes, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer and tapping); the sensor 1005 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here.
  • the display unit 1006 is configured to display information input by the user or information provided to the user.
  • the display unit 1006 may include a display panel 10061.
  • the display panel 10061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 1007 may be used to receive inputted numeric or character information, and generate key signal inputs related to user settings and function control of the mobile terminal.
  • the user input unit 1007 includes a touch panel 10071 and other input devices 10072.
  • the touch panel 10071, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 10071 with a finger, a stylus, or any suitable object or accessory).
  • the touch panel 10071 may include two parts, a touch detection device and a touch controller.
  • the touch detection device detects the position touched by the user, detects the signal caused by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 1010, and then receives commands from the processor 1010 and executes them.
  • the touch panel 10071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 1007 may also include other input devices 10072.
  • the other input device 10072 may include, but is not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and an operation lever, and details are not described herein again.
  • the touch panel 10071 may be overlaid on the display panel 10061.
  • after the touch panel 10071 detects a touch operation on or near it, the touch operation is transmitted to the processor 1010 to determine the type of the touch event, and the processor 1010 then provides a corresponding visual output on the display panel 10061 according to the type of the touch event.
  • although the touch panel 10071 and the display panel 10061 are implemented here as two independent components to realize the input and output functions of the mobile terminal, in some embodiments the touch panel 10071 and the display panel 10061 may be integrated to realize the input and output functions of the mobile terminal, which is not specifically limited here.
  • the interface unit 1008 is an interface through which an external device is connected to the mobile terminal 1000.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, and audio input / output (I / O) port, video I / O port, headphone port, and more.
  • the interface unit 1008 may be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the mobile terminal 1000, or may be used to transfer data between the mobile terminal 1000 and the external device.
  • the memory 1009 can be used to store software programs and various data.
  • the memory 1009 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function (such as a sound playback function and an image playback function), and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book).
  • the memory 1009 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • the processor 1010 is the control center of the mobile terminal; it connects various parts of the entire mobile terminal by using various interfaces and lines, and performs various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 1009 and calling data stored in the memory 1009, thereby monitoring the mobile terminal as a whole.
  • the processor 1010 may include one or more processing units; optionally, the processor 1010 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication.
  • it can be understood that the foregoing modem processor may alternatively not be integrated into the processor 1010.
  • the mobile terminal 1000 may further include a power source 1011 (such as a battery) for supplying power to various components.
  • optionally, the power source 1011 may be logically connected to the processor 1010 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
  • the mobile terminal 1000 includes some functional modules that are not shown, and details are not described herein again.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Power Engineering (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Color Television Image Signal Generators (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Optical Filters (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

Provided are an image sensor and a mobile terminal. The image sensor includes a pixel array, the pixel array includes a preset number of pixel units arranged in a predetermined manner, and each pixel unit includes a first pixel and a second pixel. The first pixel includes red, green, and blue sub-pixels; the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and a preset sub-pixel. The first pixel and the second pixel are both full-pixel dual-core focus pixels, and each of them includes four full-pixel dual-core focus sub-pixels. The preset sub-pixel receives an infrared light band and one of a red light band, a green light band, and a blue light band, or the preset sub-pixel receives an infrared light band, a red light band, a green light band, and a blue light band.

Description

Image Sensor and Mobile Terminal
Cross-Reference to Related Applications
This application claims priority to Chinese Patent Application No. 201810796852.4, filed in China on July 19, 2018, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image sensor and a mobile terminal.
Background
In the related art, smart electronic products have gradually become necessities in people's lives, and the photographing function, as an important feature of electronic products, keeps developing. With the spread and popularization of photographing, users are no longer satisfied with a camera that merely takes pictures; they increasingly expect diverse photographic effects, ways of playing, and functions.
On the market, the most common pixel array arrangement of image sensors based on complementary metal oxide semiconductor (CMOS) technology is the red (R), green (G), blue (B) Bayer pattern, as shown in FIG. 1a and FIG. 1b. However, this arrangement cannot detect the distance of an object and can only receive natural light, recording images under normal illumination.
The pixel array arrangement of the full-pixel dual-core focus (2PD) technology is shown in FIG. 1c and FIG. 1d. This arrangement can likewise only receive natural light and is used to record images; however, with the phase detection autofocus (PDAF) technology solution, it can additionally detect the distance of an object and complete the focusing action more quickly.
The principle of 2PD phase detection is as follows: as can be seen from FIG. 1c and FIG. 1d, some of the R, G, and B sub-pixels in the pixel array are each divided into two halves, and the light energy captured from different incident directions differs, so the left sub-pixel point and the right sub-pixel point form a phase detection pair; when the brightness values of the left sub-pixel point and the right sub-pixel point both reach their relative maximum peaks, the image is at its sharpest, that is, in focus, and the object distance is then obtained through calculation, thereby achieving fast focusing.
In summary, the pixel array arrangement of a CMOS image sensor cannot detect object distance and can only receive natural light; although the pixel array arrangement of the 2PD technology can detect object distance, it can still only receive natural light. Therefore, the pixel array arrangements of image sensors in the related art suffer from limited shooting scenes and slow focusing, which degrades the user's shooting experience.
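The phase detection principle described above can be illustrated with a short sketch. The following Python snippet is a hypothetical illustration only (the function, signal values, and search range are not taken from the present disclosure): it treats the left-half and right-half 2PD sub-pixel outputs along one line as two signals and estimates their phase difference as the shift that minimizes the mean absolute difference; when the scene is in focus this shift tends toward zero, and a non-zero shift tells the driving module which way to move the lens.

```python
# Hypothetical sketch of 2PD phase detection: estimate the shift between the
# left-half and right-half sub-pixel signals; near focus the shift tends to zero.

def phase_difference(left, right, max_shift=4):
    """Return the shift of `right` relative to `left` that minimizes the
    mean absolute difference over a sufficiently large overlap."""
    min_overlap = len(left) - max_shift
    best_shift, best_cost = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        cost, count = 0.0, 0
        for i, value in enumerate(left):
            j = i + shift
            if 0 <= j < len(right):
                cost += abs(value - right[j])
                count += 1
        if count >= min_overlap and cost / count < best_cost:
            best_cost, best_shift = cost / count, shift
    return best_shift

# Toy example: the right-half signal is the left-half signal shifted by 3
# samples, as would happen for a defocused subject.
left_signal  = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0, 0, 0]
right_signal = [0, 0, 0, 0, 0, 1, 4, 9, 4, 1, 0, 0]
print(phase_difference(left_signal, right_signal))  # 3 -> drive the lens to reduce it
```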
Summary
Embodiments of the present disclosure provide an image sensor and a mobile terminal, to solve the problems that the pixel array arrangements of image sensors in the related art limit the shooting scenes, focus slowly, and degrade the user's shooting experience.
In a first aspect, an embodiment of the present disclosure provides an image sensor, including:
a pixel array, where the pixel array includes a preset number of pixel units arranged in a predetermined manner, and each pixel unit includes a first pixel and a second pixel adjacent to the first pixel; the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel, and the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and a preset sub-pixel;
the first pixel and the second pixel are both full-pixel dual-core focus pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels; and
the preset sub-pixel receives an infrared light band and one of a red light band, a green light band, and a blue light band, or the preset sub-pixel receives an infrared light band, a red light band, a green light band, and a blue light band.
In a second aspect, an embodiment of the present disclosure further provides a mobile terminal, including an imaging system, where the imaging system includes:
the image sensor described above;
a lens module;
a driving module configured to drive the lens module to move;
a filtering module arranged between the lens module and the image sensor;
an image data processing module connected to the image sensor; and
a display module connected to the image data processing module.
In the technical solutions of the present disclosure, the RGB pixel array arrangement of a 2PD image sensor is improved into a pixel array arrangement combining RGB and preset pixels, so that distance detection can be implemented in the 2PD form to ensure fast focusing; by providing preset pixels that receive different light bands, the amount of incoming light can be increased, the photoelectric conversion efficiency can be improved, and the dark-state photographing effect can be guaranteed, meeting users' needs.
Brief Description of the Drawings
FIG. 1a is a schematic diagram of an RGB arrangement in the related art;
FIG. 1b is a cross-sectional view of a pixel;
FIG. 1c is a pixel array arrangement diagram of 2PD;
FIG. 1d is a cross-sectional view of a 2PD pixel;
FIG. 2a is a first schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 2b is a second schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 2c is a third schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 3a is a fourth schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 3b is a fifth schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 4a is a sixth schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 4b is a seventh schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 5a is an eighth schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 5b is a ninth schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 6a is a tenth schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 6b is an eleventh schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 7 is a cross-sectional view of pixels according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a mobile terminal according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of an imaging system according to an embodiment of the present disclosure;
FIG. 10 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described below clearly and completely with reference to the accompanying drawings of the embodiments of the present disclosure. Apparently, the described embodiments are merely some rather than all of the embodiments of the present disclosure. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without creative effort shall fall within the protection scope of the present disclosure.
An embodiment of the present disclosure provides an image sensor, including a pixel array, where the pixel array includes a preset number of pixel units arranged in a predetermined manner. As shown in FIG. 2a to FIG. 2c, FIG. 3a to FIG. 3b, and FIG. 4a to FIG. 4b, each pixel unit includes a first pixel and a second pixel adjacent to the first pixel; the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel, and the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and a preset sub-pixel.
The first pixel and the second pixel are both full-pixel dual-core focus pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels; the preset sub-pixel receives an infrared light band and one of a red light band, a green light band, and a blue light band, or the preset sub-pixel receives an infrared light band, a red light band, a green light band, and a blue light band.
The pixel array included in the image sensor provided in this embodiment of the present disclosure contains a preset number of pixel units, and the preset number of pixel units are arranged in a predetermined manner. Each pixel unit includes a first pixel and a second pixel, where the first pixel differs from the second pixel: the first pixel includes a red sub-pixel (R), a green sub-pixel (G), and a blue sub-pixel (B), while the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and a preset sub-pixel (D).
The first pixel and the second pixel in this embodiment of the present disclosure are both full-pixel dual-core focus (2PD) pixels. By using 2PD pixels, the object distance can be detected and the focusing action can be completed more quickly. Both the first pixel and the second pixel are 2PD pixels, that is, the sub-pixels in the first pixel and the second pixel are 2PD sub-pixels. Moreover, in this embodiment of the present disclosure, each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels.
The red sub-pixel, the green sub-pixel, and the blue sub-pixel in the first pixel are arranged in a certain manner, and the first pixel contains one red sub-pixel, one blue sub-pixel, and two green sub-pixels. For ease of distinction, the two green sub-pixels are referred to here as a first green sub-pixel and a second green sub-pixel, where the first green sub-pixel is identical to the second green sub-pixel. The red sub-pixel is adjacent to the first green sub-pixel, the second green sub-pixel is located below the red sub-pixel, the blue sub-pixel is located below the first green sub-pixel, and the second green sub-pixel is adjacent to the blue sub-pixel.
The second pixel is obtained by replacing sub-pixels on the basis of the first pixel. The second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and a preset sub-pixel. That is, the second pixel may include a red sub-pixel, a green sub-pixel, and a preset sub-pixel, in which case the preset sub-pixel replaces the blue sub-pixel of the first pixel; it may include a green sub-pixel, a blue sub-pixel, and a preset sub-pixel, in which case the preset sub-pixel replaces the red sub-pixel of the first pixel; or it may include a green sub-pixel, a red sub-pixel, a blue sub-pixel, and a preset sub-pixel.
The position of the preset sub-pixel in the second pixel may be the same as the position of a certain sub-pixel in the first pixel, or the same as the positions of two adjacent 1/2 sub-pixels of different colors in the first pixel. Alternatively, a 1/2 preset sub-pixel in the second pixel may occupy the same position as any 1/2 sub-pixel in the first pixel; in that case, the 1/2 preset sub-pixels of two adjacent second pixels together form one preset sub-pixel. For example, the 1/2 preset sub-pixel in one second pixel occupies the position of the 1/2 red sub-pixel in the first pixel, and the 1/2 preset sub-pixel in the other second pixel occupies the position of the 1/2 green sub-pixel in the first pixel, so that a complete preset sub-pixel is formed by combining the two second pixels.
The preset sub-pixel in this embodiment of the present disclosure can receive an infrared light band and one of a red light band, a green light band, and a blue light band; that is, in addition to receiving the infrared light band, it may receive the red light band, the green light band, or the blue light band. Alternatively, the preset sub-pixel receives the infrared light band, the red light band, the green light band, and the blue light band, that is, it receives the red, green, and blue light bands in addition to the infrared light band. The specific light bands received by the preset sub-pixel can be set according to actual needs.
In this embodiment of the present disclosure, distance detection can be implemented in the 2PD form to ensure fast focusing, and by providing preset pixels that receive different light bands, the amount of incoming light can be increased, the photoelectric conversion efficiency can be improved, and the dark-state photographing effect can be guaranteed, meeting users' needs.
In this embodiment of the present disclosure, as shown in FIG. 2a to FIG. 2c and FIG. 3a to FIG. 3b, the position of the preset sub-pixel in the second pixel is the same as the position of the red sub-pixel, the green sub-pixel, or the blue sub-pixel in the first pixel; or
the position of the preset sub-pixel in the second pixel is the same as the position of a first combined sub-pixel in the first pixel, or the same as the position of a second combined sub-pixel in the first pixel;
where the first combined sub-pixel is a combination of a 1/2 red sub-pixel and a 1/2 green sub-pixel adjacent to each other, and the second combined sub-pixel is a combination of a 1/2 green sub-pixel and a 1/2 blue sub-pixel adjacent to each other.
When the position of the preset sub-pixel in the second pixel is the same as the position of the red sub-pixel in the first pixel, the second pixel includes one blue sub-pixel, two green sub-pixels, and one preset sub-pixel; that is, on the basis of the first pixel, the red sub-pixel is replaced with a preset sub-pixel.
When the position of the preset sub-pixel in the second pixel is the same as the position of the blue sub-pixel in the first pixel, the second pixel includes one red sub-pixel, two green sub-pixels, and one preset sub-pixel; that is, on the basis of the first pixel, the blue sub-pixel is replaced with a preset sub-pixel.
When the position of the preset sub-pixel in the second pixel is the same as the position of one of the green sub-pixels in the first pixel, the second pixel includes one red sub-pixel, one green sub-pixel, one blue sub-pixel, and one preset sub-pixel; that is, on the basis of the first pixel, one of the green sub-pixels is replaced with a preset sub-pixel.
When the position of the preset sub-pixel in the second pixel is the same as the position of the first combined sub-pixel in the first pixel, the second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and a preset sub-pixel; in this case, on the basis of the first pixel, the adjacent 1/2 red sub-pixel and 1/2 green sub-pixel of the 2PD sub-pixels are taken as the preset sub-pixel, that is, the position of the preset sub-pixel in the second pixel is the same as the positions of the adjacent 1/2 green sub-pixel and 1/2 red sub-pixel in the first pixel.
When the position of the preset sub-pixel in the second pixel is the same as the position of the second combined sub-pixel in the first pixel, the second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and a preset sub-pixel; in this case, on the basis of the first pixel, the adjacent 1/2 blue sub-pixel and 1/2 green sub-pixel of the 2PD sub-pixels are taken as the preset sub-pixel, that is, the position of the preset sub-pixel in the second pixel is the same as the positions of the adjacent 1/2 green sub-pixel and 1/2 blue sub-pixel in the first pixel.
在上述实施例的基础上,像素单元中包括一个第二像素以及至少一个第一像素。
像素单元中包含一个第二像素以及至少一个第一像素,其中像素单元中像素的数量至少为两个。当像素单元中像素的数量为两个时,包括一个第一像素以及一个第二像素。例如,如图5a所示,像素单元中包括一个第一像素以及一个第二像素,其中第二像素中包括红色子像素、绿色子像素以及预设子像素,则预设子像素在像素单元中的占比为1/8。
当像素单元中像素的数量为三个时,包括两个第一像素以及一个第二像素。例如,如图5b所示,像素单元中包括两个第一像素以及一个第二像素,其中第二像素中可以包括蓝色子像素、绿色子像素以及预设子像素,则预设子像素在像素单元中的占比为1/12。
当像素单元中像素的数量为四个时,包括三个第一像素以及一个第二像素。例如,如图3a所示,像素单元中包括三个第一像素以及一个第二像素,其中第二像素中包括蓝色子像素、绿色子像素、红色子像素以及预设子像素,可以在第一像素的基础上取2PD子像素的1/2红色子像素和1/2绿色子像素为预设子像素,预设子像素在像素单元中的占比为1/16。
像素阵列可以是由1/8密度的RGB+D像素单元、1/12密度的RGB+D像素单元或1/16密度的RGB+D像素单元作为一个像素单位阵列,像素单位阵列再进行周期性阵列排布组成。当然像素阵列还可以是其他形式,这里不再列举。
上述对应的几种预设子像素的取点方式仅用于举例说明，还可以是其他的取点方式，本公开实施例中其他的取点方式这里不再一一介绍。预设子像素在像素单元中的取点位置（第二像素的位置）本公开实施例中不做限制。其中预设子像素在像素单元中的占比为1/(4n)，且n为大于或者等于2的整数，且预设子像素所适用的像素阵列大小不限。
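预设子像素占比为1/(4n)这一关系可以用下面几行Python直接验证：像素单元包含n个像素、4n个子像素，其中完整的预设子像素为一个。函数名preset_density为示意性假设。

    def preset_density(n_pixels: int) -> float:
        """像素单元含n_pixels个像素、4*n_pixels个子像素，其中完整的预设子像素为一个，占比即1/(4n)。"""
        if n_pixels < 2:
            raise ValueError("像素单元中至少包含两个像素（n >= 2）")
        return 1 / (4 * n_pixels)

    # 对应图5a、图5b、图3a：n=2、3、4时占比分别为1/8、1/12、1/16
    assert preset_density(2) == 1 / 8
    assert preset_density(3) == 1 / 12
    assert preset_density(4) == 1 / 16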
如图4a和图4b所示,1/2预设子像素在所述第二像素中的位置与1/2红色子像素、1/2绿色子像素或1/2蓝色子像素在第一像素中的位置相同,相邻两个第二像素中的1/2预设子像素组成预设子像素。
在第二像素中可以仅包含1/2预设子像素,通过两个相邻的第二像素进行组合可以得到一个完整的预设子像素。当第二像素中包含1/2预设子像素时,1/2预设子像素在第二像素中的位置可以与1/2红色子像素在第一像素中的位置相同,还可以与1/2绿色子像素在第一像素中的位置相同,也可以与1/2蓝色子像素在第一像素中的位置相同。
当1/2预设子像素在一第二像素中的位置与1/2红色子像素在第一像素中的位置相同时，则在另一第二像素中1/2预设子像素的位置与1/2绿色子像素在第一像素中的位置相同。当1/2预设子像素在一第二像素中的位置与1/2绿色子像素在第一像素中的位置相同时，则在另一第二像素中1/2预设子像素的位置与1/2蓝色子像素或1/2红色子像素在第一像素中的位置相同。
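相邻两个第二像素各贡献1/2预设子像素、组合成一个完整预设子像素的关系，可用如下Python草图核验：用字典记录各类子像素在一个像素中的份额，两个第二像素中D的份额之和为1。其中的具体份额取值与命名均为说明用途的假设。

    # 一个第二像素中1/2预设子像素位于1/2红色子像素处，另一个第二像素中位于1/2绿色子像素处
    second_pixel_a = {"R": 0.5, "G": 2.0, "B": 1.0, "D": 0.5}
    second_pixel_b = {"R": 1.0, "G": 1.5, "B": 1.0, "D": 0.5}

    def total_preset_share(pixels) -> float:
        """统计若干相邻像素中预设子像素D的总份额。"""
        return sum(p.get("D", 0.0) for p in pixels)

    # 两个1/2预设子像素组合成一个完整的预设子像素
    assert total_preset_share([second_pixel_a, second_pixel_b]) == 1.0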
在上述实施例的基础上,在像素单元中,第二像素的数量为两个,第一像素的数量大于或者等于零。
像素单元中像素的数量至少为两个,则像素单元中包括两个第二像素以及数量大于或者等于零的第一像素。当像素单元中像素的数量为两个时,包括两个第二像素。例如,如图6a所示,像素单元中包括两个第二像素,其中在一第二像素中包括一个红色子像素、两个绿色子像素、1/2蓝色子像素以及1/2预设子像素,此时1/2预设子像素在第二像素中的位置与1/2蓝色子像素在第一像素中的位置相同,在另一第二像素中包括一个红色子像素、一个绿色子像素、一个蓝色子像素、1/2绿色子像素以及1/2预设子像素,1/2预设子像素的位置与1/2绿色子像素在第一像素中的位置相同。预设子像素在像素单元中的占比为1/8。
当像素单元中像素的数量为三个时,包括两个第二像素以及一个第一像素。如图6b所示,像素单元中包括两个第二像素以及一个第一像素时,其中一第二像素中包括一个红色子像素、一个绿色子像素、一个蓝色子像素、1/2绿色子像素以及1/2预设子像素,此时1/2预设子像素在第二像素中的位置可以与1/2绿色子像素在第一像素中的位置相同,在另一第二像素中包括两个绿色子像素、一个蓝色子像素、1/2红色子像素以及1/2预设子像素,此时1/2预设子像素的位置则与1/2红色子像素在第一像素中的位置相同。预设子像素在像素单元中的占比为1/12。
当像素单元中像素的数量为四个时,包括两个第二像素以及两个第一像素。例如,如图4b所示,像素单元中包括两个第二像素以及两个第一像素,其中一第二像素中包括一个红色子像素、一个绿色子像素、一个蓝色子像素、1/2绿色子像素以及1/2预设子像素,此时1/2预设子像素在一第二像素中的位置与1/2绿色子像素在第一像素中的位置相同,在另一第二像素中包括一个蓝色子像素、两个绿色子像素、1/2红色子像素以及1/2预设子像素,此时1/2预设子像素的位置与1/2红色子像素在第一像素中的位置相同。预设子像素在像素单元中的占比为1/16。
上述仅仅为几种对应的实施方式，还可以在此基础上进行变形，这里不再一一阐述。其中预设子像素在像素单元中的取点密度为1/(4n)，且n为大于或者等于2的整数，且预设子像素所适用的像素阵列大小不限。
在本公开实施例中,预设子像素用于接收蓝色光波段以及红外光波段;预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第一彩色滤光片以及微镜,第一彩色滤光片由蓝色滤光片和红外滤光片组成。
预设子像素所包含的半导体层、金属层、光电二极管、第一彩色滤光片以及微镜,由下至上依次排列,这里的半导体层可以为硅基板,但并不局限于此。第一彩色滤光片为滤光单元阵列,包含蓝色滤光片和红外滤光片。此时预设子像素可接收蓝色光波段以及红外光波段。
在本公开实施例中,预设子像素用于接收绿色光波段以及红外光波段;预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第二彩色滤光片以及微镜,第二彩色滤光片由绿色滤光片和红外滤光片组成。
第二彩色滤光片为滤光单元阵列,包含绿色滤光片和红外滤光片。此时预设子像素可接收绿色光波段以及红外光波段。
在本公开实施例中,预设子像素用于接收红色光波段以及红外光波段;预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第三彩色滤光片以及微镜,第三彩色滤光片由红色滤光片和红外滤光片组成。
第三彩色滤光片为滤光单元阵列,包含红色滤光片和红外滤光片。此时预设子像素可接收红色光波段以及红外光波段。
在本公开实施例中,预设子像素用于接收红色光波段、绿色光波段、蓝色光波段以及红外光波段;预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第四彩色滤光片以及微镜,第四彩色滤光片由红色滤光片、绿色滤光片、蓝色滤光片以及红外滤光片组成。
第四彩色滤光片为滤光单元阵列,包含红色滤光片、绿色滤光片、蓝色滤光片和红外滤光片。此时预设子像素可接收红色光波段、绿色光波段、蓝色光波段以及红外光波段。
其中，由于在同样光照情况下，单通道可通过的波长带宽越宽，图像传感器可获取的进光量就越大，因此可用于提升暗态成像效果。
在本公开实施例中,红色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、红色滤光片以及微镜;绿色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、绿色滤光片以及微镜;蓝色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、蓝色滤光片以及微镜。
红色子像素所包含的半导体层、金属层、光电二极管、红色滤光片以及微镜，由下至上依次排列。相应的，绿色子像素所包含的半导体层、金属层、光电二极管、绿色滤光片以及微镜，由下至上依次排列。蓝色子像素所包含的半导体层、金属层、光电二极管、蓝色滤光片以及微镜，由下至上依次排列。这里的半导体层可以为硅基板，但并不局限于此。其中红色子像素、预设子像素的结构可参见图7，图7中的D滤光片可以为第一、第二、第三以及第四彩色滤光片，将红色滤光片替换为绿色滤光片或者蓝色滤光片，即可获取绿色子像素或者蓝色子像素的结构。
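为直观表示上述各子像素由下至上的层叠结构，下面给出一个简单的Python数据结构草图：各层名称取自上文描述，通过不同的滤光片组合即可区分红、绿、蓝子像素与各类预设子像素。SubPixel等命名为示意性假设。

    from dataclasses import dataclass
    from typing import List

    BASE_LAYERS = ["半导体层（如硅基板）", "金属层", "光电二极管"]  # 由下至上的公共层
    TOP_LAYER = "微镜"

    @dataclass
    class SubPixel:
        name: str
        filters: List[str]  # 滤光片组合，决定该子像素接收的光波段

        def stack(self) -> List[str]:
            """返回该子像素由下至上的层叠结构，对应图7的示意。"""
            return BASE_LAYERS + ["+".join(self.filters) + "滤光片", TOP_LAYER]

    red = SubPixel("R", ["红色"])                    # 红色子像素
    preset_blue_ir = SubPixel("D", ["蓝色", "红外"])  # 接收蓝光与红外的预设子像素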
红色、绿色以及蓝色子像素用于获取合成图像的像素的色彩信息，并阻挡红外线进入，例如仅使波长在380~700nm的可见光进入，可在高照度下直接生成色彩完整逼真的图像。红外光波长为750~1100nm，预设子像素可接收红外波段，进而可提升暗态成像效果，实现红外测距功能。
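上述波段划分可以用一个简单的判断函数来示意：红、绿、蓝子像素只响应可见光中对应的颜色波段，预设子像素则同时响应红外波段。其中可见光380~700nm、红外750~1100nm取自上文，各颜色通道的具体波长范围则为便于说明所取的假设值。

    def responds(sub_pixel: str, wavelength_nm: float) -> bool:
        """粗略判断某类子像素是否响应给定波长的入射光（D以接收蓝光与红外的预设子像素为例）。"""
        bands = {
            "R": [(600.0, 700.0)],   # 假设的红色通道范围
            "G": [(500.0, 600.0)],   # 假设的绿色通道范围
            "B": [(380.0, 500.0)],   # 假设的蓝色通道范围
            "D": [(380.0, 500.0), (750.0, 1100.0)],  # 蓝色波段 + 红外波段
        }
        return any(lo <= wavelength_nm <= hi for lo, hi in bands.get(sub_pixel, []))

    assert responds("D", 850.0) and not responds("R", 850.0)  # 仅预设子像素响应红外光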
在本公开实施例中,图像传感器为互补金属氧化物半导体CMOS图像传感器、电荷耦合元件CCD图像传感器或量子薄膜图像传感器。
本公开的像素阵列排布方式,适用的图像传感器类型不限,可以是基于CMOS的图像传感器,可以是基于电荷耦合元件(Charge-coupled Device,CCD)的图像传感器,也可以是基于量子薄膜的图像传感器,当然还可以是其他类型的图像传感器。且本公开实施例的图像传感器可适用于任何包含摄像头模组的电子产品中。
这样,通过在2PD图像传感器的RGB像素阵列排布方式上进行改进,将RGB像素阵列排布方式改进为RGB与预设像素点组合的像素阵列排布方式,可以通过2PD的形式实现检测距离保证快速对焦,通过设置用于接收不同光波段的预设像素点,可增加进光量,提升光电转换效率,保证暗态拍照效果,满足用户的使用需求。
本公开实施例还提供一种移动终端，如图8和图9所示，移动终端1包括成像系统2，成像系统2包括上述的图像传感器21，移动终端1还包括：透镜模组22；用于驱动透镜模组22移动的驱动模块23；设置于透镜模组22与图像传感器21之间的滤波模块24；与图像传感器21连接的图像数据处理模块25；以及与图像数据处理模块25连接的显示模块26。
本公开实施例的移动终端1包括成像系统2，其中成像系统2包括上述的图像传感器21，成像系统2还包括用于对光线聚焦的透镜模组22，透镜模组22与驱动模块23连接，驱动模块23用于根据待拍摄对象的远近，调整透镜模组22的位置。
在透镜模组22与图像传感器21之间设置有滤波模块24,其中在光线通过透镜模组22聚焦,经过滤波模块24后,可以聚焦在图像传感器21的像素阵列上。图像传感器21与图像数据处理模块25连接,图像数据处理模块25与显示模块26连接。在光线聚焦在图像传感器21的像素阵列上之后,图像传感器21进行光电转换后,将数据传输给图像数据处理模块25,图像数据处理模块25对数据进行处理后在显示模块26上以图片的形式呈现。
其中在驱动模块23调整透镜模组22的位置之后,可以利用图像传感器21中的2PD像素获取相位差,从而获取物体与成像面距离,进而实现快速对焦。
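在估计出2PD相位差之后，驱动模块调整透镜位置的闭环过程可用如下Python草图示意。其中move_lens回调、换算系数k_gain以及以相位差为零作为合焦判据均为说明用途的假设，实际换算关系取决于镜头与传感器参数。

    def autofocus_step(phase_shift: int, move_lens, k_gain: float = 2.5) -> bool:
        """一次对焦迭代：phase_shift为2PD相位差（可由前文草图估计得到），
        move_lens为驱动模块的回调（例如驱动音圈马达），返回是否已合焦。"""
        if phase_shift == 0:
            return True
        move_lens(k_gain * phase_shift)  # 将相位差换算为透镜位移并执行
        return False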
本公开实施例中的滤波模块24可通过380nm至1100nm的光波长。此时在光线通过透镜模组22聚焦后，可通过滤波模块24进行滤波，其中滤波模块24允许自然光和红外光通过，从而保证成像系统2的成像效果。
这样,通过在2PD图像传感器的RGB像素阵列排布方式上进行改进,将RGB像素阵列排布方式改进为RGB与预设像素点组合的像素阵列排布方式,可以通过2PD的形式实现检测距离保证快速对焦,通过设置用于接收不同光波段的预设像素点,可增加进光量,提升光电转换效率,保证暗态拍照效果,满足用户的使用需求。
图10为实现本公开各个实施例的一种移动终端的硬件结构示意图,该移动终端1000包括但不限于:射频单元1001、网络模块1002、音频输出单元1003、输入单元1004、传感器1005、显示单元1006、用户输入单元1007、接口单元1008、存储器1009、处理器1010以及电源1011等部件。
移动终端1000还包括成像系统，成像系统包括：图像传感器；透镜模组；用于驱动透镜模组移动的驱动模块；设置于透镜模组与图像传感器之间的滤波模块；与图像传感器连接的图像数据处理模块；以及与图像数据处理模块连接的显示模块。
其中,滤波模块可通过380nm至1100nm的光波长。
其中,图像传感器包括:像素阵列,像素阵列包括按照预定方式排布的预设数目个像素单元,像素单元包括第一像素和与第一像素位置相邻的第二像素;第一像素包括红色子像素、绿色子像素和蓝色子像素,第二像素包括红色子像素和蓝色子像素中的至少一种子像素以及绿色子像素和预设子像素;
第一像素和第二像素均为全像素双核对焦像素,且第一像素和第二像素中各包括四个全像素双核对焦子像素;
预设子像素接收红外光波段以及红色光波段、绿色光波段和蓝色光波段中的其中一种,或者预设子像素接收红外光波段、红色光波段、绿色光波段和蓝色光波段。
其中,预设子像素在第二像素中的位置,与红色子像素、绿色子像素或蓝色子像素在第一像素中的位置相同;或者
预设子像素在第二像素中的位置,与第一组合子像素在第一像素中的位置相同,或者与第二组合子像素在第一像素中的位置相同;
其中,第一组合子像素是位置相邻的1/2红色子像素和1/2绿色子像素的组合;第二组合子像素是位置相邻的1/2绿色子像素和1/2蓝色子像素的组合。
其中,像素单元中包括一个第二像素以及至少一个第一像素。
其中,1/2预设子像素在第二像素中的位置与1/2红色子像素、1/2绿色子像素或1/2蓝色子像素在第一像素中的位置相同,相邻两个第二像素中的1/2预设子像素组成预设子像素。
其中,在像素单元中,第二像素的数量为两个,第一像素的数量大于或者等于零。
其中,预设子像素用于接收蓝色光波段以及红外光波段;
预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第一彩色滤光片以及微镜,第一彩色滤光片由蓝色滤光片和红外滤光片组成。
其中,预设子像素用于接收绿色光波段以及红外光波段;
预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第二彩色滤光片以及微镜,第二彩色滤光片由绿色滤光片和红外滤光片组成。
其中,预设子像素用于接收红色光波段以及红外光波段;
预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第三彩色滤光片以及微镜,第三彩色滤光片由红色滤光片和红外滤光片组成。
其中,预设子像素用于接收红色光波段、绿色光波段、蓝色光波段以及红外光波段;
预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第四彩色滤光片以及微镜,第四彩色滤光片由红色滤光片、绿色滤光片、蓝色滤光片以及红外滤光片组成。
其中,红色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、红色滤光片以及微镜;绿色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、绿色滤光片以及微镜;蓝色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、蓝色滤光片以及微镜。
其中,图像传感器为互补金属氧化物半导体CMOS图像传感器、电荷耦合元件CCD图像传感器或量子薄膜图像传感器。
上述移动终端,通过在2PD图像传感器的RGB像素阵列排布方式上进行改进,将RGB像素阵列排布方式改进为RGB与预设像素点组合的像素阵列排布方式,可以通过2PD的形式实现检测距离保证快速对焦,通过设置用于接收不同光波段的预设像素点,可增加进光量,提升光电转换效率,保证暗态拍照效果,满足用户的使用需求。
本领域技术人员可以理解,图10中示出的移动终端结构并不构成对移动终端的限定,移动终端可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。在本公开实施例中,移动终端包括但不限于手机、平板电脑、笔记本电脑、掌上电脑、车载终端、可穿戴设备、以及计步器等。
应理解的是，本公开实施例中，射频单元1001可用于收发信息或通话过程中信号的接收和发送，具体的，将来自基站的下行数据接收后，给处理器1010处理；另外，将上行的数据发送给基站。通常，射频单元1001包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外，射频单元1001还可以通过无线通信系统与网络和其他设备通信。
移动终端通过网络模块1002为用户提供了无线的宽带互联网访问,如帮助用户收发电子邮件、浏览网页和访问流式媒体等。
音频输出单元1003可以将射频单元1001或网络模块1002接收的或者在存储器1009中存储的音频数据转换成音频信号并且输出为声音。而且,音频输出单元1003还可以提供与移动终端1000执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出单元1003包括扬声器、蜂鸣器以及受话器等。
输入单元1004用于接收音频或视频信号。输入单元1004可以包括图形处理器(Graphics Processing Unit,GPU)10041和麦克风10042,图形处理器10041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元1006上,这里的显示单元即为上述的显示模块。经图形处理器10041处理后的图像帧可以存储在存储器1009(或其它存储介质)中或者经由射频单元1001或网络模块1002进行发送。其中图形处理器10041即为上述的图像数据处理模块。麦克风10042可以接收声音,并且能够将这样的声音处理为音频数据。处理后的音频数据可以在电话通话模式的情况下转换为可经由射频单元1001发送到移动通信基站的格式输出。
移动终端1000还包括至少一种传感器1005,比如光传感器、运动传感器以及其他传感器。具体地,光传感器包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板10061的亮度,接近传感器可在移动终端1000移动到耳边时,关闭显示面板10061和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别移动终端姿态(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;传感器1005还可以包括指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等,在此不再赘述。
显示单元1006用于显示由用户输入的信息或提供给用户的信息。显示单元1006可包括显示面板10061,可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板10061。
用户输入单元1007可用于接收输入的数字或字符信息,以及产生与移动终端的用户设置以及功能控制有关的键信号输入。具体地,用户输入单元1007包括触控面板10071以及其他输入设备10072。触控面板10071,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板10071上或在触控面板10071附近的操作)。触控面板10071可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器1010,接收处理器1010发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板10071。除了触控面板10071,用户输入单元1007还可以包括其他输入设备10072。具体地,其他输入设备10072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
进一步的,触控面板10071可覆盖在显示面板10061上,当触控面板10071检测到在其上或附近的触摸操作后,传送给处理器1010以确定触摸事件的类型,随后处理器1010根据触摸事件的类型在显示面板10061上提供相应的视觉输出。虽然在图10中,触控面板10071与显示面板10061是作为两个独立的部件来实现移动终端的输入和输出功能,但是在某些实施例中,可以将触控面板10071与显示面板10061集成而实现移动终端的输入和输出功能,具体此处不做限定。
接口单元1008为外部装置与移动终端1000连接的接口。例如，外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。接口单元1008可以用于接收来自外部装置的输入(例如，数据信息、电力等等)并且将接收到的输入传输到移动终端1000内的一个或多个元件或者可以用于在移动终端1000和外部装置之间传输数据。
存储器1009可用于存储软件程序以及各种数据。存储器1009可主要包括存储程序区和存储数据区，其中，存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等；存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外，存储器1009可以包括高速随机存取存储器，还可以包括非易失性存储器，例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。
处理器1010是移动终端的控制中心,利用各种接口和线路连接整个移动终端的各个部分,通过运行或执行存储在存储器1009内的软件程序和/或模块,以及调用存储在存储器1009内的数据,执行移动终端的各种功能和处理数据,从而对移动终端进行整体监控。处理器1010可包括一个或多个处理单元;可选的,处理器1010可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器1010中。
移动终端1000还可以包括给各个部件供电的电源1011(比如电池),可选的,电源1011可以通过电源管理系统与处理器1010逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
另外,移动终端1000包括一些未示出的功能模块,在此不再赘述。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。
通过以上的实施方式的描述，本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现，当然也可以通过硬件，但很多情况下前者是更佳的实施方式。基于这样的理解，本公开的技术方案本质上或者说对相关技术做出贡献的部分可以以软件产品的形式体现出来，该计算机软件产品存储在一个存储介质(如只读存储器(Read-Only Memory,ROM)/随机存取存储器(Random Access Memory,RAM)、磁碟、光盘)中，包括若干指令用以使得一台终端(可以是手机，计算机，服务器，空调器，或者网络设备等)执行本公开各个实施例所述的方法。
上面结合附图对本公开的实施例进行了描述,但是本公开并不局限于上述的具体实施方式,上述的具体实施方式仅仅是示意性的,而不是限制性的,本领域的普通技术人员在本公开的启示下,在不脱离本公开宗旨和权利要求所保护的范围情况下,还可做出很多形式,均属于本公开的保护之内。

Claims (13)

  1. 一种图像传感器,包括:
    像素阵列,所述像素阵列包括按照预定方式排布的预设数目个像素单元,所述像素单元包括第一像素和与所述第一像素位置相邻的第二像素;所述第一像素包括红色子像素、绿色子像素和蓝色子像素,所述第二像素包括红色子像素和蓝色子像素中的至少一种子像素以及绿色子像素和预设子像素;
    所述第一像素和所述第二像素均为全像素双核对焦像素,且所述第一像素和所述第二像素中各包括四个全像素双核对焦子像素;
    所述预设子像素接收红外光波段以及红色光波段、绿色光波段和蓝色光波段中的其中一种,或者所述预设子像素接收红外光波段、红色光波段、绿色光波段和蓝色光波段。
  2. 根据权利要求1所述的图像传感器,其中,
    所述预设子像素在所述第二像素中的位置,与所述红色子像素、所述绿色子像素或所述蓝色子像素在所述第一像素中的位置相同;或者
    所述预设子像素在所述第二像素中的位置,与第一组合子像素在所述第一像素中的位置相同,或者与第二组合子像素在所述第一像素中的位置相同;
    其中,所述第一组合子像素是位置相邻的1/2红色子像素和1/2绿色子像素的组合;所述第二组合子像素是位置相邻的1/2绿色子像素和1/2蓝色子像素的组合。
  3. 根据权利要求2所述的图像传感器,其中,所述像素单元中包括一个所述第二像素以及至少一个所述第一像素。
  4. 根据权利要求1所述的图像传感器,其中,
    1/2预设子像素在所述第二像素中的位置与1/2红色子像素、1/2绿色子像素或1/2蓝色子像素在所述第一像素中的位置相同,相邻两个所述第二像素中的1/2预设子像素组成所述预设子像素。
  5. 根据权利要求4所述的图像传感器,其中,在所述像素单元中,所述第二像素的数量为两个,所述第一像素的数量大于或者等于零。
  6. 根据权利要求1所述的图像传感器，其中，所述预设子像素用于接收蓝色光波段以及红外光波段；
    所述预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第一彩色滤光片以及微镜,所述第一彩色滤光片由蓝色滤光片和红外滤光片组成。
  7. 根据权利要求1所述的图像传感器,其中,所述预设子像素用于接收绿色光波段以及红外光波段;
    所述预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第二彩色滤光片以及微镜,所述第二彩色滤光片由绿色滤光片和红外滤光片组成。
  8. 根据权利要求1所述的图像传感器,其中,所述预设子像素用于接收红色光波段以及红外光波段;
    所述预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第三彩色滤光片以及微镜,所述第三彩色滤光片由红色滤光片和红外滤光片组成。
  9. 根据权利要求1所述的图像传感器,其中,所述预设子像素用于接收红色光波段、绿色光波段、蓝色光波段以及红外光波段;
    所述预设子像素包括:依次堆叠设置的半导体层、金属层、光电二极管、第四彩色滤光片以及微镜,所述第四彩色滤光片由红色滤光片、绿色滤光片、蓝色滤光片以及红外滤光片组成。
  10. 根据权利要求1所述的图像传感器,其中,所述红色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、红色滤光片以及微镜;
    所述绿色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、绿色滤光片以及微镜;
    所述蓝色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、蓝色滤光片以及微镜。
  11. 根据权利要求1所述的图像传感器,其中,所述图像传感器为互补金属氧化物半导体CMOS图像传感器、电荷耦合元件CCD图像传感器或量子薄膜图像传感器。
  12. 一种移动终端,包括成像系统,其中,所述成像系统包括:
    如权利要求1至11任一项所述的图像传感器;
    透镜模组;
    用于驱动所述透镜模组移动的驱动模块;
    设置于所述透镜模组与所述图像传感器之间的滤波模块;
    与所述图像传感器连接的图像数据处理模块;以及
    与所述图像数据处理模块连接的显示模块。
  13. 根据权利要求12所述的移动终端,其中,所述滤波模块可通过380nm至1100nm的光波长。