WO2020015532A1 - Image sensor, mobile terminal and shooting method - Google Patents

Image sensor, mobile terminal and shooting method

Info

Publication number
WO2020015532A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel, sub-pixel, mobile terminal, infrared, image sensor
Prior art date
Application number
PCT/CN2019/094545
Other languages
English (en)
French (fr)
Inventor
WANG Danmei (王丹妹)
ZHOU Huazhao (周华昭)
ZHU Panpan (朱盼盼)
Original Assignee
Vivo Mobile Communication Co., Ltd. (维沃移动通信有限公司)
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co., Ltd. (维沃移动通信有限公司)
Priority to EP19838328.3A (published as EP 3826285 A4)
Publication of WO2020015532A1
Priority to US17/152,266 (published as US 11463641 B2)

Classifications

    • H04N13/254 Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H04N13/218 Image signal generators using a single 2D image sensor using spatial multiplexing
    • H04N13/257 Image signal generators, colour aspects
    • H04N13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H04N23/10 Cameras or camera modules for generating image signals from different wavelengths
    • H04N23/11 Cameras or camera modules for generating image signals from visible and infrared light wavelengths
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/672 Focus control based on the phase difference signals
    • H04N25/131 Arrangement of colour filter arrays including elements passing infrared wavelengths
    • H04N25/70 SSIS architectures; circuits associated therewith
    • H04N25/704 Pixels specially adapted for focusing, e.g. phase difference pixel sets
    • H04N25/705 Pixels for depth measurement, e.g. RGBZ
    • H04N25/71 Charge-coupled device [CCD] sensors; charge-transfer registers specially adapted for CCD sensors
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/79 Arrangements of circuitry divided between different or multiple substrates, chips or circuit boards, e.g. stacked image sensors
    • H01L27/14621 Colour filter arrangements

Definitions

  • the present disclosure relates to the field of communication technologies, and in particular, to an image sensor, a mobile terminal, and a shooting method.
  • In a conventional CMOS (Complementary Metal Oxide Semiconductor) image sensor, the pixel array uses a red (R), green (G), and blue (B) arrangement.
  • This arrangement cannot detect the distance of objects; it can only receive natural light and take pictures to record images under normal lighting.
  • The pixel array arrangement of full-pixel dual-core focus (Dual PhotoDiode, 2PD) technology is shown in Figure 1b and Figure 1c.
  • This arrangement can likewise only receive natural light for taking pictures and recording images.
  • The phase detection auto focus (Phase Detection Auto Focus, PDAF) technical solution can detect the distance of objects and complete the focusing action more quickly.
  • In summary, the pixel array arrangement of conventional CMOS image sensors cannot detect the distance of objects and can only receive natural light. Although the 2PD pixel array arrangement can detect the distance of objects, it too can only receive natural light. The image sensor pixel array arrangements in the related art therefore suffer from limited shooting scenes and slow focusing, which affects the user's shooting experience.
  • Embodiments of the present disclosure provide an image sensor, a mobile terminal, and a shooting method to solve these problems in the related art.
  • an image sensor including:
  • a pixel array including a preset number of pixel units arranged in a predetermined manner, each pixel unit including a first pixel and a second pixel adjacent to the first pixel, the first pixel including a red sub-pixel, a green sub-pixel, and a blue sub-pixel, the second pixel including a red sub-pixel, a green sub-pixel, and an infrared sub-pixel, the first pixel and the second pixel both being full-pixel dual-core focus pixels;
  • wherein the position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels.
  • an embodiment of the present disclosure further provides a mobile terminal including an imaging system and an infrared transmitting module.
  • the imaging system includes the foregoing image sensor, and further includes:
  • a lens module;
  • a driving module configured to drive the lens module to move;
  • a filtering module provided between the lens module and the image sensor;
  • an image data processing module connected to the image sensor; and
  • a display module connected to the image data processing module.
  • an embodiment of the present disclosure further provides a shooting method, which is applied to a mobile terminal.
  • the mobile terminal includes the image sensor described above, and further includes an infrared transmitting module.
  • the method includes:
  • emitting infrared light through the infrared transmitting module; and
  • acquiring stereoscopic information of the object to be photographed.
  • The technical solution of the present disclosure improves the pixel array arrangement of the image sensor from an RGB arrangement to a 2PD RGB-IR arrangement, so that images can be captured and recorded while receiving infrared light.
  • The sensor can detect the distance between the object to be photographed and the mobile terminal, realizing fast focusing and background blurring, improving the imaging effect, and enabling related application functions such as stereo photography and dark-state imaging. While ensuring the user's shooting experience, it enhances the functionality of the mobile terminal.
  • FIG. 1a is a schematic diagram of a conventional RGB arrangement in the related art
  • FIG. 1b shows a pixel array layout of 2PD
  • Figure 1c shows a cross-sectional view of a 2PD pixel
  • FIG. 2a shows one of the pixel array layout diagrams according to the embodiment of the present disclosure
  • FIG. 2b is a cross-sectional view of a pixel according to an embodiment of the present disclosure
  • FIG. 3a shows a first schematic diagram of a pixel unit according to an embodiment of the present disclosure
  • FIG. 3b shows a second schematic diagram of a pixel unit according to an embodiment of the present disclosure
  • FIG. 3c shows a third schematic diagram of a pixel unit according to an embodiment of the present disclosure
  • FIG. 3d shows the second layout diagram of the pixel array according to the embodiment of the present disclosure
  • FIG. 4a shows a fourth schematic diagram of a pixel unit according to an embodiment of the present disclosure
  • FIG. 4b shows a fifth schematic diagram of a pixel unit according to an embodiment of the present disclosure
  • FIG. 4c shows a sixth schematic diagram of a pixel unit according to an embodiment of the present disclosure
  • FIG. 4d shows a seventh schematic diagram of a pixel unit according to an embodiment of the present disclosure
  • FIG. 4e shows the third layout diagram of the pixel array according to the embodiment of the present disclosure
  • FIG. 5a shows a schematic diagram of a mobile terminal according to an embodiment of the present disclosure
  • FIG. 5b shows a schematic diagram of an imaging system according to an embodiment of the present disclosure
  • FIG. 6 is a schematic diagram of a photographing method according to an embodiment of the present disclosure.
  • FIG. 7 shows a hardware structure of a mobile terminal according to an embodiment of the present disclosure.
  • An embodiment of the present disclosure provides an image sensor. As shown in FIGS. 2a, 3a to 3d, and 4a to 4e, the image sensor includes:
  • a pixel array including a preset number of pixel units arranged in a predetermined manner, each pixel unit including a first pixel and a second pixel adjacent to the first pixel, the first pixel including a red sub-pixel, a green sub-pixel, and a blue sub-pixel, the second pixel including a red sub-pixel, a green sub-pixel, and an infrared sub-pixel, the first pixel and the second pixel both being full-pixel dual-core focus pixels; wherein the position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel, and each of the first pixel and the second pixel includes four full-pixel dual-core focusing sub-pixels.
  • the pixel array included in the image sensor provided in the embodiment of the present disclosure includes a preset number of pixel units, where the preset number of pixel units are arranged in a predetermined manner.
  • Each of the preset number of pixel units includes a first pixel and a second pixel adjacent to the first pixel, and the first pixel differs from the second pixel in its sub-pixels.
  • The first pixel includes a red sub-pixel (R), a green sub-pixel (G), and a blue sub-pixel (B); the second pixel includes a red sub-pixel, a green sub-pixel, and an infrared sub-pixel (Infrared Radiation, IR). By setting an infrared sub-pixel in the second pixel, images can be captured while receiving infrared light, realizing dark-state imaging and ensuring the user's shooting experience.
  • R red sub-pixel
  • G Green sub-pixel
  • B blue sub-pixel
  • IR infrared sub-pixel
  • the first pixel and the second pixel in the embodiment of the present disclosure are all-pixel dual-core focus (2PD) pixels.
  • By using 2PD pixels, the object distance can be detected, and the focusing operation can be completed more quickly.
  • the first pixel and the second pixel each include four sub-pixels, and the four sub-pixels are all 2PD sub-pixels.
  • the position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel.
  • The following describes the specific forms of the first pixel and the second pixel.
  • The red, green, and blue sub-pixels in the first pixel are arranged in a certain manner: the first pixel includes one red sub-pixel, one blue sub-pixel, and two green sub-pixels. For convenience, the two green sub-pixels are referred to as a first green sub-pixel and a second green sub-pixel, respectively, where the first green sub-pixel is identical to the second green sub-pixel.
  • the red sub-pixel is adjacent to the first green sub-pixel
  • the second green sub-pixel is located below the red sub-pixel
  • the blue sub-pixel is located below the first green sub-pixel
  • the second green sub-pixel is adjacent to the blue sub-pixel.
  • the red, green, and infrared sub-pixels in the second pixel are arranged in a certain way, and the second pixel includes a red sub-pixel, an infrared sub-pixel, and two green sub-pixels.
  • the two green sub-pixels are referred to as a first green sub-pixel and a second green sub-pixel, respectively, where the first green sub-pixel is the same as the second green sub-pixel.
  • the red sub-pixel is adjacent to the first green sub-pixel
  • the second green sub-pixel is located below the red sub-pixel
  • the infrared sub-pixel is located below the first green sub-pixel
  • the second green sub-pixel is adjacent to the infrared sub-pixel.
  • By improving the arrangement of the RGB pixel array and changing it to a red-green-green-blue-infrared (RGB-IR) pixel array arrangement, the image sensor of the embodiment of the present disclosure can detect the distance between the object to be photographed and the mobile terminal, realizing fast focusing and background blurring.
  • The imaging effect of the image can be improved, and related application functions such as stereo photography can be realized, enhancing the functionality of the mobile terminal while ensuring the user's shooting experience.
  • Each pixel unit includes one second pixel and at least one first pixel.
  • The specific forms of the first pixel and the second pixel included in the pixel unit are described below by way of example.
  • the density of the infrared sub-pixels in the pixel unit is 1/8.
  • the pixel array shown in FIG. 2a includes eight pixel units, and each pixel unit includes a first pixel and a second pixel. At this time, the second pixel includes two green sub-pixels, one red sub-pixel, and one infrared sub-pixel, that is, the infrared sub-pixel replaces the blue sub-pixel in the first pixel.
  • the position of the infrared sub-pixel (the position of the second pixel) in the pixel unit is not specifically limited.
  • the first pixel and the second pixel may be sequentially arranged, or the second pixel and the first pixel may be sequentially arranged.
  • a pixel unit is formed according to the first pixel and the second pixel, and a pixel array shown in FIG. 2a is formed in combination according to the formed pixel unit.
  • the arrangement of the first pixel and the second pixel in the pixel unit may be different, that is, the position of the second pixel may be different.
  • the pixel array shown in FIG. 2a is only used for illustration, and those skilled in the art can also appropriately deform.
  • In this case there are two types of pixel units, a first pixel unit and a second pixel unit. In the first pixel unit, the first pixel and the second pixel are arranged along a first direction; in the second pixel unit, the second pixel and the first pixel are arranged along the first direction.
  • Two identical first pixel units are arranged along the first direction, and two identical second pixel units are arranged along the first direction.
  • The first pixel unit, the second pixel unit, the first pixel unit, and the second pixel unit are sequentially arranged along a second direction to form the pixel array shown in FIG. 2a, where the first direction is perpendicular to the second direction.
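As a hypothetical sketch (not part of the patent text), the 1/8-density arrangement described above can be modelled numerically. The sub-pixel codes, the 2x2 sub-pixel layout per pixel, and the 8x8 array size are illustrative assumptions based on the description of FIG. 2a:

```python
import numpy as np

# Sub-pixel codes (illustrative): 0=R, 1=G, 2=B, 3=IR.
R, G, B, IR = 0, 1, 2, 3

# Each pixel is 2x2 sub-pixels; the second pixel replaces blue with infrared.
FIRST_PIXEL = np.array([[R, G], [G, B]])
SECOND_PIXEL = np.array([[R, G], [G, IR]])

# First pixel unit: first pixel then second pixel along the first direction.
unit_a = np.hstack([FIRST_PIXEL, SECOND_PIXEL])
# Second pixel unit: second pixel then first pixel along the first direction.
unit_b = np.hstack([SECOND_PIXEL, FIRST_PIXEL])

# Two identical units per row; rows of unit_a and unit_b alternate
# along the second direction, giving eight pixel units in total.
row_a = np.hstack([unit_a, unit_a])
row_b = np.hstack([unit_b, unit_b])
mosaic = np.vstack([row_a, row_b, row_a, row_b])

ir_density = np.mean(mosaic == IR)
print(mosaic.shape, ir_density)  # (8, 8) 0.125
```

The resulting mosaic contains one IR sub-pixel per eight sub-pixels, matching the stated 1/8 density.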
  • the density of the infrared sub-pixels in the pixel unit is 1/12.
  • the position of the infrared sub-pixel (the position of the second pixel) in the pixel unit is not specifically limited.
  • In FIG. 3a, two first pixels and a second pixel are arranged in sequence, with the second pixel on the right side of the pixel unit.
  • In FIG. 3b, a first pixel, the second pixel, and a first pixel are arranged in sequence, with the second pixel in the middle of the pixel unit.
  • In FIG. 3c, the second pixel and two first pixels are arranged in sequence, with the second pixel on the left side of the pixel unit.
  • a pixel array may be formed by combining the pixel units shown in FIGS. 3a, 3b, and 3c.
  • the pixel unit shown in FIG. 3a is a third pixel unit
  • the pixel unit shown in FIG. 3b is a fourth pixel unit
  • the pixel unit shown in FIG. 3c is a fifth pixel unit.
  • The pixel array formed in FIG. 3d includes eight pixel units: two identical third pixel units are disposed along the first direction, and two identical fifth pixel units are disposed along the first direction.
  • The third pixel units and the fifth pixel units are sequentially arranged along a second direction, where the first direction is perpendicular to the second direction.
  • The arrangement of the pixel array shown in FIG. 3d is only for illustration. When the pixel unit includes two first pixels and one second pixel, the present disclosure also covers various other arrangements of the pixel array, which are not enumerated one by one here.
  • the density of the infrared sub-pixels in the pixel unit is 1/16.
  • the position of the infrared sub-pixel (the position of the second pixel) in the pixel unit is not limited.
  • In FIG. 4a, two first pixels are sequentially arranged on the upper layer, and another first pixel and a second pixel are sequentially arranged on the lower layer; the position of the second pixel is the lower right of the pixel unit.
  • In FIG. 4b, a first pixel and the second pixel are sequentially arranged on the upper layer, and two first pixels are sequentially arranged on the lower layer; the position of the second pixel is the upper right of the pixel unit.
  • In FIG. 4c, two first pixels are sequentially arranged on the upper layer, and the second pixel and a first pixel are sequentially arranged on the lower layer; the position of the second pixel is the lower left of the pixel unit.
  • In FIG. 4d, the second pixel and a first pixel are sequentially arranged on the upper layer, and two first pixels are sequentially arranged on the lower layer; the position of the second pixel is the upper left of the pixel unit.
  • FIG. 4a shows a sixth pixel unit, FIG. 4b shows a seventh pixel unit, FIG. 4c shows an eighth pixel unit, and FIG. 4d shows a ninth pixel unit.
  • the pixel array formed in FIG. 4e includes 6 pixel units.
  • The sixth pixel unit, the seventh pixel unit, and the sixth pixel unit are disposed along the first direction, and the eighth pixel unit, the ninth pixel unit, and the eighth pixel unit are disposed along the first direction.
  • The eighth pixel units are located below the sixth pixel units, and the ninth pixel unit is located below the seventh pixel unit.
  • The arrangement of the pixel array shown in FIG. 4e is only for illustration. When the pixel unit includes three first pixels and one second pixel, the present disclosure also covers various other arrangements of the pixel array, which are not enumerated one by one here.
  • The pixel unit in the embodiment of the present disclosure is not limited to these forms, and the position of the infrared sub-pixel (the position of the second pixel) in the pixel unit is not limited here.
  • The density of the infrared sub-pixel in the pixel unit is 1/(4n), where n is an integer greater than or equal to 2.
  • The pixel array can take a 1/8-density RGB+IR pixel unit, a 1/12-density RGB+IR pixel unit, or a 1/16-density RGB+IR pixel unit as the basic unit, with the pixel array then composed of such units repeated periodically.
  • the pixel array can also be in other forms, which will not be enumerated here, and the size of the pixel array applicable to the infrared sub-pixel is not limited.
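The 1/(4n) density rule above follows from counting sub-pixels: a unit of n pixels (one second pixel holding the single IR sub-pixel, plus n-1 first pixels) contains 4n sub-pixels. A small illustrative helper, with the function name being an assumption of this sketch:

```python
def ir_density(num_first_pixels: int) -> float:
    """IR sub-pixel density of a pixel unit made of one second pixel
    (containing the single IR sub-pixel) plus `num_first_pixels` first
    pixels, each pixel contributing four 2PD sub-pixels."""
    total_sub_pixels = 4 * (num_first_pixels + 1)
    return 1.0 / total_sub_pixels

# 1, 2, or 3 first pixels per unit give the densities named in the text.
print(ir_density(1), ir_density(2), ir_density(3))  # 0.125 0.08333333333333333 0.0625
```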
  • The red sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a red filter, and a micromirror which are sequentially stacked; the green sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a green filter, and a micromirror which are sequentially stacked; the blue sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a blue filter, and a micromirror which are sequentially stacked; and the infrared sub-pixel includes a semiconductor layer, a metal layer, a photodiode, an infrared filter, and a micromirror which are sequentially stacked.
  • the semiconductor layer, metal layer, photodiode, red filter, and micromirror included in the red sub-pixel are arranged in order from bottom to top.
  • the semiconductor layer, metal layer, photodiode, green filter, and micromirror included in the corresponding green sub-pixel are arranged in order from bottom to top.
  • the semiconductor layer, the metal layer, the photodiode, the blue filter, and the micromirror included in the blue sub-pixel are arranged in order from bottom to top.
  • the semiconductor layer here may be a silicon substrate, but is not limited thereto.
  • the structure of the green sub-pixel and the infrared sub-pixel can be seen in FIG. 2b. The difference in the structure of each sub-pixel lies in the different filters. As shown in FIG. 2b, the green filter can be replaced with a red or blue filter to obtain the structure of the red sub-pixel and the blue sub-pixel.
  • The red, green, and blue filters are used to obtain the color information of the pixels for composing the image, and they block the entry of infrared rays; for example, only visible light with a wavelength of 380 to 700 nm is allowed to enter, so that color-accurate, realistic images can be generated directly under high illumination.
  • the semiconductor layer, metal layer, photodiode, infrared filter, and micromirror included in the infrared sub-pixel are arranged in order from bottom to top.
  • the infrared sub-pixel has a photosensitive bandwidth of 750 to 1100 nm.
  • the infrared filter area formed at this time can be used to pass the infrared band, which can not only improve the imaging effect in the dark state, but also realize the infrared ranging function.
  • The RGB sub-pixels are light-receiving elements corresponding to light of the respective R, G, and B wavelengths.
  • The IR sub-pixel is a light-receiving element corresponding to infrared light.
  • The image sensor may be a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (Charge Coupled Device, CCD) image sensor, or a quantum thin-film image sensor.
  • The RGB-IR pixel array arrangement of the present disclosure does not limit the applicable image sensor type: it can be a CMOS-based image sensor, a CCD-based image sensor, a quantum-film-based image sensor, or of course another type of image sensor. The image sensor in the embodiment of the present disclosure can be applied to any electronic product including a camera module.
  • The image sensor provided by the embodiment of the present disclosure improves the pixel array arrangement from an RGB arrangement to a 2PD RGB-IR arrangement.
  • The distance between the object to be photographed and the mobile terminal is detected to achieve rapid focusing and background blurring, improve the imaging effect, and realize related application functions of stereo photography and dark-state imaging, enhancing the functionality of mobile terminals while ensuring the user's shooting experience.
  • An embodiment of the present disclosure further provides a mobile terminal, as shown in FIG. 5a and FIG. 5b, including an imaging system 51 and an infrared transmitting module 52.
  • The imaging system 51 includes the above-mentioned image sensor 511, and further includes: a lens module 512; a driving module 513 that drives the lens module 512 to move; a filtering module 514 provided between the lens module 512 and the image sensor 511; an image data processing module 515 connected to the image sensor 511; and a display module 516 connected to the image data processing module 515.
  • the mobile terminal 5 includes an imaging system 51 and an infrared transmitting module 52.
  • the imaging system 51 includes the above-mentioned image sensor 511.
  • The imaging system 51 further includes a lens module 512 for focusing light, and the lens module 512 is connected to a driving module 513.
  • the driving module 513 is used to adjust the position of the lens module 512 according to the distance of the object to be photographed.
  • a filter module 514 is provided between the lens module 512 and the image sensor 511, where light is focused through the lens module 512, and after passing through the filter module 514, it can be focused on the pixel array of the image sensor 511.
  • The image sensor 511 is connected to an image data processing module 515, and the image data processing module 515 is connected to a display module 516. After light is focused on the pixel array of the image sensor 511 and the image sensor 511 performs photoelectric conversion, the data is transmitted to the image data processing module 515; after the image data processing module 515 processes the data, it is rendered as a picture on the display module 516.
  • the 2PD pixels in the image sensor 511 can be used to obtain the phase difference, so as to obtain the distance between the object and the imaging surface, thereby achieving fast focusing.
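The 2PD phase-difference idea can be illustrated with a simplified, hypothetical 1-D sketch (not the patent's actual algorithm): the left and right photodiode views of a defocused scene are shifted copies of each other, and the shift that best aligns them indicates the defocus:

```python
import numpy as np

def pd_disparity(left, right, max_shift=8):
    """Estimate the phase difference between the left and right 2PD
    sub-images as the integer shift minimising the sum of absolute
    differences between the two views (a toy block-matching search)."""
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        cost = np.abs(left - np.roll(right, s)).sum()
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

# Example: a right view shifted by 3 sub-pixels yields a disparity of 3.
left = np.arange(32) % 7
right = np.roll(left, -3)
print(pd_disparity(left, right))  # 3
```

In a real sensor this disparity would be mapped through lens calibration to a defocus distance, which is what allows the driving module to move the lens directly to the in-focus position.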
  • the RGB + IR pixel array arrangement based on the 2PD image sensor in the present disclosure can be used with the infrared transmitting module 52 to realize stereo related functions, such as: face recognition unlocking, secure payment, stereo imaging and other terminal applications. On the basis, the functionality of the mobile terminal is improved.
  • The filter module 514 in the embodiment of the present disclosure can pass light with wavelengths from 380 nm to 1100 nm. After the light is focused by the lens module 512, it is filtered by the filter module 514, which passes both natural light and infrared light and thereby ensures the imaging effect of the imaging system 51.
  • the infrared emitting module 52 on the mobile terminal is disposed on the periphery of the lens module 512.
  • the infrared transmitting module 52 emits infrared rays, and the infrared rays will be reflected after encountering obstacles.
  • when the imaging system 51 captures the reflected infrared rays, photoelectric conversion is performed by the infrared sub-pixels to obtain the time difference between emitting the infrared rays and receiving them back. Because the propagation speed of light is fixed, the distance between the obstacle and the mobile terminal can be calculated, and ultimately the distance from each minimal unit on the obstacle to the mobile terminal can be obtained, realizing a stereo imaging and recording function. The distance between each infrared light reflection point on the obstacle and the mobile terminal can of course also be obtained from the phase difference of the infrared light.
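The round-trip timing described above reduces to one formula: distance = (time difference × speed of light) / 2. A minimal sketch — the function name and the sample timings are assumptions for illustration:

```python
C = 299_792_458  # speed of light in m/s

def tof_distance(t_emit, t_receive):
    """One-way distance to a reflection point: the infrared pulse covers
    the round trip, so halve the product of time difference and speed."""
    return (t_receive - t_emit) * C / 2

# A pulse returning after 10 ns corresponds to an obstacle about 1.5 m away.
print(round(tof_distance(0.0, 10e-9), 3))  # → 1.499
```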
  • the mobile terminal in the embodiment of the present disclosure improves the pixel array arrangement of the image sensor from an RGB arrangement to a 2PD RGB-IR arrangement. On the basis of capturing and recording images under infrared light, it detects the distance between the object to be photographed and the mobile terminal, realizes fast focusing and background blurring, improves the imaging effect, and enables stereo-photography applications and dark-state imaging, ensuring both the user's shooting experience and diverse terminal functions.
  • An embodiment of the present disclosure further provides a shooting method, which is applied to a mobile terminal.
  • the mobile terminal includes the image sensor described above, and further includes an infrared transmitting module. As shown in FIG. 6, the method includes:
  • Step 601: Transmit infrared light through an infrared transmitting module.
  • the infrared transmitting module on the mobile terminal can emit infrared rays.
  • the infrared rays will be reflected after encountering the object to be photographed, and the reflected infrared light will be received by the imaging system of the mobile terminal. Since the image sensor of the mobile terminal forms an RGB-IR pixel array, photoelectric conversion can be performed through infrared sub-pixels.
  • Step 602: Obtain the distance between each infrared light reflection point on the object to be photographed and the mobile terminal according to the infrared light reflected by the object.
  • obtaining the distance between the object to be photographed and the mobile terminal actually means obtaining the distance between the object and the imaging surface. The process of acquiring the distance between each infrared light reflection point on the object and the mobile terminal is: receive, through the pixel array containing second pixels, the infrared light reflected by each reflection point on the object; then obtain the distance between each reflection point and the mobile terminal according to the time difference between sending and receiving the infrared light and its propagation speed, or by obtaining the phase difference of the infrared light.
  • photoelectric conversion is performed by the infrared sub-pixels to obtain the time difference between infrared emission and reception. Since the propagation speed of light is fixed, the distance from the obstacle to the mobile terminal can be calculated as half the product of the time difference and the propagation speed. The time at which the mobile terminal receives the infrared light reflected by each reflection point differs, so a distance can be computed for each reflection point, yielding the distance between each reflection point and the mobile terminal. The distances may also be obtained from the phase difference of the infrared light, for which reference may be made to time-of-flight (TOF) techniques.
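Because each infrared sub-pixel records its own round-trip time, the per-point distances described above naturally form a depth map. A toy sketch, assuming a 2D array of measured time differences (the numeric values are invented):

```python
import numpy as np

C = 299_792_458  # speed of light in m/s

def depth_map(time_of_flight):
    """Convert a 2D array of round-trip time differences (seconds) into a
    2D array of one-way distances (meters), one entry per reflection point."""
    return np.asarray(time_of_flight) * C / 2

tof = np.array([[10e-9, 12e-9],
                [12e-9, 20e-9]])  # seconds, one entry per reflection point
depths = depth_map(tof)
print(np.round(depths, 2))  # distances in meters, roughly 1.5 m to 3.0 m
```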
  • the pixel array of the image sensor includes a preset number of pixel units arranged in a predetermined manner.
  • the pixel unit includes a first pixel and a second pixel adjacent to the first pixel position.
  • the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel, and the second pixel includes a red sub-pixel, a green sub-pixel, and an infrared sub-pixel; both the first pixel and the second pixel are full-pixel dual-core focus pixels. The position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel, and the first pixel and the second pixel each include four full-pixel dual-core focus sub-pixels.
  • Step 603: Obtain stereo information about the object to be photographed according to the distance between each infrared light reflection point on the object and the mobile terminal.
  • the shooting method of the embodiment of the present disclosure improves the pixel array arrangement of the image sensor from an RGB arrangement to a 2PD RGB-IR arrangement. On the basis of recording images under infrared light, it detects the distance between the object to be photographed and the mobile terminal, realizes fast focusing and background blurring to improve the imaging effect, and enables stereo-photography applications and dark-state imaging, ensuring both the user's shooting experience and diverse terminal functions.
  • FIG. 7 is a schematic diagram of a hardware structure of a mobile terminal that implements various embodiments of the present disclosure.
  • the mobile terminal 700 includes, but is not limited to, a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, and a power source 711.
  • the mobile terminal 700 further includes an imaging system and an infrared transmitting module.
  • the imaging system includes an image sensor and a lens module; a driving module for driving the lens module to move; a filter module disposed between the lens module and the image sensor; an image data processing module connected to the image sensor; and a display module connected to the image data processing module.
  • the filter module can pass light wavelengths of 380 nm to 1100 nm, and the infrared emitting module is arranged on the periphery of the lens module.
  • the image sensor includes a pixel array including a preset number of pixel units arranged in a predetermined manner.
  • the pixel unit includes a first pixel and a second pixel adjacent to the first pixel position.
  • the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel
  • the second pixel includes a red sub-pixel, a green sub-pixel, and an infrared sub-pixel
  • both the first pixel and the second pixel are full-pixel dual-core focus pixels
  • the position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel, and the first pixel and the second pixel each include four full-pixel dual-core focus sub-pixels.
  • the pixel unit includes a second pixel and at least one first pixel.
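The pixel-unit structure described above (one second pixel plus at least one first pixel, the IR sub-pixel replacing blue) can be made concrete with a toy layout generator. The string labels and the left-to-right ordering are assumptions for illustration; the patent also allows the second pixel at other positions within the unit:

```python
# 2x2 sub-pixel layouts; in hardware every sub-pixel is itself a 2PD pair.
FIRST_PIXEL = [["R", "G"], ["G", "B"]]    # standard RGB pixel
SECOND_PIXEL = [["R", "G"], ["G", "IR"]]  # IR replaces B at the same position

def pixel_unit(n_first=1):
    """Build one pixel unit: n_first first pixels followed by one second
    pixel. n_first = 1, 2, 3 gives IR densities of 1/8, 1/12, 1/16."""
    row_top, row_bottom = [], []
    for _ in range(n_first):
        row_top += FIRST_PIXEL[0]
        row_bottom += FIRST_PIXEL[1]
    row_top += SECOND_PIXEL[0]
    row_bottom += SECOND_PIXEL[1]
    return [row_top, row_bottom]

unit = pixel_unit(1)  # the 1/8-density unit of FIG. 2a
density = sum(row.count("IR") for row in unit) / sum(len(row) for row in unit)
print(unit)     # → [['R', 'G', 'R', 'G'], ['G', 'B', 'G', 'IR']]
print(density)  # → 0.125
```

Tiling such units periodically reproduces the 1/4n infrared density described in the detailed description, where n is an integer greater than or equal to 2.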
  • the red sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a red filter, and a micromirror, which are sequentially stacked.
  • the green sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a green filter, and a micromirror, which are sequentially stacked; the blue sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a blue filter, and a micromirror, which are sequentially stacked; and the infrared sub-pixel includes a semiconductor layer, a metal layer, a photodiode, an infrared filter, and a micromirror, which are sequentially stacked.
  • the image sensor is a complementary metal oxide semiconductor CMOS image sensor, a charge-coupled element CCD image sensor, or a quantum thin film image sensor.
  • the structure of the mobile terminal shown in FIG. 7 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than shown, combine certain components, or arrange components differently.
  • the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a car terminal, a wearable device, and a pedometer.
  • the processor 710 is configured to: emit infrared light through the infrared emitting module; obtain the distance between the object to be photographed and the mobile terminal according to the infrared light reflected by each infrared light reflection point on the object; and acquire stereo information of the object according to the distance between each infrared light reflection point on the object and the mobile terminal.
  • the processor 710 is configured to: receive, through a pixel array including second pixels, the infrared light reflected by each infrared light reflection point on the object to be photographed; and obtain the distance between each infrared light reflection point on the object and the mobile terminal according to the time difference between sending and receiving the infrared light and its propagation speed, or by obtaining the phase difference of the infrared light.
  • the radio frequency unit 701 may be used to receive and send signals during information transmission and reception or during a call; specifically, downlink data from the base station is received and passed to the processor 710 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 701 can also communicate with a network and other devices through a wireless communication system.
  • the mobile terminal provides users with wireless broadband Internet access through the network module 702, such as helping users to send and receive email, browse web pages, and access streaming media.
  • the audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into audio signals and output them as sound. Also, the audio output unit 703 may also provide audio output (for example, a call signal receiving sound, a message receiving sound, etc.) related to a specific function performed by the mobile terminal 700.
  • the audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 704 is configured to receive an audio or video signal.
  • the input unit 704 may include a graphics processing unit (GPU) 7041 and a microphone 7042.
  • the graphics processor 7041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 706, where the display unit is the above-mentioned display module.
  • the image frames processed by the graphics processor 7041 may be stored in the memory 709 (or other storage medium) or transmitted via the radio frequency unit 701 or the network module 702.
  • the graphics processor 7041 is the above-mentioned image data processing module.
  • the microphone 7042 can receive sound, and can process such sound into audio data.
  • in the telephone call mode, the processed audio data can be converted into a format transmittable to a mobile communication base station via the radio frequency unit 701 and output.
  • the mobile terminal 700 further includes at least one sensor 705, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 7061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 7061 and/or the backlight when the mobile terminal 700 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used to identify the attitude of the mobile terminal (such as landscape/portrait switching, related games, magnetometer attitude calibration) and for vibration-recognition functions (such as pedometer or tapping). The sensor 705 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here.
  • the display unit 706 is configured to display information input by the user or information provided to the user.
  • the display unit 706 may include a display panel 7061, and the display panel 7061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 707 may be used to receive inputted numeric or character information, and generate key signal inputs related to user settings and function control of the mobile terminal.
  • the user input unit 707 includes a touch panel 7071 and other input devices 7072.
  • the touch panel 7071, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed on or near the touch panel 7071 with a finger, stylus, or any other suitable object or accessory).
  • the touch panel 7071 may include two parts, a touch detection device and a touch controller.
  • the touch detection device detects the position of the user's touch and the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 710, and receives and executes commands sent by the processor 710.
  • various types such as resistive, capacitive, infrared, and surface acoustic wave can be used to implement the touch panel 7071.
  • the user input unit 707 may further include other input devices 7072.
  • other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, and details are not described herein again.
  • the touch panel 7071 may be overlaid on the display panel 7061.
  • when the touch panel 7071 detects a touch operation on or near it, it transmits the operation to the processor 710 to determine the type of the touch event; the processor 710 then provides a corresponding visual output on the display panel 7061 according to the type of the touch event.
  • although the touch panel 7071 and the display panel 7061 are implemented as two independent components for the input and output functions of the mobile terminal, in some embodiments the touch panel 7071 and the display panel 7061 can be integrated to implement those functions; this is not specifically limited here.
  • the interface unit 708 is an interface through which an external device is connected to the mobile terminal 700.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on.
  • the interface unit 708 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 700, or to transfer data between the mobile terminal 700 and the external device.
  • the memory 709 may be used to store software programs and various data.
  • the memory 709 may mainly include a program storage area and a data storage area, where the program storage area may store the operating system and the application programs required by at least one function (such as a sound playback function or an image playback function), and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book).
  • the memory 709 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 710 is the control center of the mobile terminal; it connects all parts of the entire mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 709 and invoking the data stored in the memory 709, thereby monitoring the mobile terminal as a whole.
  • the processor 710 may include one or more processing units; optionally, the processor 710 may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 710.
  • the mobile terminal 700 may further include a power source 711 (such as a battery) for supplying power to the various components; optionally, the power source 711 may be logically connected to the processor 710 through a power management system, thereby implementing functions such as charge management, discharge management, and power consumption management through the power management system.
  • the mobile terminal 700 includes some functional modules that are not shown, and details are not described herein again.
  • the disclosed apparatus and method may be implemented in other ways.
  • the device embodiments described above are merely illustrative; the division into units is only a division by logical function, and in actual implementation there may be other divisions, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, which may be electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objective of the solution of this embodiment.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, or each of the units may exist separately physically, or two or more units may be integrated into one unit.


Abstract

The present disclosure provides an image sensor, a mobile terminal, and a shooting method. The image sensor includes a pixel array including a preset number of pixel units arranged in a predetermined manner, each pixel unit including a first pixel and a second pixel adjacent to the first pixel. The first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel; the second pixel includes a red sub-pixel, a green sub-pixel, and an infrared sub-pixel; both the first pixel and the second pixel are full-pixel dual-core focus pixels. The position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel, and the first pixel and the second pixel each include four full-pixel dual-core focus sub-pixels.

Description

Image Sensor, Mobile Terminal, and Shooting Method
Cross-Reference to Related Applications
This application claims priority to Chinese Patent Application No. 201810797261.9, filed in China on July 19, 2018, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of communication technologies, and in particular to an image sensor, a mobile terminal, and a shooting method.
Background
At present, smart electronic products have gradually become daily necessities, and the photographing function, as an important feature of electronic products, keeps developing. With the spread and popularization of photographing, users are no longer satisfied with a camera that merely takes pictures; they expect diversified shooting effects, shooting modes, and functions.
Among image sensors based on complementary metal oxide semiconductor (CMOS) technology currently on the market, the most common pixel array arrangement is the red (R), green (G), blue (B) Bayer pattern shown in FIG. 1a. This arrangement, however, cannot detect object distance; it accepts only natural light and records images under normal illumination.
The pixel array arrangement of the full-pixel dual-core focus (Dual PhotoDiode, 2PD) technology is shown in FIG. 1b and FIG. 1c. This arrangement likewise accepts only natural light for recording images, but compared with the phase detection auto focus (PDAF) scheme it can additionally detect object distance and complete the focusing action faster.
The principle of 2PD phase detection is as follows. As can be seen from FIG. 1b and FIG. 1c, some of the R, G, and B sub-pixels in the pixel array are split in two. Since the light energy captured differs with the incident direction, a left sub-pixel point and a right sub-pixel point form a phase detection pair. When the luminance values of the left and right sub-pixel points both reach their relative maximum peaks, the image is at its sharpest, i.e., in focus; the object distance is then computed by an algorithm to achieve fast focusing.
In summary, the CMOS Bayer pixel array arrangement cannot detect object distance and accepts only natural light, and although the 2PD pixel array arrangement can detect object distance, it too accepts only natural light. The pixel array arrangements of image sensors in the related art therefore suffer from limited shooting scenarios and slow focusing, degrading the user's shooting experience.
Summary
Embodiments of the present disclosure provide an image sensor, a mobile terminal, and a shooting method, to solve the problems in the related art of limited shooting scenarios and slow focusing, which degrade the user's shooting experience.
To solve the above problems, the embodiments of the present disclosure are implemented as follows.
In a first aspect, an embodiment of the present disclosure provides an image sensor, including:
a pixel array including a preset number of pixel units arranged in a predetermined manner, each pixel unit including a first pixel and a second pixel adjacent to the first pixel, the first pixel including a red sub-pixel, a green sub-pixel, and a blue sub-pixel, the second pixel including a red sub-pixel, a green sub-pixel, and an infrared sub-pixel, and both the first pixel and the second pixel being full-pixel dual-core focus pixels;
wherein the position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel, and the first pixel and the second pixel each include four full-pixel dual-core focus sub-pixels.
In a second aspect, an embodiment of the present disclosure further provides a mobile terminal including an imaging system and an infrared emitting module, the imaging system including the above image sensor and further including:
a lens module;
a driving module for driving the lens module to move;
a filter module disposed between the lens module and the image sensor;
an image data processing module connected to the image sensor; and
a display module connected to the image data processing module.
In a third aspect, an embodiment of the present disclosure further provides a shooting method applied to a mobile terminal, the mobile terminal including the above image sensor and further including an infrared emitting module, the method including:
emitting infrared light through the infrared emitting module;
obtaining the distance between each infrared light reflection point on an object to be photographed and the mobile terminal according to the infrared light reflected by the object; and
acquiring stereo information of the object according to the distance between each infrared light reflection point on the object and the mobile terminal.
In the technical solutions of the present disclosure, the pixel array arrangement of the image sensor is improved from an RGB arrangement to a 2PD RGB-IR arrangement. On the basis of capturing and recording images under infrared light, the distance between the object to be photographed and the mobile terminal can be detected, enabling fast focusing and background blurring, improving the imaging effect, and enabling stereo-photography applications and dark-state imaging, thereby enhancing the functionality of the mobile terminal while ensuring the user's shooting experience.
Brief Description of the Drawings
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present disclosure; all other embodiments obtained by a person of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the present disclosure.
FIG. 1a is a schematic diagram of a conventional RGB arrangement in the related art;
FIG. 1b is a 2PD pixel array arrangement diagram;
FIG. 1c is a 2PD pixel cross-sectional view;
FIG. 2a is a first schematic diagram of a pixel array arrangement according to an embodiment of the present disclosure;
FIG. 2b is a cross-sectional view of a pixel point according to an embodiment of the present disclosure;
FIG. 3a is a first schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 3b is a second schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 3c is a third schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 3d is a second schematic diagram of a pixel array arrangement according to an embodiment of the present disclosure;
FIG. 4a is a fourth schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 4b is a fifth schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 4c is a sixth schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 4d is a seventh schematic diagram of a pixel unit according to an embodiment of the present disclosure;
FIG. 4e is a third schematic diagram of a pixel array arrangement according to an embodiment of the present disclosure;
FIG. 5a is a schematic diagram of a mobile terminal according to an embodiment of the present disclosure;
FIG. 5b is a schematic diagram of an imaging system according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a shooting method according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present disclosure; all other embodiments obtained by a person of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the present disclosure.
An embodiment of the present disclosure provides an image sensor. As shown in FIG. 2a, FIG. 3a to FIG. 3d, and FIG. 4a to FIG. 4e, the image sensor includes:
a pixel array including a preset number of pixel units arranged in a predetermined manner, each pixel unit including a first pixel and a second pixel adjacent to the first pixel, the first pixel including a red sub-pixel, a green sub-pixel, and a blue sub-pixel, the second pixel including a red sub-pixel, a green sub-pixel, and an infrared sub-pixel, and both the first pixel and the second pixel being full-pixel dual-core focus pixels; wherein the position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel, and the first pixel and the second pixel each include four full-pixel dual-core focus sub-pixels.
The pixel array of the image sensor provided by the embodiments of the present disclosure contains a preset number of pixel units arranged in a predetermined manner. Each pixel unit includes a first pixel and an adjacent second pixel, whose sub-pixels differ: the first pixel contains a red sub-pixel (R), a green sub-pixel (G), and a blue sub-pixel (B), while the second pixel contains a red sub-pixel, a green sub-pixel, and an infrared sub-pixel (Infrared Radiation, IR). By providing an infrared sub-pixel in the second pixel, images can be captured under infrared light, realizing dark-state imaging and ensuring the user's shooting experience.
The first pixel and the second pixel in the embodiments of the present disclosure are both full-pixel dual-core focus (2PD) pixels; 2PD pixels make it possible to detect object distance and complete the focusing action faster. The first pixel and the second pixel each contain four sub-pixels, all of which are 2PD sub-pixels.
The position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel. Specific forms of the first and second pixels are described below.
The red, green, and blue sub-pixels in the first pixel are arranged in a certain manner: the first pixel contains one red sub-pixel, one blue sub-pixel, and two green sub-pixels, called here the first green sub-pixel and the second green sub-pixel for ease of distinction, the two being identical. The red sub-pixel is adjacent to the first green sub-pixel, the second green sub-pixel is below the red sub-pixel, the blue sub-pixel is below the first green sub-pixel, and the second green sub-pixel is adjacent to the blue sub-pixel.
The red, green, and infrared sub-pixels in the second pixel are arranged in a certain manner: the second pixel contains one red sub-pixel, one infrared sub-pixel, and two green sub-pixels, again called the first and second green sub-pixels, the two being identical. The red sub-pixel is adjacent to the first green sub-pixel, the second green sub-pixel is below the red sub-pixel, the infrared sub-pixel is below the first green sub-pixel, and the second green sub-pixel is adjacent to the infrared sub-pixel.
In the embodiments of the present disclosure, by improving the RGB pixel array arrangement into a red-green-blue-infrared (Red Green Blue Infrared Radiation, RGB-IR) arrangement, the mobile terminal can capture images under infrared light, realizing dark-state imaging and ensuring the user's shooting experience.
Meanwhile, the image sensor of the embodiments of the present disclosure can detect the distance between the object to be photographed and the mobile terminal, realizing fast focusing and background blurring; in cooperation with an infrared emitting module it can improve the imaging effect and enable stereo-photography applications, enhancing the functionality of the mobile terminal while ensuring the user's shooting experience.
In the embodiments of the present disclosure, as shown in FIG. 2a, FIG. 3a to FIG. 3d, and FIG. 4a to FIG. 4e, a pixel unit includes one second pixel and at least one first pixel.
In the pixel array provided by the embodiments of the present disclosure, each pixel unit contains one second pixel and at least one first pixel. Specific forms of the first and second pixels contained in a pixel unit are illustrated below by example.
As shown in FIG. 2a, when a pixel unit contains one first pixel and one second pixel, the density of infrared sub-pixels in the pixel unit is 1/8. The pixel array shown in FIG. 2a includes eight pixel units, each containing one first pixel and one second pixel. The second pixel includes two green sub-pixels, one red sub-pixel, and one infrared sub-pixel; that is, the infrared sub-pixel replaces the blue sub-pixel of the first pixel.
The position of the infrared sub-pixel within a pixel unit (the position of the second pixel) is not specifically limited: within a pixel unit the first pixel may be followed by the second pixel, or the second pixel may be followed by the first pixel. Pixel units are formed from first and second pixels, and the formed pixel units are combined into the pixel array shown in FIG. 2a, in which the arrangement of the first and second pixels may differ between pixel units, i.e., the position of the second pixel may vary. The pixel array of FIG. 2a is merely an example, and those skilled in the art may make appropriate variations.
In this case there are two kinds of pixel units, a first pixel unit and a second pixel unit: in the first pixel unit the first pixel and the second pixel are arranged along a first direction, and in the second pixel unit the second pixel and the first pixel are arranged along the first direction.
Two identical first pixel units are arranged along the first direction, two identical second pixel units are arranged along the first direction, and a first pixel unit, a second pixel unit, a first pixel unit, and a second pixel unit are arranged in sequence along a second direction perpendicular to the first direction, forming the pixel array shown in FIG. 2a.
As shown in FIG. 3a to FIG. 3c, when a pixel unit contains two first pixels and one second pixel, the density of infrared sub-pixels in the pixel unit is 1/12. The position of the infrared sub-pixel (the position of the second pixel) is not specifically limited: in FIG. 3a two first pixels and one second pixel are arranged in sequence, with the second pixel on the right of the unit; in FIG. 3b a first pixel, the second pixel, and a first pixel are arranged in sequence, with the second pixel in the middle; in FIG. 3c the second pixel and two first pixels are arranged in sequence, with the second pixel on the left. Pixel arrays can be formed by combining the pixel units of FIG. 3a, FIG. 3b, and FIG. 3c.
In this case the pixel unit of FIG. 3a is a third pixel unit, that of FIG. 3b a fourth pixel unit, and that of FIG. 3c a fifth pixel unit.
The pixel array formed in FIG. 3d includes eight pixel units: two identical third pixel units arranged along the first direction, two identical fifth pixel units arranged along the first direction, and a third, a fifth, a third, and a fifth pixel unit arranged in sequence along the second direction perpendicular to the first direction. The arrangement shown in FIG. 3d is merely an example; when a pixel unit contains two first pixels and one second pixel, the present disclosure covers many other arrangements, which are not enumerated here one by one.
As shown in FIG. 4a, when a pixel unit contains three first pixels and one second pixel, the density of infrared sub-pixels in the pixel unit is 1/16. The position of the infrared sub-pixel (the position of the second pixel) is not limited: in FIG. 4a two first pixels are arranged in the upper row and another first pixel and the second pixel in the lower row, with the second pixel at the lower right of the unit; in FIG. 4b a first pixel and the second pixel are in the upper row and two first pixels in the lower row, with the second pixel at the upper right; in FIG. 4c two first pixels are in the upper row and the second pixel and a first pixel in the lower row, with the second pixel at the lower left; in FIG. 4d the second pixel and a first pixel are in the upper row and two first pixels in the lower row, with the second pixel at the upper left.
In this case there are four kinds of pixel units, the sixth through ninth pixel units: FIG. 4a shows the sixth pixel unit, FIG. 4b the seventh, FIG. 4c the eighth, and FIG. 4d the ninth.
The pixel array formed in FIG. 4e includes six pixel units: a sixth, a seventh, and a sixth pixel unit arranged along the first direction, and an eighth, a ninth, and an eighth pixel unit arranged along the first direction, with the eighth pixel unit below the sixth and the ninth below the seventh. The arrangement shown in FIG. 4e is merely an example; when a pixel unit contains three first pixels and one second pixel, the present disclosure covers many other arrangements, which are not enumerated here one by one.
The pixel units of the embodiments of the present disclosure are not limited to these forms, and the sampling position of the infrared sub-pixel within a pixel unit (the position of the second pixel) is not restricted here. The density of infrared sub-pixels in a pixel unit is 1/4n, where n is an integer greater than or equal to 2. The pixel array may thus be composed of RGB+IR pixel units of 1/8 density, 1/12 density, or 1/16 density serving as a unit array, which is then arranged in a periodic tiling. The pixel array may of course take other forms not enumerated here, and the size of the pixel array to which the infrared sub-pixel points apply is not limited.
In the embodiments of the present disclosure, the red sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a red filter, and a micromirror stacked in sequence; the green sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a green filter, and a micromirror stacked in sequence; the blue sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a blue filter, and a micromirror stacked in sequence; and the infrared sub-pixel includes a semiconductor layer, a metal layer, a photodiode, an infrared filter, and a micromirror stacked in sequence.
The semiconductor layer, metal layer, photodiode, red filter, and micromirror of the red sub-pixel are arranged in order from bottom to top, and likewise for the green and blue sub-pixels with their respective filters. The semiconductor layer here may be a silicon substrate, but is not limited thereto. The structures of the green and infrared sub-pixels are shown in FIG. 2b; the sub-pixels differ only in their filters, so replacing the green filter shown in FIG. 2b with a red or blue filter yields the structure of the red or blue sub-pixel.
The red, green, and blue filters are used to acquire the color information of the pixels of the composite image and block the entry of infrared light; for example, they admit only visible light with wavelengths of 380-700 nm, so that color-complete, lifelike images can be generated directly under high illuminance.
The semiconductor layer, metal layer, photodiode, infrared filter, and micromirror of the infrared sub-pixel are arranged in order from bottom to top. The photosensitive bandwidth of the infrared sub-pixel is 750-1100 nm; the infrared filtering region thus formed passes the infrared band, which not only improves dark-state imaging but also enables an infrared ranging function.
As can be seen from the above, the RGB sub-pixel points are light-receiving elements corresponding to the wavelengths of the respective RGB colors, and the IR sub-pixel point is a light-receiving element corresponding to infrared light.
In the embodiments of the present disclosure, the image sensor is a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (Charge Coupled Device, CCD) image sensor, or a quantum thin-film image sensor.
The RGB-IR pixel array arrangement of the present disclosure is not limited to a particular type of image sensor: it may be a CMOS-based image sensor, a CCD-based image sensor, a quantum thin-film-based image sensor, or of course another type of image sensor. The image sensor of the embodiments of the present disclosure is applicable to any electronic product containing a camera module.
The image sensor provided by the embodiments of the present disclosure improves the pixel array arrangement from an RGB arrangement to a 2PD RGB-IR arrangement. On the basis of capturing and recording images under infrared light, it can detect the distance between the object to be photographed and the mobile terminal, realizing fast focusing and background blurring, improving the imaging effect, and enabling stereo-photography applications and dark-state imaging, enhancing the functionality of the mobile terminal while ensuring the user's shooting experience.
An embodiment of the present disclosure further provides a mobile terminal. As shown in FIG. 5a and FIG. 5b, the mobile terminal includes an imaging system 51 and an infrared emitting module 52. The imaging system 51 includes the above image sensor 511 and further includes: a lens module 512; a driving module 513 for driving the lens module 512 to move; a filter module 514 disposed between the lens module 512 and the image sensor 511; an image data processing module 515 connected to the image sensor 511; and a display module 516 connected to the image data processing module 515.
The mobile terminal 5 of the embodiment of the present disclosure includes the imaging system 51 and the infrared emitting module 52. The imaging system 51 includes the image sensor 511 described above and a lens module 512 for focusing light; the lens module 512 is connected to the driving module 513, which adjusts the position of the lens module 512 according to the distance of the object to be photographed.
The filter module 514 is disposed between the lens module 512 and the image sensor 511; light focused by the lens module 512 passes through the filter module 514 and is focused onto the pixel array of the image sensor 511. The image sensor 511 is connected to the image data processing module 515, which is connected to the display module 516. After light is focused onto the pixel array of the image sensor 511, the image sensor 511 performs photoelectric conversion and transmits the data to the image data processing module 515, which processes the data and presents it as a picture on the display module 516.
After the driving module 513 adjusts the position of the lens module 512, the 2PD pixels in the image sensor 511 can be used to obtain the phase difference and thus the distance between the object and the imaging surface, achieving fast focusing.
In addition, the RGB+IR pixel array arrangement based on the 2PD image sensor in the present disclosure can cooperate with the infrared emitting module 52 to realize stereo-related functions, such as face recognition unlocking, secure payment, and stereo imaging, improving the functionality of the mobile terminal on the basis of guaranteed imaging.
The filter module 514 in the embodiment of the present disclosure passes light wavelengths from 380 nm to 1100 nm. After light is focused by the lens module 512, it is filtered by the filter module 514, which passes both natural light and infrared light, ensuring the imaging effect of the imaging system 51.
The infrared emitting module 52 on the mobile terminal is disposed on the periphery of the lens module 512. The infrared emitting module 52 emits infrared rays, which are reflected when they encounter an obstacle. When the imaging system 51 captures the reflected infrared light, photoelectric conversion is performed by the infrared sub-pixels to obtain the time difference between emitting and receiving the infrared rays; since the propagation speed of light is fixed, the distance between the obstacle and the mobile terminal can be calculated, and ultimately the distance from each minimal unit on the obstacle to the mobile terminal can be obtained, realizing a stereo imaging and recording function. The distance between each infrared light reflection point on the obstacle and the mobile terminal may of course also be obtained from the phase difference of the infrared light.
By improving the pixel array arrangement of the image sensor from an RGB arrangement to a 2PD RGB-IR arrangement, the mobile terminal of the embodiment of the present disclosure can, on the basis of capturing and recording images under infrared light, detect the distance between the object to be photographed and the mobile terminal, realize fast focusing and background blurring, improve the imaging effect, and enable stereo-photography applications and dark-state imaging, ensuring both the user's shooting experience and diverse terminal functions.
An embodiment of the present disclosure further provides a shooting method applied to a mobile terminal. The mobile terminal includes the above image sensor and further includes an infrared emitting module. As shown in FIG. 6, the method includes:
Step 601: Emit infrared light through the infrared emitting module.
The infrared emitting module on the mobile terminal emits infrared rays, which are reflected when they encounter the object to be photographed; the reflected infrared light is received by the imaging system of the mobile terminal. Since the image sensor of the mobile terminal forms an RGB-IR pixel array, photoelectric conversion can be performed by the infrared sub-pixels.
Step 602: Obtain the distance between each infrared light reflection point on the object to be photographed and the mobile terminal according to the infrared light reflected by the object.
Obtaining the distance between the object and the mobile terminal actually means obtaining the distance between the object and the imaging surface. The process of obtaining the distance between each infrared light reflection point on the object and the mobile terminal is: receive, through the pixel array containing second pixels, the infrared light reflected by each reflection point on the object; then obtain the distance between each reflection point and the mobile terminal according to the time difference between sending and receiving the infrared light and its propagation speed, or by obtaining the phase difference of the infrared light.
When the reflected infrared light is captured, photoelectric conversion is performed by the infrared sub-pixels to obtain the time difference between infrared emission and reception. Since the propagation speed of light is fixed, the distance from the obstacle to the mobile terminal can be calculated as half the product of the time difference and the propagation speed. The time at which the mobile terminal receives the light reflected from each reflection point differs, so a distance can be computed for each reflection point, yielding the distance between each reflection point and the mobile terminal. The distance between each reflection point and the mobile terminal may also be obtained from the phase difference of the infrared light, for which reference may be made to time-of-flight (Time Of Flight, TOF) techniques, not elaborated here.
The pixel array of the image sensor in the embodiment of the present disclosure includes a preset number of pixel units arranged in a predetermined manner; each pixel unit includes a first pixel and a second pixel adjacent to it; the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel, the second pixel includes a red sub-pixel, a green sub-pixel, and an infrared sub-pixel, and both the first and second pixels are full-pixel dual-core focus pixels. The position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel, and the first and second pixels each include four full-pixel dual-core focus sub-pixels.
Step 603: Acquire stereo information of the object according to the distance between each infrared light reflection point on the object and the mobile terminal.
Obtaining the distance between the object and the mobile terminal specifically means obtaining the distance from each minimal unit on the object to the mobile terminal; the shooting flow for the object is then executed, realizing a stereo imaging and recording function.
The shooting method of the embodiment of the present disclosure improves the pixel array arrangement of the image sensor from an RGB arrangement to a 2PD RGB-IR arrangement. On the basis of recording images under infrared light, it detects the distance between the object to be photographed and the mobile terminal, realizes fast focusing and background blurring to improve the imaging effect, and enables stereo-photography applications and dark-state imaging, ensuring both the user's shooting experience and diverse terminal functions.
FIG. 7 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present disclosure. The mobile terminal 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, and a power source 711.
The mobile terminal 700 further includes an imaging system and an infrared emitting module. The imaging system includes an image sensor and a lens module; a driving module for driving the lens module to move; a filter module disposed between the lens module and the image sensor; an image data processing module connected to the image sensor; and a display module connected to the image data processing module.
The filter module passes light wavelengths from 380 nm to 1100 nm, and the infrared emitting module is disposed on the periphery of the lens module.
The image sensor includes a pixel array including a preset number of pixel units arranged in a predetermined manner; each pixel unit includes a first pixel and a second pixel adjacent to it; the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel, the second pixel includes a red sub-pixel, a green sub-pixel, and an infrared sub-pixel, and both the first and second pixels are full-pixel dual-core focus pixels. The position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel, and the first and second pixels each include four full-pixel dual-core focus sub-pixels.
A pixel unit includes one second pixel and at least one first pixel.
The red sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a red filter, and a micromirror stacked in sequence; the green sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a green filter, and a micromirror stacked in sequence; the blue sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a blue filter, and a micromirror stacked in sequence; and the infrared sub-pixel includes a semiconductor layer, a metal layer, a photodiode, an infrared filter, and a micromirror stacked in sequence.
The image sensor is a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, or a quantum thin-film image sensor.
Those skilled in the art will appreciate that the mobile terminal structure shown in FIG. 7 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than shown, combine certain components, or arrange components differently. In the embodiments of the present disclosure, mobile terminals include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, in-vehicle terminals, wearable devices, pedometers, and the like.
The processor 710 is configured to: emit infrared light through the infrared emitting module; obtain the distance between the object to be photographed and the mobile terminal according to the infrared light reflected by each infrared light reflection point on the object; and acquire stereo information of the object according to the distance between each infrared light reflection point on the object and the mobile terminal.
When obtaining the distance between each infrared light reflection point on the object and the mobile terminal according to the infrared light reflected by the object, the processor 710 is configured to: receive, through the pixel array containing second pixels, the infrared light reflected by each reflection point on the object; and obtain the distance between each reflection point and the mobile terminal according to the time difference between sending and receiving the infrared light and its propagation speed, or by obtaining the phase difference of the infrared light.
In this way, by improving the pixel array arrangement of the image sensor from an RGB arrangement to a 2PD RGB-IR arrangement, the distance between the object to be photographed and the mobile terminal can be detected on the basis of capturing and recording images under infrared light, realizing fast focusing and background blurring, improving the imaging effect, and enabling stereo-photography applications and dark-state imaging, ensuring both the user's shooting experience and diverse terminal functions.
It should be understood that, in the embodiments of the present disclosure, the radio frequency unit 701 may be used to receive and send signals during information transmission and reception or during a call; specifically, downlink data from a base station is received and passed to the processor 710 for processing, and uplink data is sent to the base station. Generally, the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with networks and other devices through a wireless communication system.
The mobile terminal provides users with wireless broadband Internet access through the network module 702, for example helping users to send and receive e-mail, browse web pages, and access streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702, or stored in the memory 709, into audio signals and output them as sound. Moreover, the audio output unit 703 may also provide audio output related to a specific function performed by the mobile terminal 700 (for example, a call signal reception sound or a message reception sound). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is configured to receive audio or video signals. The input unit 704 may include a graphics processing unit (GPU) 7041 and a microphone 7042; the graphics processor 7041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 706, the display unit here being the display module described above. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or another storage medium) or transmitted via the radio frequency unit 701 or the network module 702; the graphics processor 7041 is the image data processing module described above. The microphone 7042 can receive sound and process it into audio data; in the telephone call mode, the processed audio data can be converted into a format transmittable to a mobile communication base station via the radio frequency unit 701 and output.
移动终端700还包括至少一种传感器705,比如光传感器、运动传感器以及其它传感器。具体地,光传感器包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板7061的亮度,接近传感器可在移动终端700移动到耳边时,关闭显示面板7061和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别移动终端姿态(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;传感器705还可以包括指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等,在此不再赘述。
显示单元706用于显示由用户输入的信息或提供给用户的信息。显示单元706可包括显示面板7061,可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板7061。
用户输入单元707可用于接收输入的数字或字符信息,以及产生与移动终端的用户设置以及功能控制有关的键信号输入。具体地,用户输入单元707包括触控面板7071以及其它输入设备7072。触控面板7071,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板7071上或在触控面板7071附近的操作)。触控面板7071可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器710,接收处理器710发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板7071。除了触控面板7071,用户输入单元707还可以包括其它输入设备7072。具体地,其它输入设备7072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
Further, the touch panel 7071 may cover the display panel 7061. After detecting a touch operation on or near it, the touch panel 7071 transmits the operation to the processor 710 to determine the type of the touch event, and the processor 710 then provides corresponding visual output on the display panel 7061 according to the type of the touch event. Although the touch panel 7071 and the display panel 7061 are shown in FIG. 7 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments the touch panel 7071 and the display panel 7061 may be integrated to implement those functions. This is not specifically limited herein.
The interface unit 708 is an interface for connecting an external device to the mobile terminal 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be configured to receive input (such as data information and power) from the external device and transmit the received input to one or more elements within the mobile terminal 700, or may be configured to transmit data between the mobile terminal 700 and the external device.
The memory 709 may be configured to store software programs and various data. The memory 709 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 709 may include a high-speed random access memory, and may further include a non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 710 is the control center of the mobile terminal. It connects all parts of the entire mobile terminal through various interfaces and lines, and it performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 709 and invoking the data stored in the memory 709, thereby monitoring the mobile terminal as a whole. The processor 710 may include one or more processing units. Optionally, the processor 710 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 710.
The mobile terminal 700 may further include a power supply 711 (such as a battery) that supplies power to each component. Optionally, the power supply 711 may be logically connected to the processor 710 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the mobile terminal 700 includes some functional modules that are not shown. Details are not described herein.
It should be noted that, in this document, the terms "comprise", "include", and any of their variants are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a list of elements includes not only those elements but also other elements not expressly listed, or further includes elements inherent to such a process, method, article, or apparatus. Unless otherwise restricted, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that includes the element.
From the description of the foregoing implementations, a person skilled in the art can clearly understand that the methods of the foregoing embodiments may be implemented by software plus a necessary general-purpose hardware platform, or certainly by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present disclosure essentially, or the part contributing to the related art, may be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a read-only memory (ROM)/random access memory (RAM), a magnetic disk, or an optical disc) and includes several instructions that cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present disclosure.
A person skilled in the art can clearly understand that, for convenience and brevity of description, reference may be made to the corresponding processes in the foregoing method embodiments for the specific working processes of the systems, apparatuses, and units described above. Details are not described herein again.
In the embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative. For example, the division into units is merely a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
The embodiments of the present disclosure are described above with reference to the accompanying drawings, but the present disclosure is not limited to the foregoing specific implementations, which are merely illustrative rather than restrictive. Inspired by the present disclosure, a person of ordinary skill in the art may derive many other forms without departing from the spirit of the present disclosure and the scope protected by the claims, all of which fall within the protection of the present disclosure.

Claims (9)

  1. An image sensor, comprising:
    a pixel array, wherein the pixel array comprises a preset number of pixel units arranged in a predetermined manner, each pixel unit comprises a first pixel and a second pixel adjacent to the first pixel, the first pixel comprises a red sub-pixel, a green sub-pixel, and a blue sub-pixel, the second pixel comprises a red sub-pixel, a green sub-pixel, and an infrared sub-pixel, and both the first pixel and the second pixel are all-pixel dual-core focusing pixels;
    wherein the position of the blue sub-pixel in the first pixel is the same as the position of the infrared sub-pixel in the second pixel, and the first pixel and the second pixel each comprise four all-pixel dual-core focusing sub-pixels.
  2. The image sensor according to claim 1, wherein each pixel unit comprises one second pixel and at least one first pixel.
  3. The image sensor according to claim 1, wherein the red sub-pixel comprises a semiconductor layer, a metal layer, a photodiode, a red filter, and a micro-lens that are stacked in sequence;
    the green sub-pixel comprises a semiconductor layer, a metal layer, a photodiode, a green filter, and a micro-lens that are stacked in sequence;
    the blue sub-pixel comprises a semiconductor layer, a metal layer, a photodiode, a blue filter, and a micro-lens that are stacked in sequence; and
    the infrared sub-pixel comprises a semiconductor layer, a metal layer, a photodiode, an infrared filter, and a micro-lens that are stacked in sequence.
  4. The image sensor according to claim 1, wherein the image sensor is a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, or a quantum-film image sensor.
  5. A mobile terminal, comprising an imaging system and an infrared emission module, wherein the imaging system comprises the image sensor according to any one of claims 1 to 4, and further comprises:
    a lens module;
    a driving module configured to drive the lens module to move;
    a filtering module disposed between the lens module and the image sensor;
    an image data processing module connected to the image sensor; and
    a display module connected to the image data processing module.
  6. The mobile terminal according to claim 5, wherein the filtering module passes light with wavelengths from 380 nm to 1100 nm.
  7. The mobile terminal according to claim 5, wherein the infrared emission module is disposed at the periphery of the lens module.
  8. A photographing method, applied to a mobile terminal, wherein the mobile terminal comprises the image sensor according to claim 1 and further comprises an infrared emission module, and the method comprises:
    emitting infrared light through the infrared emission module;
    obtaining, according to the infrared light reflected by an object to be photographed, the distance between each infrared reflection point on the object to be photographed and the mobile terminal; and
    acquiring stereoscopic information about the object to be photographed according to the distance between each infrared reflection point on the object to be photographed and the mobile terminal.
  9. The photographing method according to claim 8, wherein the step of obtaining, according to the infrared light reflected by the object to be photographed, the distance between each infrared reflection point on the object to be photographed and the mobile terminal comprises:
    receiving, through the pixel array including the second pixels, the infrared light reflected by each infrared reflection point on the object to be photographed; and
    obtaining the distance between each infrared reflection point on the object to be photographed and the mobile terminal according to the time difference between sending and receiving the infrared light together with the propagation speed of the infrared light, or by obtaining the phase difference of the infrared light.
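The phase-difference alternative recited in claim 9 corresponds to indirect time-of-flight measurement: with amplitude-modulated infrared light, the measured phase shift maps to distance as d = c·Δφ / (4π·f). A minimal sketch, assuming a 20 MHz modulation frequency chosen purely for illustration (the patent does not specify one):

```python
import math

C = 299_792_458.0  # propagation speed of light, m/s

def distance_from_phase(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF: distance from the phase shift of modulated infrared light.

    The light covers the round trip, hence d = c * phase / (4 * pi * f).
    The result is unambiguous only for phase < 2*pi, i.e. d < c / (2 * f).
    """
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# At 20 MHz modulation, a phase shift of pi/2 corresponds to about 1.87 m.
print(round(distance_from_phase(math.pi / 2, 20e6), 2))  # 1.87
```

Lower modulation frequencies extend the unambiguous range at the cost of depth resolution, which is why practical indirect-ToF systems often combine several frequencies.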
PCT/CN2019/094545 2018-07-19 2019-07-03 Image sensor, mobile terminal, and photographing method WO2020015532A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19838328.3A EP3826285A4 (en) 2018-07-19 2019-07-03 IMAGE SENSOR, MOBILE TERMINAL, AND PHOTOGRAPHY PROCESS
US17/152,266 US11463641B2 (en) 2018-07-19 2021-01-19 Image sensor, mobile terminal, and photographing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810797261.9 2018-07-19
CN201810797261.9A CN108900751A (zh) Image sensor, mobile terminal, and photographing method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/152,266 Continuation US11463641B2 (en) 2018-07-19 2021-01-19 Image sensor, mobile terminal, and photographing method

Publications (1)

Publication Number Publication Date
WO2020015532A1 true WO2020015532A1 (zh) 2020-01-23

Family

ID=64351060

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/094545 WO2020015532A1 (zh) 2018-07-19 2019-07-03 Image sensor, mobile terminal, and photographing method

Country Status (4)

Country Link
US (1) US11463641B2 (zh)
EP (1) EP3826285A4 (zh)
CN (1) CN108900751A (zh)
WO (1) WO2020015532A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12009379B2 (en) * 2017-05-01 2024-06-11 Visera Technologies Company Limited Image sensor
CN108900772A (zh) * 2018-07-19 2018-11-27 维沃移动通信有限公司 Mobile terminal and image capturing method
CN108900751A (zh) 2018-07-19 2018-11-27 维沃移动通信有限公司 Image sensor, mobile terminal, and photographing method
TWI754809B (zh) 2019-04-11 2022-02-11 大陸商廣州立景創新科技有限公司 影像感測裝置
JP7314752B2 (ja) * 2019-09-30 2023-07-26 株式会社リコー Photoelectric conversion element, reading device, image processing device, and method of manufacturing photoelectric conversion element
CN114067636A (zh) * 2021-12-17 2022-02-18 国开启科量子技术(北京)有限公司 Quantum random number teaching device

Citations (6)

Publication number Priority date Publication date Assignee Title
CN107040724A (zh) * 2017-04-28 2017-08-11 广东欧珀移动通信有限公司 Dual-core focusing image sensor, focusing control method therefor, and imaging device
WO2017171412A2 (ko) * 2016-03-30 2017-10-05 엘지전자 주식회사 Image processing device and mobile terminal
CN107968103A (zh) * 2016-10-20 2018-04-27 昆山国显光电有限公司 Pixel structure, manufacturing method therefor, and display device
CN207354459U (zh) * 2017-10-19 2018-05-11 维沃移动通信有限公司 Camera module and mobile terminal
CN108271012A (zh) * 2017-12-29 2018-07-10 维沃移动通信有限公司 Depth information acquisition method and device, and mobile terminal
CN108900751A (zh) * 2018-07-19 2018-11-27 维沃移动通信有限公司 Image sensor, mobile terminal, and photographing method

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP6484420B2 (ja) * 2014-09-17 2019-03-13 株式会社小糸製作所 Vehicle rear lamp
US9917134B1 (en) * 2016-09-11 2018-03-13 Himax Technologies Limited Methods of fabricating an image sensor
CN106982328B (zh) * 2017-04-28 2020-01-10 Oppo广东移动通信有限公司 Dual-core focusing image sensor, focusing control method therefor, and imaging device

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
WO2017171412A2 (ko) * 2016-03-30 2017-10-05 엘지전자 주식회사 Image processing device and mobile terminal
CN107968103A (zh) * 2016-10-20 2018-04-27 昆山国显光电有限公司 Pixel structure, manufacturing method therefor, and display device
CN107040724A (zh) * 2017-04-28 2017-08-11 广东欧珀移动通信有限公司 Dual-core focusing image sensor, focusing control method therefor, and imaging device
CN207354459U (zh) * 2017-10-19 2018-05-11 维沃移动通信有限公司 Camera module and mobile terminal
CN108271012A (zh) * 2017-12-29 2018-07-10 维沃移动通信有限公司 Depth information acquisition method and device, and mobile terminal
CN108900751A (zh) * 2018-07-19 2018-11-27 维沃移动通信有限公司 Image sensor, mobile terminal, and photographing method

Non-Patent Citations (1)

Title
See also references of EP3826285A4 *

Also Published As

Publication number Publication date
CN108900751A (zh) 2018-11-27
EP3826285A4 (en) 2021-08-11
US20210144322A1 (en) 2021-05-13
EP3826285A1 (en) 2021-05-26
US11463641B2 (en) 2022-10-04

Similar Documents

Publication Publication Date Title
WO2020015627A1 (zh) Image sensor, mobile terminal, and image capturing method
WO2020015532A1 (zh) Image sensor, mobile terminal, and photographing method
CN108900750B (zh) Image sensor and mobile terminal
WO2020015560A1 (zh) Image sensor and mobile terminal
WO2020015626A1 (zh) Mobile terminal and image capturing method
WO2022143280A1 (zh) Image sensor, camera module, and electronic device
CN108965666B (zh) Mobile terminal and image capturing method
WO2019144956A1 (zh) Image sensor, lens module, mobile terminal, and face recognition method and device
CN107948505A (zh) Panoramic photographing method and mobile terminal
US11996421B2 (en) Image sensor, mobile terminal, and image capturing method
CN110913144B (zh) Image processing method and camera device
CN108965703A (zh) Image sensor, mobile terminal, and image capturing method
CN110248050B (zh) Camera module and mobile terminal
CN115278000A (zh) Image sensor, image generation method, camera module, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19838328

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2019838328

Country of ref document: EP