WO2020015626A1 - Mobile terminal and image capturing method - Google Patents

Mobile terminal and image capturing method

Info

Publication number
WO2020015626A1
WO2020015626A1 (PCT/CN2019/096126)
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
sub
module
infrared
mobile terminal
Prior art date
Application number
PCT/CN2019/096126
Other languages
English (en)
French (fr)
Inventor
王丹妹
周华昭
朱盼盼
Original Assignee
维沃移动通信有限公司
Priority date
Filing date
Publication date
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority to EP19838874.6A (published as EP3826292A4)
Publication of WO2020015626A1
Priority to US17/151,767 (published as US20240014236A9)

Classifications

    • H04N 5/2621: Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
    • H04N 5/33: Transforming infrared radiation
    • H04N 23/45: Generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N 23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N 23/67: Focus control based on electronic image sensor signals
    • H04N 23/672: Focus control based on the phase difference signals
    • H04N 23/80: Camera processing pipelines; components thereof
    • H04N 25/131: Arrangement of colour filter arrays [CFA] including elements passing infrared wavelengths
    • H04N 25/133: Arrangement of colour filter arrays [CFA] including elements passing panchromatic light, e.g. filters passing white light
    • H04N 25/134: Arrangement of colour filter arrays [CFA] based on three different wavelength filter elements
    • H04N 25/702: SSIS architectures characterised by non-identical, non-equidistant or non-planar pixel layout
    • H04N 25/704: Pixels specially adapted for focusing, e.g. phase difference pixel sets
    • H01L 27/14625: Optical elements or arrangements associated with the imager device
    • G02B 5/201: Filters in the form of arrays

Definitions

  • the present disclosure relates to the field of image processing technologies, and in particular, to a mobile terminal and an image capturing method.
  • CMOS: complementary metal oxide semiconductor
  • R: red; G: green; B: blue
  • As shown in Figures 1a and 1b, the Bayer pixel array arrangement cannot detect the distance of objects; it can only receive natural light and take pictures to record images under normal lighting.
  • The pixel array arrangement pattern of full-pixel dual-core focus (2PD) technology is shown in Figure 1c and Figure 1d.
  • This arrangement can likewise only receive natural light for taking pictures and recording images, but its phase detection autofocus (PDAF) capability lets it detect the distance of objects and complete the focusing action more quickly.
  • PDAF: phase detection autofocus
  • In the related art, the pixel array arrangement of CMOS image sensors cannot detect the distance of objects and can only accept natural light.
  • Although the pixel array arrangement of 2PD technology can detect the distance of objects, it too can only accept natural light. The image sensor pixel array arrangements in the related art therefore suffer from limited shooting scenes, slow focusing, and a degraded user shooting experience.
  • Some embodiments of the present disclosure provide a mobile terminal and an image shooting method to solve the related-art problems of limited shooting scenes, slow focusing, and a degraded user shooting experience.
  • Some embodiments of the present disclosure provide a mobile terminal including a first camera module and a second camera module adjacent to the first camera module, where the first camera module includes a first image sensor and the second camera module includes a second image sensor;
  • the first pixel array corresponding to the first image sensor includes a preset number of first pixel units arranged in a first predetermined manner, and the first pixel unit includes a first pixel and a second pixel adjacent to the first pixel position;
  • the second pixel array corresponding to the second image sensor includes a preset number of second pixel units arranged in a second predetermined manner, and the second pixel unit includes a first pixel;
  • the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel;
  • the second pixel includes a green sub-pixel, an infrared sub-pixel, and at least one of a red sub-pixel and a blue sub-pixel;
  • the first pixel and the second pixel are both all-pixel dual-core focus (2PD) pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels.
  • some embodiments of the present disclosure provide an image shooting method, which is applied to the above-mentioned mobile terminal.
  • the method includes:
  • generating a background-blurred image by triangulation ranging.
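The triangulation-ranging step above can be sketched as follows. This is a minimal illustration of the stereo-disparity principle the method relies on, not the patent's implementation; the focal length, baseline, and disparity values are invented for the example:

```python
def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for two adjacent camera modules.

    focal_px     -- focal length expressed in pixels (illustrative value)
    baseline_mm  -- distance between the two camera modules
    disparity_px -- pixel shift of the same object between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px

# Nearer objects show larger disparity and hence smaller depth.
near = depth_from_disparity(focal_px=1000.0, baseline_mm=10.0, disparity_px=50.0)  # 200.0 mm
far = depth_from_disparity(focal_px=1000.0, baseline_mm=10.0, disparity_px=5.0)    # 2000.0 mm
```

Pixels whose computed depth exceeds a chosen threshold can then be blurred to form the background-blurred image.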
  • some embodiments of the present disclosure further provide an image shooting method, which is applied to the above-mentioned mobile terminal.
  • the mobile terminal further includes an infrared transmitting module disposed on a periphery of the first camera module; the method includes:
  • emitting infrared light through the infrared transmitting module;
  • A first camera module is made using the first image sensor, and a second camera module is made using the second image sensor.
  • the two camera modules are combined to form a dual camera.
  • FIG. 1a is a schematic diagram of a conventional RGB arrangement in the related art
  • Figure 1b shows a cross-sectional view of a conventional pixel
  • FIG. 1c shows a pixel array layout of 2PD
  • Figure 1d shows a cross-sectional view of a 2PD pixel
  • FIG. 2 shows a first schematic diagram of a mobile terminal according to some embodiments of the present disclosure
  • FIG. 3a is a schematic diagram of a first camera module according to some embodiments of the present disclosure.
  • FIG. 3b shows a schematic diagram of a second camera module according to some embodiments of the present disclosure
  • FIG. 4a shows a first schematic diagram of a first pixel unit in some embodiments of the present disclosure
  • FIG. 4b shows a second schematic diagram of a first pixel unit in some embodiments of the present disclosure
  • FIG. 4c shows a third schematic diagram of a first pixel unit in some embodiments of the present disclosure
  • FIG. 4d shows a fourth schematic diagram of a first pixel unit in some embodiments of the present disclosure
  • FIG. 4e shows a fifth schematic diagram of a first pixel unit in some embodiments of the present disclosure
  • FIG. 5a shows a sixth schematic diagram of a first pixel unit according to some embodiments of the present disclosure
  • FIG. 5b shows a seventh schematic diagram of a first pixel unit in some embodiments of the present disclosure
  • FIG. 5c shows an eighth schematic diagram of a first pixel unit in some embodiments of the present disclosure
  • FIG. 6 shows a pixel cross-sectional view of some embodiments of the present disclosure
  • FIG. 7 shows a second schematic diagram of a mobile terminal according to some embodiments of the present disclosure.
  • FIG. 8 is a schematic diagram illustrating a connection between a first camera module and an image processor according to some embodiments of the present disclosure
  • FIG. 9 is a schematic diagram illustrating a connection between a second camera module and an image processor according to some embodiments of the present disclosure.
  • FIG. 10 shows a first schematic diagram of an image capturing method according to some embodiments of the present disclosure
  • FIG. 11 shows a second schematic diagram of an image capturing method according to some embodiments of the present disclosure.
  • FIG. 12 is a schematic diagram of a hardware structure of a mobile terminal according to some embodiments of the present disclosure.
  • The mobile terminal 1 includes a first camera module 11 and a second camera module 12 adjacent to the first camera module 11; the first camera module 11 includes a first image sensor 111, and the second camera module 12 includes a second image sensor 121;
  • the first pixel array corresponding to the first image sensor 111 includes a preset number of first pixel units arranged in a first predetermined manner, and the first pixel unit includes a first pixel and a second pixel adjacent to the first pixel position;
  • the second pixel array corresponding to the second image sensor 121 includes a preset number of second pixel units arranged in a second predetermined manner, and the second pixel unit includes a first pixel;
  • the first pixel includes a red subpixel, a green subpixel, and a blue subpixel
  • the second pixel includes a green sub-pixel, an infrared sub-pixel, and at least one of a red sub-pixel and a blue sub-pixel;
  • the first pixel and the second pixel are both all-pixel dual-core focus (2PD) pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels.
  • The mobile terminal 1 provided by some embodiments of the present disclosure includes a first camera module 11 and a second camera module 12, where the first camera module 11 includes a first image sensor 111 and the second camera module 12 includes a second image sensor 121.
  • the first image sensor 111 corresponds to a first pixel array
  • the second image sensor 121 corresponds to a second pixel array.
  • the first camera module 11 and the second camera module 12 are adjacent to each other.
  • the first pixel array includes a preset number of first pixel units, wherein the preset number of first pixel units are arranged in a first predetermined manner, and each of the preset number of first pixel units includes a first pixel and a second pixel.
  • the second pixel array includes a preset number of second pixel units, wherein the preset number of second pixel units are arranged in a second predetermined manner.
  • the preset number of second pixel units each include a first pixel.
  • The sub-pixel composition of the first pixel differs from that of the second pixel.
  • The first pixel includes a red sub-pixel (R), a green sub-pixel (G), and a blue sub-pixel (B).
  • The second pixel includes a green sub-pixel, an infrared sub-pixel (IR), and at least one of a red sub-pixel and a blue sub-pixel.
  • the first pixel and the second pixel of some embodiments of the present disclosure are all-pixel dual-core focus (2PD) pixels.
  • the 2PD pixel can detect the object distance and complete the focusing action more quickly.
  • Both the first pixel and the second pixel here are 2PD pixels; that is, the sub-pixels in the first pixel and the second pixel are 2PD sub-pixels.
  • the first camera module 11 and the second camera module 12 can quickly complete the focusing process.
  • The red, green, and blue sub-pixels in the first pixel are arranged in a certain manner. The first pixel includes four full-pixel dual-core focus sub-pixels: one red sub-pixel, one blue sub-pixel, and two green sub-pixels, referred to here as a first green sub-pixel and a second green sub-pixel for convenience of distinction, where the first green sub-pixel is identical to the second green sub-pixel.
  • The red sub-pixel is adjacent to the first green sub-pixel, the second green sub-pixel is located below the red sub-pixel, the blue sub-pixel is located below the first green sub-pixel, and the second green sub-pixel is adjacent to the blue sub-pixel.
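The 2x2 arrangement described above can be written down as a small table. The names `FIRST_PIXEL`, `G1`, and `G2` are illustrative labels, not terms from the disclosure:

```python
# 2x2 sub-pixel layout of the first pixel as described:
# R sits next to the first green sub-pixel (G1) on the top row,
# the second green sub-pixel (G2) is below R, and B is below G1.
FIRST_PIXEL = [
    ["R", "G1"],
    ["G2", "B"],
]

def photodiode_count(pixel) -> int:
    """Each 2PD sub-pixel is split into a left and a right photodiode,
    so a pixel of four 2PD sub-pixels carries eight photodiodes."""
    return sum(2 for row in pixel for _ in row)
```

The left/right photodiode pair in each 2PD sub-pixel is what provides the phase-difference signal used for focusing.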
  • The second pixel includes four full-pixel dual-core focus sub-pixels: a green sub-pixel, an infrared sub-pixel, and at least one of a red sub-pixel and a blue sub-pixel. That is, the second pixel may include three or four kinds of sub-pixels.
  • When the second pixel includes three kinds of sub-pixels, it may include a red sub-pixel, two green sub-pixels, and an infrared sub-pixel, or it may include a blue sub-pixel, two green sub-pixels, and an infrared sub-pixel.
  • When the second pixel includes four kinds of sub-pixels, it may include a red sub-pixel, a blue sub-pixel, a green sub-pixel, and an infrared sub-pixel.
  • The mobile terminal can thus capture images while receiving infrared light, achieving imaging in the dark to ensure the user's shooting experience, and the 2PD pixels enable fast focusing.
  • The image sensor of some embodiments of the present disclosure can cooperate with the infrared transmitting module to realize application functions related to stereo photography, and the two camera modules can implement the background blur function according to the principle of triangular ranging, ensuring the user's shooting experience and enhancing the functionality of the mobile terminal.
  • Optionally, the position of the infrared sub-pixel in the second pixel is the same as the position of the red, green, or blue sub-pixel in the first pixel;
  • or the position of the infrared sub-pixel in the second pixel is the same as the position of the first combined sub-pixel in the first pixel, or the position of the second combined sub-pixel in the first pixel;
  • where the first combined sub-pixel is a combination of a mutually adjacent 1/2 red sub-pixel and 1/2 green sub-pixel, and the second combined sub-pixel is a combination of a mutually adjacent 1/2 green sub-pixel and 1/2 blue sub-pixel.
  • When the position of the infrared sub-pixel in the second pixel is the same as the position of the red sub-pixel in the first pixel, the second pixel includes a blue sub-pixel, two green sub-pixels, and an infrared sub-pixel; on the basis of the first pixel, the red sub-pixel is replaced with an infrared sub-pixel. When the position of the infrared sub-pixel in the second pixel is the same as the position of the blue sub-pixel in the first pixel, the second pixel includes a red sub-pixel, two green sub-pixels, and an infrared sub-pixel; on the basis of the first pixel, the blue sub-pixel is replaced with an infrared sub-pixel.
  • When the position of the infrared sub-pixel in the second pixel is the same as that of a green sub-pixel in the first pixel, the second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and an infrared sub-pixel; on the basis of the first pixel, one of the green sub-pixels is replaced with an infrared sub-pixel.
  • When the position of the infrared sub-pixel in the second pixel is the same as the position of the first combined sub-pixel in the first pixel, the second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and an infrared sub-pixel; on the basis of the first pixel, the mutually adjacent 1/2 red sub-pixel and 1/2 green sub-pixel at that 2PD sub-pixel position are taken as the infrared sub-pixel.
  • When the position of the infrared sub-pixel in the second pixel is the same as the position of the second combined sub-pixel in the first pixel, the second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and an infrared sub-pixel; on the basis of the first pixel, the mutually adjacent 1/2 blue sub-pixel and 1/2 green sub-pixel at that 2PD sub-pixel position are taken as the infrared sub-pixel.
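The whole-sub-pixel variants above amount to swapping one sub-pixel position of the first pixel for IR. A hypothetical sketch (the `BASE` coordinate map and function name are assumptions for illustration, not from the disclosure):

```python
# Illustrative coordinate map of the first pixel's sub-pixels:
# (row, column) within the 2x2 layout.
BASE = {"R": (0, 0), "G1": (0, 1), "G2": (1, 0), "B": (1, 1)}

def second_pixel(replaced: str) -> dict:
    """Return a second-pixel layout with `replaced` swapped for IR.

    Replacing R or B yields a three-kind second pixel (two greens remain);
    replacing one green yields a four-kind second pixel (R, G, B, IR).
    """
    if replaced not in BASE:
        raise ValueError(f"unknown sub-pixel: {replaced}")
    layout = dict(BASE)
    layout["IR"] = layout.pop(replaced)  # IR takes the replaced position
    return layout
```

For example, `second_pixel("R")` leaves a blue sub-pixel, two green sub-pixels, and an IR sub-pixel, matching the first variant described above.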
  • the first pixel unit includes a second pixel and at least one first pixel.
  • the first pixel unit includes a second pixel and at least one first pixel, and the number of pixels in the first pixel unit is at least two.
  • When the number of pixels in the first pixel unit is two, i.e. one first pixel and one second pixel, the point density of the infrared sub-pixels in the first pixel unit is 1/8.
  • As shown in FIG. 4a, the first pixel unit includes a first pixel and a second pixel, and the second pixel includes a red sub-pixel, two green sub-pixels, and an infrared sub-pixel.
  • the first pixel unit includes a first pixel and a second pixel
  • the second pixel includes a red subpixel, a green subpixel, a blue subpixel, and an infrared subpixel.
  • the ratio of the infrared sub-pixels in the first pixel unit is 1/8, that is, the point density of the infrared sub-pixels is 1/8.
  • the density of the infrared sub-pixels in the first pixel unit is 1/12.
  • a first pixel unit includes two first pixels and a second pixel
  • the second pixel includes a blue subpixel, two green subpixels, and an infrared subpixel.
  • The proportion of the infrared sub-pixels in the first pixel unit is 1/12; that is, the point density of the infrared sub-pixels is 1/12.
  • the density of the infrared sub-pixels in the first pixel unit is 1/16.
  • the first pixel unit includes three first pixels and one second pixel, and the second pixel includes a blue sub-pixel, a green sub-pixel, a red sub-pixel, and an infrared sub-pixel.
  • The mutually adjacent 1/2 red sub-pixel and 1/2 green sub-pixel at a 2PD sub-pixel position are taken together as the infrared sub-pixel.
  • the first pixel unit includes three first pixels and one second pixel.
  • the second pixel includes a blue sub-pixel, a green sub-pixel, a red sub-pixel, and an infrared sub-pixel.
  • The mutually adjacent 1/2 blue sub-pixel and 1/2 green sub-pixel at a 2PD sub-pixel position are taken together as the infrared sub-pixel.
  • the ratio of the infrared sub-pixels in the first pixel unit is 1/16, that is, the point density of the infrared sub-pixels is 1/16.
  • The several infrared sub-pixel placement manners corresponding to FIG. 4a to FIG. 4e above are only used for illustration; other placements are possible, and the corresponding multiple implementations are not described here one by one.
  • the position of the infrared sub-pixel in the first pixel unit (the position of the second pixel) is not limited in some embodiments of the present disclosure.
  • The density of the infrared sub-pixels in the first pixel unit is 1/(4n), where n is an integer greater than or equal to 2, and the size of the first pixel array to which the infrared sub-pixels are applicable is not limited.
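The quoted densities follow from the fact that each 2PD pixel carries four sub-pixels, so a unit of n pixels with one full infrared sub-pixel has density 1/(4n). A small sketch of that relation (function name is illustrative):

```python
def ir_point_density(n_pixels: int, ir_subpixels: int = 1) -> float:
    """Point density of IR sub-pixels in a first pixel unit.

    Each 2PD pixel carries 4 sub-pixels, so a unit of n_pixels pixels
    holding ir_subpixels full infrared sub-pixels has density
    ir_subpixels / (4 * n_pixels).
    """
    if n_pixels < 2:
        raise ValueError("a first pixel unit holds at least two pixels")
    return ir_subpixels / (4 * n_pixels)
```

With units of two, three, and four pixels this reproduces the 1/8, 1/12, and 1/16 densities described for FIG. 4a to FIG. 4e.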
  • Optionally, the position of the 1/2 infrared sub-pixel in the second pixel is the same as the position of a 1/2 color sub-pixel in the first pixel, and the 1/2 infrared sub-pixels in two adjacent second pixels together constitute a complete infrared sub-pixel.
  • That is, the second pixel may include only a 1/2 infrared sub-pixel, and a complete infrared sub-pixel is obtained by combining two second pixels.
  • The position of the 1/2 infrared sub-pixel in the second pixel may be the same as the position of the 1/2 red sub-pixel in the first pixel, the same as the position of the 1/2 green sub-pixel in the first pixel, or the same as the position of the 1/2 blue sub-pixel in the first pixel.
  • If the position of the 1/2 infrared sub-pixel in one second pixel is the same as the position of the 1/2 red sub-pixel in the first pixel, then the position of the 1/2 infrared sub-pixel in the other second pixel is the same as the position of the 1/2 green sub-pixel in the first pixel.
  • If the position of the 1/2 infrared sub-pixel in one second pixel is the same as the position of the 1/2 green sub-pixel in the first pixel, then the position of the 1/2 infrared sub-pixel in the other second pixel is the same as the position of the 1/2 blue sub-pixel or the 1/2 red sub-pixel in the first pixel.
  • the number of the second pixels is two, and the number of the first pixels is greater than or equal to zero.
  • the first pixel unit includes two second pixels and first pixels whose number is greater than or equal to zero.
  • the point density of the infrared sub-pixels in the first pixel unit is 1/8.
  • The first pixel unit includes two second pixels, and each second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and a 1/2 infrared sub-pixel; the position of the 1/2 infrared sub-pixel in one second pixel is the same as the position of the 1/2 green sub-pixel in the first pixel, and the position of the 1/2 infrared sub-pixel in the other second pixel is the same as the position of the 1/2 blue sub-pixel in the first pixel.
  • the ratio of the infrared sub-pixels in the first pixel unit is 1/8, that is, the point density of the infrared sub-pixels is 1/8.
  • the point density of the infrared sub-pixels in the first pixel unit is 1/12.
  • The first pixel unit includes two second pixels and one first pixel, where each second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and a 1/2 infrared sub-pixel; when the position of the 1/2 infrared sub-pixel in one second pixel is the same as the position of the 1/2 red sub-pixel in the first pixel, the position of the 1/2 infrared sub-pixel in the other second pixel is the same as the position of the 1/2 green sub-pixel in the first pixel.
  • the ratio of the infrared sub-pixels in the first pixel unit is 1/12, that is, the point density of the infrared sub-pixels is 1/12.
  • the point density of the infrared sub-pixels in the first pixel unit is 1/16.
  • The first pixel unit includes two second pixels and two first pixels, where each second pixel includes a red sub-pixel, a green sub-pixel, a blue sub-pixel, and a 1/2 infrared sub-pixel; when the position of the 1/2 infrared sub-pixel in one second pixel is the same as the position of the 1/2 green sub-pixel in the first pixel, the position of the 1/2 infrared sub-pixel in the other second pixel is the same as the position of the 1/2 red sub-pixel in the first pixel.
  • the ratio of the infrared sub-pixels in the first pixel unit is 1/16, that is, the point density of the infrared sub-pixels is 1/16.
  • The first pixel array may take an RGB+IR pixel unit at 1/8 density, an RGB+IR pixel unit at 1/12 density, or an RGB+IR pixel unit at 1/16 density as its pixel unit, and is then formed by arranging such pixel units in a periodic array.
  • the first pixel array can also be in other forms, which will not be enumerated here.
  • FIG. 5a to FIG. 5c are only a few corresponding implementation manners, and can be modified based on this, which will not be described one by one here.
  • The density of the infrared sub-pixels in the first pixel unit is 1/(4n), where n is an integer greater than or equal to 2, and the size of the first pixel array to which the infrared sub-pixels are applicable is not limited.
  • The red sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a red filter, and a microlens, which are sequentially stacked; the green sub-pixel includes a sequentially stacked semiconductor layer, metal layer, photodiode, green filter, and microlens; the blue sub-pixel includes a sequentially stacked semiconductor layer, metal layer, photodiode, blue filter, and microlens; the infrared sub-pixel includes a sequentially stacked semiconductor layer, metal layer, photodiode, infrared filter, and microlens.
  • The semiconductor layer, metal layer, photodiode, red filter, and microlens included in the red sub-pixel are arranged in order from bottom to top.
  • The semiconductor layer, metal layer, photodiode, green filter, and microlens included in the green sub-pixel are arranged in order from bottom to top.
  • The semiconductor layer, metal layer, photodiode, blue filter, and microlens included in the blue sub-pixel are arranged in order from bottom to top.
  • The semiconductor layer, metal layer, photodiode, infrared filter, and microlens included in the infrared sub-pixel are arranged in order from bottom to top.
  • the semiconductor layer here may be a silicon substrate, but is not limited thereto.
  • The structure of the red, green, blue, and infrared sub-pixels can be seen in FIG. 6. Although only the blue and infrared sub-pixels are shown in FIG. 6, the structures of the red and green sub-pixels can be obtained on this basis.
  • the blue filter can be replaced with a red or green filter to obtain a red sub-pixel or green sub-pixel structure.
  • The red, green, and blue sub-pixels are used to obtain the color information of the pixels of the composite image and block the entry of infrared rays; for example, only visible light with a wavelength of 380 to 700 nm is allowed to enter, so a complete and realistic image can be generated directly under high illumination.
  • The infrared wavelength is 750 to 1100 nm.
  • The infrared filter region passes the infrared band, which can improve the imaging effect in the dark state and realize the infrared ranging function.
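The filter behavior described above can be checked numerically. The band edges below are the values quoted in the text (380 to 700 nm visible, 750 to 1100 nm infrared), not measured filter specifications:

```python
# Passbands as quoted in the description (nanometres).
VISIBLE_BAND = (380, 700)    # passed by the R/G/B filters
INFRARED_BAND = (750, 1100)  # passed by the IR filter

def passes(band: tuple, wavelength_nm: float) -> bool:
    """True if the filter band admits light of the given wavelength."""
    lo, hi = band
    return lo <= wavelength_nm <= hi
```

So green light at 550 nm reaches the RGB sub-pixels, while 850 nm infrared is blocked by the RGB filters but admitted by the IR filter, which is what enables dark-state imaging and infrared ranging.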
  • the RGB sub-pixel point is a light receiving element corresponding to wavelength light of each RGB color
  • the IR sub-pixel point is a light receiving element corresponding to infrared light.
  • The first image sensor and the second image sensor are each a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, or a quantum thin-film image sensor.
  • the RGB-IR pixel array arrangement of the present disclosure is not limited to a particular image sensor type: it can be a CMOS-based image sensor, a charge-coupled device (CCD)-based image sensor, or a quantum-film image sensor, and of course it can also be another type of image sensor. Moreover, the image sensor of some embodiments of the present disclosure can be applied to any electronic product that includes a camera module.
  • the first camera module 11 and the second camera module 12 are connected through a synchronization module 14.
  • the mobile terminal 1 further includes: an infrared transmitting module 15 disposed on the periphery of the first camera module 11; a power supply module 16 connected to the first camera module 11, the second camera module 12, and the infrared transmitting module 15; an image processor 17 connected to the first camera module 11 and the second camera module 12, where the image processor 17 and the power supply module 16 are both integrated on the main board of the mobile terminal, and the infrared transmitting module 15 is connected to the main board of the mobile terminal; and a display module 18 connected to the image processor 17.
  • a synchronization module 14 is connected between the first camera module 11 and the second camera module 12; by providing the synchronization module 14, the mobile terminal 1 can be controlled to achieve frame-synchronized data output from the first camera module 11 and the second camera module 12.
  • An infrared transmitting module 15 is provided on the periphery of the first camera module 11.
  • the first camera module 11, the second camera module 12, and the infrared transmitting module 15 are all connected to the power supply module 16, which supplies the power they need to work.
  • the first camera module 11 and the second camera module 12 are both connected to the image processor 17, and are connected to the display module 18 through the image processor 17. After light is focused onto the first camera module 11 and the second camera module 12, the data can be transmitted to the image processor 17, which processes the data and presents it as a picture on the display module 18.
  • the infrared transmitting module 15 is connected to the main board of the mobile terminal, so the mobile terminal can obtain the time at which the infrared transmitting module 15 emits infrared light; since the image processor 17 is integrated on the main board, the mobile terminal can also obtain the time at which the first camera module 11 receives the infrared light.
  • the first camera module 11 further includes: a first lens module 112; a first driving module 113 for driving the first lens module 112 to move; and a first filter module 114 disposed between the first lens module 112 and the first image sensor 111, where the first filter module 114 can pass light with wavelengths from 380 nm to 1100 nm.
  • the first lens module 112 is used to focus light.
  • the first lens module 112 is connected to the first driving module 113.
  • the first driving module 113 is used to adjust the position of the first lens module 112 according to the distance of the object to be photographed.
  • a first filter module 114 is provided between the first lens module 112 and the first image sensor 111; light is focused by the first lens module 112 and, after passing through the first filter module 114, can be focused on the pixel array of the first image sensor 111.
  • the first image sensor 111 is connected to the image processor 17.
  • the first filtering module 114 in some embodiments of the present disclosure is a dual-pass filtering module that passes both natural light and infrared light. After the light is focused by the first lens module 112, it is filtered by the first filtering module 114, which passes natural light and infrared light to ensure the imaging effect.
  • the infrared emitting module 15 may be disposed on a periphery of the first lens module 112.
  • the infrared transmitting module 15 emits infrared rays, which are reflected when they encounter an obstacle. After the reflected infrared rays are captured, photoelectric conversion is performed through the infrared sub-pixels to obtain the time difference between the emission and the reception of the infrared rays. Since the propagation speed of light is fixed, the distance between the obstacle and the mobile terminal can be calculated. Finally, the distance from each minimal unit on the obstacle to the mobile terminal can be obtained, achieving the stereo imaging and recording function of the first camera module.
  • for the method of obtaining the phase difference of the infrared light and the distance between each infrared light reflection point on the obstacle and the mobile terminal, reference may be made to Time of Flight (TOF) technology, which is not described in detail here.
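The round-trip timing calculation described above can be sketched in a few lines. This is an illustrative sketch only, not code from the disclosure; the function name and the 10 ns example value are our own assumptions.

```python
# Speed of light in vacuum, in metres per second.
SPEED_OF_LIGHT = 299_792_458.0

def tof_distance(emit_time_s: float, receive_time_s: float) -> float:
    """Distance to a reflection point from the round-trip time of an
    infrared pulse: the pulse travels to the obstacle and back, so the
    one-way distance is half of (time difference x propagation speed)."""
    round_trip = receive_time_s - emit_time_s
    if round_trip < 0:
        raise ValueError("receive time must not precede emit time")
    return round_trip * SPEED_OF_LIGHT / 2.0

# A pulse received 10 nanoseconds after emission corresponds to ~1.5 m.
d = tof_distance(0.0, 10e-9)
```

Because each reflection point is received at a slightly different time, applying this per-point calculation over the infrared sub-pixel readings yields the per-point distance map that the stereo-imaging function relies on.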
  • the second camera module 12 further includes: a second lens module 122; a second driving module 123 for driving the second lens module 122 to move; and a second filter module 124 disposed between the second lens module 122 and the second image sensor 121, where the second filter module 124 can pass light with wavelengths from 380 nm to 700 nm.
  • the second lens module 122 is also used to focus the light.
  • the second lens module 122 is connected to the second driving module 123, and the second driving module 123 is used to adjust the position of the second lens module 122 according to the distance of the object to be photographed.
  • a second filter module 124 is provided between the second lens module 122 and the second image sensor 121; light is focused by the second lens module 122 and, after passing through the second filter module 124, can be focused on the pixel array of the second image sensor 121.
  • the second image sensor 121 is connected to the image processor 17.
  • the second filtering module 124 in some embodiments of the present disclosure passes natural light only; after the light is focused by the second lens module 122, it is filtered by the second filtering module 124.
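The difference between the two filter modules (380–1100 nm for the dual-pass first filter, 380–700 nm for the visible-only second filter) can be expressed as simple predicates. This is illustrative only; the helper names are ours.

```python
def passes_first_filter(wavelength_nm: float) -> bool:
    """First (dual-pass) filter module: visible light plus infrared."""
    return 380.0 <= wavelength_nm <= 1100.0

def passes_second_filter(wavelength_nm: float) -> bool:
    """Second filter module: visible (natural) light only."""
    return 380.0 <= wavelength_nm <= 700.0

# 850 nm infrared light reaches the first image sensor but not the second,
# while 550 nm green light reaches both.
```

This is why only the first camera module, whose sensor contains infrared sub-pixels, participates in the infrared ranging described earlier.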
  • the 2PD pixels in the first image sensor 111 and the second image sensor 121 can be used to obtain the phase difference, thereby obtaining the distance between the object and the imaging surface. This enables fast focusing.
  • in the mobile terminal, a first camera module is formed using the first image sensor and a second camera module is formed using the second image sensor; the two camera modules are combined to form a dual camera.
  • This combination not only ensures fast focusing, but also allows infrared light to be used to detect the distance between the mobile terminal and the object to be photographed, improving the imaging effect, realizing stereo-photography application functions, and also realizing the background blur function, thereby diversifying the functions of the mobile terminal, improving the user experience, and meeting user needs.
  • Some embodiments of the present disclosure also provide an image shooting method, which is applied to the above mobile terminal. As shown in FIG. 10, the method includes:
  • Step 1001: obtain depth-of-field information through the first camera module and the second camera module.
  • the first camera module and the second camera module can determine depth-of-field information, that is, the range of distances in front of and behind the object to be photographed within which imaging remains sharp.
  • Step 1002: obtain, according to the depth-of-field information, the first image data collected by the first camera module and the second image data collected by the second camera module.
  • the first image data and the second image data are the same frame data.
  • the first image data and the second image data of the same frame can be obtained according to the depth-of-field information, after which step 1003 may be performed.
  • Step 1003: generate a background-blurred image through triangulation ranging according to the first image data and the second image data.
  • the same-frame data output by the two camera modules can be processed in combination with the triangulation ranging principle to obtain the background-blurred image.
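The disclosure does not spell out the triangulation formula, but the standard stereo relation depth = focal length × baseline / disparity underlies this kind of dual-camera background blur. The sketch below is a hedged illustration under that standard formulation; all names and numeric values are our own assumptions, not from the patent.

```python
def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Stereo triangulation: depth = f * B / d, where f is the focal
    length in pixels, B the distance between the two camera modules,
    and d the per-pixel disparity between the same-frame images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

def blur_radius(depth_m: float, focus_depth_m: float,
                strength: float = 5.0) -> float:
    """Toy blur policy: pixels farther from the focal plane receive a
    larger blur radius, producing the background-blur effect."""
    return strength * abs(depth_m - focus_depth_m) / focus_depth_m

# With f = 1000 px and a 2 cm baseline, a 10 px disparity gives 2 m depth.
z = depth_from_disparity(1000.0, 0.02, 10.0)
```

In practice the per-pixel disparity would come from matching the synchronized same-frame outputs of the two camera modules; the frame synchronization provided by the synchronization module 14 is what makes that matching valid.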
  • the image shooting method of some embodiments of the present disclosure may use a first image sensor to form a first camera module and a second image sensor to form a second camera module; the two camera modules are combined to form a dual camera with synchronized data output. This combination can achieve the background blur function, diversify the functions of the mobile terminal, improve the user experience, and meet user needs.
  • Some embodiments of the present disclosure also provide an image shooting method, which is applied to the above-mentioned mobile terminal, where the mobile terminal includes an infrared transmitting module disposed on the periphery of the first camera module; as shown in FIG. 11, the method includes:
  • Step 1101: emit infrared light through the infrared transmitting module.
  • the infrared transmitting module on the mobile terminal can emit infrared rays.
  • the infrared rays will be reflected after encountering the object to be photographed, and the reflected infrared light will be received by the first camera module of the mobile terminal.
  • the first image sensor of the first camera module forms an RGB-IR pixel array, so photoelectric conversion can be performed through infrared sub-pixels.
  • Step 1102: obtain, according to the infrared light reflected by the object to be photographed, the distance between each infrared light reflection point on the object and the first camera module.
  • when the distance between the object to be photographed and the first camera module is acquired, what is actually acquired is the distance between the object and the imaging surface. After the first camera module captures the reflected infrared light, photoelectric conversion is performed through the infrared sub-pixels to obtain the time difference between infrared emission and reception. Because the propagation speed of light is fixed, multiplying the time difference by the propagation speed gives the distance of the obstacle from the first camera module. The times at which the first camera module receives light from different infrared reflection points differ, so a distance can be calculated for each infrared light reflection point, thereby obtaining the distance between each infrared light reflection point and the first camera module.
  • the distance between each infrared light reflection point and the first camera module can also be obtained by acquiring the phase difference of the infrared light.
  • Step 1103 Obtain stereo information according to the distance between each infrared light reflection point on the object to be photographed and the first camera module.
  • the image capturing method of some embodiments of the present disclosure may use a first image sensor to form a first camera module which, in cooperation with an infrared transmitting module, uses infrared light to detect the distance between the first camera module and the object to be photographed, improving the imaging effect and realizing stereo-photography application functions.
  • FIG. 12 is a schematic diagram of a hardware structure of a mobile terminal that implements various embodiments of the present disclosure.
  • the mobile terminal 1200 includes, but is not limited to, a radio frequency unit 1201, a network module 1202, an audio output unit 1203, an input unit 1204, a sensor 1205, a display unit 1206, a user input unit 1207, an interface unit 1208, a memory 1209, a processor 1210, and a power supply 1211.
  • the mobile terminal 1200 further includes a first camera module and a second camera module adjacent to the first camera module.
  • the first camera module includes a first image sensor and the second camera module includes a second image sensor.
  • the first pixel array corresponding to the first image sensor includes a preset number of first pixel units arranged in a first predetermined manner, and the first pixel unit includes a first pixel and a second pixel adjacent to the first pixel position;
  • the second pixel array corresponding to the second image sensor includes a preset number of second pixel units arranged in a second predetermined manner, and the second pixel unit includes a first pixel;
  • the first pixel includes a red subpixel, a green subpixel, and a blue subpixel
  • the second pixel includes at least one of a red subpixel and a blue subpixel, and a green subpixel and an infrared subpixel
  • both the first pixel and the second pixel are full-pixel dual-core focus pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels.
  • the position of the infrared sub-pixel in the second pixel is the same as the position of the red, green, or blue sub-pixel in the first pixel; or
  • the position of the infrared sub-pixel in the second pixel is the same as the position of the first combined sub-pixel in the first pixel, or the position of the second combined sub-pixel in the first pixel;
  • the first combined sub-pixel is a combination of a 1/2 red sub-pixel and a 1/2 green sub-pixel that are adjacent in position; the second combined sub-pixel is a combination of a 1/2 green sub-pixel and a 1/2 blue sub-pixel that are adjacent in position.
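As an illustration of the first arrangement above (the case where the infrared sub-pixel takes the position of a red, green, or blue sub-pixel in the Bayer-ordered first pixel), a second pixel can be modelled by replacing one sub-pixel of the first pixel with IR. This data layout is a sketch of ours, not a structure defined by the disclosure.

```python
# A first pixel is a 2x2 block of sub-pixels in Bayer order: R G / G B.
FIRST_PIXEL = (("R", "G"),
               ("G", "B"))

def make_second_pixel(replace: str = "R"):
    """Build a second pixel by replacing the first sub-pixel named by
    `replace` with an infrared (IR) sub-pixel, keeping the other three
    sub-pixels of the Bayer block unchanged."""
    replaced = False
    rows = []
    for row in FIRST_PIXEL:
        new_row = []
        for sp in row:
            if sp == replace and not replaced:
                new_row.append("IR")
                replaced = True
            else:
                new_row.append(sp)
        rows.append(tuple(new_row))
    return tuple(rows)

second = make_second_pixel("R")   # IR takes the red position
```

Replacing "R", "G", or "B" reproduces the three single-sub-pixel variants described above; the combined-sub-pixel variants additionally split two adjacent 2PD half sub-pixels, which this simple model does not capture.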
  • the first pixel unit includes a second pixel and at least one first pixel.
  • the position of the 1/2 infrared sub-pixel in the second pixel is the same as the position of the 1/2 red sub-pixel, 1/2 green sub-pixel, or 1/2 blue sub-pixel in the first pixel, and the 1/2 infrared sub-pixels in two adjacent second pixels constitute one infrared sub-pixel.
  • the number of the second pixels is two, and the number of the first pixels is greater than or equal to zero.
  • the red sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a red filter, and a micromirror that are sequentially stacked; the green sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a green filter, and a micromirror that are sequentially stacked; the blue sub-pixel includes a semiconductor layer, a metal layer, a photodiode, a blue filter, and a micromirror that are sequentially stacked; and the infrared sub-pixel includes a semiconductor layer, a metal layer, a photodiode, an infrared filter, and a micromirror that are sequentially stacked.
  • the first camera module and the second camera module are connected through a synchronization module.
  • the mobile terminal also includes:
  • An infrared emission module disposed on the periphery of the first camera module
  • a power supply module connected to the first camera module, the second camera module and the infrared transmitting module
  • An image processor connected to the first camera module and the second camera module, the image processor and the power supply module are all integrated on the motherboard of the mobile terminal, and the infrared transmitting module is connected to the motherboard of the mobile terminal;
  • a display module connected to the image processor.
  • the first camera module further includes:
  • a first driving module for driving the first lens module to move
  • a first filter module disposed between the first lens module and the first image sensor.
  • the first filter module can pass light wavelengths from 380 nm to 1100 nm.
  • the second camera module further includes:
  • a second driving module for driving the second lens module to move
  • a second filter module disposed between the second lens module and the second image sensor.
  • the second filter module can pass light wavelengths from 380 nm to 700 nm.
  • the first image sensor and the second image sensor are each a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, or a quantum thin-film image sensor.
  • the structure of the mobile terminal shown in FIG. 12 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than shown in the figure, combine some components, or have a different component layout.
  • the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a car terminal, a wearable device, a pedometer, and the like.
  • the processor 1210 is configured to: obtain depth-of-field information through the first camera module and the second camera module; obtain, according to the depth-of-field information, the first image data collected by the first camera module and the second image data collected by the second camera module, where the first image data and the second image data are the same frame of data; and generate a background-blurred image through triangulation ranging according to the first image data and the second image data.
  • the processor 1210 is further configured to: emit infrared light through the infrared transmitting module; acquire, according to the infrared light reflected by the object to be photographed, the distance between each infrared light reflection point on the object and the first camera module; and obtain stereo information of the object to be photographed according to the distance between each infrared light reflection point and the first camera module.
  • this combination not only ensures fast focusing, but also allows infrared light to be used to detect the distance between the mobile terminal and the object to be photographed, improving the imaging effect, realizing stereo-photography application functions, and also realizing the background blur function, thereby diversifying the functions of the mobile terminal, improving the user experience, and meeting user needs.
  • the radio frequency unit 1201 may be used to receive and send signals during the transmission and reception of information or during a call; specifically, it delivers downlink data received from the base station to the processor 1210 for processing, and sends uplink data to the base station.
  • the radio frequency unit 1201 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 1201 can also communicate with a network and other devices through a wireless communication system.
  • the mobile terminal provides users with wireless broadband Internet access through the network module 1202, such as helping users to send and receive email, browse web pages, and access streaming media.
  • the audio output unit 1203 may convert audio data received by the radio frequency unit 1201 or the network module 1202 or stored in the memory 1209 into audio signals and output them as sound. Also, the audio output unit 1203 may also provide audio output (for example, call signal reception sound, message reception sound, etc.) related to a specific function performed by the mobile terminal 1200.
  • the audio output unit 1203 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 1204 is used to receive audio or video signals.
  • the input unit 1204 may include a graphics processing unit (GPU) 12041 and a microphone 12042.
  • the graphics processor 12041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 1206, and the display unit here is the above-mentioned display module.
  • the image frames processed by the graphics processor 12041 may be stored in the memory 1209 (or other storage medium) or transmitted via the radio frequency unit 1201 or the network module 1202.
  • the graphics processor 12041 is the above-mentioned image data processing module.
  • the microphone 12042 can receive sound, and can process such sound into audio data.
  • in the telephone call mode, the processed audio data can be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 1201 and output.
  • the mobile terminal 1200 further includes at least one sensor 1205, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 12061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 12061 and/or the backlight when the mobile terminal 1200 moves close to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (generally three axes), and can detect the magnitude and direction of gravity when stationary, which can be used to identify mobile terminal attitudes (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and vibration-recognition-related functions (such as pedometer and tapping); the sensor 1205 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not repeated here.
  • the display unit 1206 is configured to display information input by the user or information provided to the user.
  • the display unit 1206 may include a display panel 12061.
  • the display panel 12061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 1207 may be used to receive inputted numeric or character information, and generate key signal inputs related to user settings and function control of the mobile terminal.
  • the user input unit 1207 includes a touch panel 12071 and other input devices 12072.
  • the touch panel 12071, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 12071 using a finger, a stylus, or any suitable object or accessory).
  • the touch panel 12071 may include two parts, a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 1210, and receives and executes commands sent by the processor 1210.
  • the touch panel 12071 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 1207 may further include other input devices 12072.
  • other input devices 12072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, and details are not described herein again.
  • the touch panel 12071 may be overlaid on the display panel 12061.
  • when the touch panel 12071 detects a touch operation on or near it, the operation is transmitted to the processor 1210 to determine the type of the touch event, and the processor 1210 then provides a corresponding visual output on the display panel 12061 according to the type of the touch event.
  • although the touch panel 12071 and the display panel 12061 are implemented here as two independent components to implement the input and output functions of the mobile terminal, in some embodiments the touch panel 12071 and the display panel 12061 may be integrated to implement the input and output functions of the mobile terminal; this is not specifically limited here.
  • the interface unit 1208 is an interface through which an external device is connected to the mobile terminal 1200.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, and audio input / output (I / O) port, video I / O port, headphone port, and more.
  • the interface unit 1208 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 1200, or may be used to transfer data between the mobile terminal 1200 and an external device.
  • the memory 1209 may be used to store software programs and various data.
  • the memory 1209 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function and an image playback function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book).
  • the memory 1209 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 1210 is the control center of the mobile terminal, connecting various parts of the entire mobile terminal through various interfaces and lines; by running or executing software programs and/or modules stored in the memory 1209 and calling data stored in the memory 1209, it performs various functions of the mobile terminal and processes data, thereby monitoring the mobile terminal as a whole.
  • the processor 1210 may include one or more processing units; optionally, the processor 1210 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 1210.
  • the mobile terminal 1200 may further include a power source 1211 (such as a battery) for supplying power to various components.
  • the power source 1211 may be logically connected to the processor 1210 through a power management system, thereby implementing functions such as charge management, discharge management, and power consumption management through the power management system.
  • the mobile terminal 1200 includes some functional modules that are not shown, and details are not described herein again.


Abstract

The present disclosure provides a mobile terminal and an image shooting method. The mobile terminal includes a first camera module and a second camera module. A first pixel array corresponding to a first image sensor of the first camera module includes a preset number of first pixel units arranged in a first predetermined manner, each first pixel unit including a first pixel and a second pixel; a second pixel array corresponding to a second image sensor of the second camera module includes a preset number of second pixel units arranged in a second predetermined manner, each second pixel unit including a first pixel. The first pixel includes red, green, and blue sub-pixels; the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and an infrared sub-pixel. Both the first pixel and the second pixel are full-pixel dual-core focus pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels.

Description

Mobile Terminal and Image Shooting Method
Cross-Reference to Related Applications
This application claims priority to Chinese Patent Application No. 201810798027.8, filed in China on July 19, 2018, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular to a mobile terminal and an image shooting method.
Background
At present, smart electronic products have gradually become a necessity in people's lives, and the photographing function, as an important feature of electronic products, is developing steadily. With the promotion and popularization of the photographing function, however, people are no longer satisfied with the mere photo-taking capability of the cameras in current smart electronic products, and increasingly expect diversified shooting effects, shooting modes, and functions.
Among the image sensor pixel array arrangements currently on the market that are based on the complementary metal oxide semiconductor (CMOS), the most common is the R (red) G (green) B (blue) Bayer pixel array arrangement, as shown in FIG. 1a and FIG. 1b. However, this arrangement cannot detect object distance, can only receive natural light, and records images under normal illumination.
The pixel array arrangement of the full pixel dual core focus (2PD) technology is shown in FIG. 1c and FIG. 1d. This arrangement can likewise only receive natural light for recording images, but compared with the Phase Detection Auto Focus (PDAF) solution, it can additionally detect object distance and complete the focusing action more quickly.
The principle of the 2PD phase detection technology is as follows: as can be seen from FIG. 1c and FIG. 1d, some of the R, G, and B sub-pixels in the pixel array are each divided into two halves. The light energy obtained from different incident directions differs, so that a left sub-pixel point and a right sub-pixel point form a phase detection pair. When the brightness values of the left sub-pixel point and the right sub-pixel point both reach their relative maximum peaks, the image is at its relative sharpest, i.e., in focus; the object distance is then obtained through calculation, thereby achieving fast focusing.
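The left/right phase-pair principle described above can be illustrated by estimating the shift between a left and a right sub-pixel intensity profile. This is a toy sketch with made-up profiles, not an algorithm from the disclosure; a shift of 0 would indicate that the image is in focus.

```python
def phase_shift(left, right, max_shift=3):
    """Estimate the displacement (in sub-pixel samples) between the left
    and right phase-detection profiles by maximising their overlap
    correlation, normalised by overlap length. Defocus produces a
    non-zero shift; zero shift corresponds to the in-focus condition."""
    best_shift, best_score = 0, float("-inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        score, count = 0.0, 0
        for i in range(n):
            j = i + s
            if 0 <= j < n:
                score += left[i] * right[j]
                count += 1
        score /= count
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

# The right profile is the left profile displaced by two samples,
# modelling a defocused phase pair.
left =  [0, 0, 1, 5, 9, 5, 1, 0, 0, 0]
right = [0, 0, 0, 0, 1, 5, 9, 5, 1, 0]
```

Once the shift is known, a lens model converts it into a drive amount for the focus actuator, which is how 2PD focusing avoids the hunting of contrast-based autofocus.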
In summary, the CMOS image sensor pixel array arrangement cannot detect object distance and can only receive natural light; the 2PD pixel array arrangement can detect object distance but can likewise only receive natural light. Therefore, the image sensor pixel array arrangement modes in the related art suffer from limited shooting scenarios and slow focusing, which affect the user's shooting experience.
Summary
Some embodiments of the present disclosure provide a mobile terminal and an image shooting method, to solve the problems in the related art of limited shooting scenarios and slow focusing, which affect the user's shooting experience.
To solve the above problems, some embodiments of the present disclosure are implemented as follows.
In a first aspect, some embodiments of the present disclosure provide a mobile terminal, including a first camera module and a second camera module adjacent to the first camera module, where the first camera module includes a first image sensor and the second camera module includes a second image sensor;
a first pixel array corresponding to the first image sensor includes a preset number of first pixel units arranged in a first predetermined manner, and each first pixel unit includes a first pixel and a second pixel adjacent to the first pixel;
a second pixel array corresponding to the second image sensor includes a preset number of second pixel units arranged in a second predetermined manner, and each second pixel unit includes a first pixel;
the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel; the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and an infrared sub-pixel; both the first pixel and the second pixel are full-pixel dual-core focus pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels.
In a second aspect, some embodiments of the present disclosure provide an image shooting method, applied to the above mobile terminal, the method including:
obtaining depth-of-field information through the first camera module and the second camera module;
obtaining, according to the depth-of-field information, first image data collected by the first camera module and second image data collected by the second camera module, where the first image data and the second image data are the same frame of data; and
generating a background-blurred image through triangulation ranging according to the first image data and the second image data.
In a third aspect, some embodiments of the present disclosure further provide an image shooting method, applied to the above mobile terminal, where the mobile terminal further includes an infrared transmitting module disposed on the periphery of the first camera module; the method includes:
emitting infrared light through the infrared transmitting module;
obtaining, according to the infrared light reflected by an object to be photographed, the distance between each infrared light reflection point on the object to be photographed and the first camera module; and
obtaining stereo information of the object to be photographed according to the distance between each infrared light reflection point on the object to be photographed and the first camera module. In the technical solutions of the present disclosure, a first camera module is formed using a first image sensor, a second camera module is formed using a second image sensor, and the two camera modules are combined to form a dual camera. This combination not only ensures fast focusing, but also allows infrared light to be used to detect the distance between the mobile terminal and the object to be photographed, improving the imaging effect, realizing stereo-photography application functions, and also realizing the background blur function, thereby diversifying the functions of the mobile terminal, improving the user experience, and meeting user needs.
Brief Description of the Drawings
To describe the technical solutions of some embodiments of the present disclosure more clearly, the accompanying drawings needed in describing some embodiments of the present disclosure are briefly introduced below. Apparently, the drawings described below show merely some embodiments of the present disclosure, and a person of ordinary skill in the art may derive other drawings from these drawings without creative effort.
FIG. 1a is a schematic diagram of a conventional RGB arrangement in the related art;
FIG. 1b is a cross-sectional view of a conventional pixel;
FIG. 1c is a 2PD pixel array arrangement diagram;
FIG. 1d is a 2PD pixel cross-sectional view;
FIG. 2 is a first schematic diagram of a mobile terminal according to some embodiments of the present disclosure;
FIG. 3a is a schematic diagram of a first camera module according to some embodiments of the present disclosure;
FIG. 3b is a schematic diagram of a second camera module according to some embodiments of the present disclosure;
FIG. 4a is a first schematic diagram of a first pixel unit according to some embodiments of the present disclosure;
FIG. 4b is a second schematic diagram of a first pixel unit according to some embodiments of the present disclosure;
FIG. 4c is a third schematic diagram of a first pixel unit according to some embodiments of the present disclosure;
FIG. 4d is a fourth schematic diagram of a first pixel unit according to some embodiments of the present disclosure;
FIG. 4e is a fifth schematic diagram of a first pixel unit according to some embodiments of the present disclosure;
FIG. 5a is a sixth schematic diagram of a first pixel unit according to some embodiments of the present disclosure;
FIG. 5b is a seventh schematic diagram of a first pixel unit according to some embodiments of the present disclosure;
FIG. 5c is an eighth schematic diagram of a first pixel unit according to some embodiments of the present disclosure;
FIG. 6 is a pixel cross-sectional view according to some embodiments of the present disclosure;
FIG. 7 is a second schematic diagram of a mobile terminal according to some embodiments of the present disclosure;
FIG. 8 is a schematic diagram of the connection between a first camera module and an image processor according to some embodiments of the present disclosure;
FIG. 9 is a schematic diagram of the connection between a second camera module and an image processor according to some embodiments of the present disclosure;
FIG. 10 is a first schematic diagram of an image shooting method according to some embodiments of the present disclosure;
FIG. 11 is a second schematic diagram of an image shooting method according to some embodiments of the present disclosure;
FIG. 12 is a schematic diagram of a hardware structure of a mobile terminal according to some embodiments of the present disclosure.
Detailed Description
The technical solutions in some embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings of some embodiments of the present disclosure. Apparently, the described embodiments are some rather than all of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative effort shall fall within the protection scope of the present disclosure.
Some embodiments of the present disclosure provide a mobile terminal. As shown in FIG. 2, FIG. 3a to FIG. 3b, and FIG. 4a to FIG. 4e, the mobile terminal 1 includes a first camera module 11 and a second camera module 12 adjacent to the first camera module 11; the first camera module 11 includes a first image sensor 111, and the second camera module 12 includes a second image sensor 121;
a first pixel array corresponding to the first image sensor 111 includes a preset number of first pixel units arranged in a first predetermined manner, and each first pixel unit includes a first pixel and a second pixel adjacent to the first pixel;
a second pixel array corresponding to the second image sensor 121 includes a preset number of second pixel units arranged in a second predetermined manner, and each second pixel unit includes a first pixel;
the first pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel; the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and an infrared sub-pixel; both the first pixel and the second pixel are full-pixel dual-core focus pixels, and each of the first pixel and the second pixel includes four full-pixel dual-core focus sub-pixels.
The mobile terminal 1 provided in some embodiments of the present disclosure includes a first camera module 11 and a second camera module 12, where the first camera module 11 includes a first image sensor 111 and the second camera module 12 includes a second image sensor 121. The first image sensor 111 corresponds to the first pixel array, and the second image sensor 121 corresponds to the second pixel array. The first camera module 11 and the second camera module 12 are adjacent in position.
The first pixel array includes a preset number of first pixel units arranged in the first predetermined manner, and each of the first pixel units includes a first pixel and a second pixel. The second pixel array includes a preset number of second pixel units arranged in the second predetermined manner, and each of the second pixel units includes a first pixel.
The sub-pixels in the first pixel differ from those in the second pixel: the first pixel contains a red sub-pixel (R), a green sub-pixel (G), and a blue sub-pixel (B); the second pixel includes at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and an infrared sub-pixel (IR). By providing an infrared sub-pixel in the second pixel, images can be captured while receiving infrared light, realizing dark-state imaging and ensuring the user's shooting experience.
Moreover, in some embodiments of the present disclosure, both the first pixel and the second pixel are full-pixel dual-core focus (2PD) pixels. 2PD pixels can detect object distance and complete the focusing action more quickly. Here, both the first pixel and the second pixel are 2PD pixels; that is, the sub-pixels in the first pixel and the second pixel are all 2PD sub-pixels. Using 2PD pixels enables the first camera module 11 and the second camera module 12 to complete the focusing process quickly.
The red, green, and blue sub-pixels in the first pixel are arranged in a certain manner, and the first pixel contains four full-pixel dual-core focus sub-pixels: specifically, one red sub-pixel, one blue sub-pixel, and two green sub-pixels. For ease of distinction, the two green sub-pixels are referred to as the first green sub-pixel and the second green sub-pixel, where the first green sub-pixel is identical to the second green sub-pixel. The red sub-pixel is adjacent to the first green sub-pixel, the second green sub-pixel is located below the red sub-pixel, the blue sub-pixel is located below the first green sub-pixel, and the second green sub-pixel is adjacent to the blue sub-pixel.
The second pixel contains four full-pixel dual-core focus sub-pixels, specifically including at least one of a red sub-pixel and a blue sub-pixel, as well as a green sub-pixel and an infrared sub-pixel; that is, the second pixel may include three or four kinds of sub-pixels. When the second pixel contains three kinds of sub-pixels, it may include a red sub-pixel, green sub-pixels, and an infrared sub-pixel, in which case there are two green sub-pixels; or it may include a blue sub-pixel, green sub-pixels, and an infrared sub-pixel, in which case there are likewise two green sub-pixels. When the second pixel contains four kinds of sub-pixels, it may include a red sub-pixel, a blue sub-pixel, a green sub-pixel, and an infrared sub-pixel.
In some embodiments of the present disclosure, by improving the RGB pixel array arrangement and changing it to an RGB-IR (infrared) pixel array arrangement, the mobile terminal can capture images while receiving infrared light, realizing dark-state imaging and ensuring the user's shooting experience; in addition, the 2PD pixels enable fast focusing.
Meanwhile, the image sensor of some embodiments of the present disclosure can cooperate with the infrared transmitting module to implement application functions related to stereo photography, and the two camera modules can implement the background blur function based on the triangulation ranging principle, enhancing the functionality of the mobile terminal while ensuring the user's shooting experience.
在本公开的一些实施例中,如图4a至图4e所示,红外子像素在第二像素中的位置,与红色子像素、绿色子像素或蓝色子像素在第一像素中的位置相同;或者
红外子像素在第二像素中的位置,与第一组合子像素在第一像素中的位置相同,或者与第二组合子像素在第一像素中的位置相同;
其中,第一组合子像素是位置相邻的1/2红色子像素和1/2绿色子像素的组合;第二组合子像素是位置相邻的1/2绿色子像素和1/2蓝色子像素的组合。
当红外子像素在第二像素中的位置与红色子像素在第一像素中的位置相同时,第二像素中包括一个蓝色子像素、两个绿色子像素以及一个红外子像素,此时就是在第一像素的基础上,将红色子像素替换为红外子像素。当红外子像素在第二像素中的位置与蓝色子像素在第一像素中的位置相同时,第二像素中包括一个红色子像素、两个绿色子像素以及一个红外子像素,此时就是在第一像素的基础上,将蓝色子像素替换为红外子像素。在红外子像素在第二像素中的位置与一绿色子像素在第一像素中的位置相同时,第二像素中包括一个红色子像素、一个绿色子像素、一个蓝色子像素以及一个红外子像素,此时就是在第一像素的基础上,将其中一绿色子像素替换为红外子像素。
当红外子像素在第二像素中的位置,与第一组合子像素在第一像素中的位置相同时,第二像素中包括红色子像素、绿色子像素、蓝色子像素以及红外子像素,此时可以在第一像素的基础上取2PD子像素的位置相邻的1/2红色子像素和1/2绿色子像素为红外子像素。
当红外子像素在第二像素中的位置,与第二组合子像素在第一像素中的位置相同时,第二像素中包括红色子像素、绿色子像素、蓝色子像素以及红外子像素,此时可以在第一像素的基础上取2PD子像素的位置相邻的1/2蓝色子像素和1/2绿色子像素为红外子像素。
在上述实施例的基础上,第一像素单元中包括一个第二像素以及至少一个第一像素。
第一像素单元中包含一个第二像素以及至少一个第一像素,其中第一像素单元中像素的数量至少为两个。当第一像素单元中像素的数量为两个时,包括一个第一像素以及一个第二像素,此时红外子像素在第一像素单元中的取点密度为1/8。例如,如图4a所示,第一像素单元中包括一个第一像素以及一个第二像素,第二像素中包括一个红色子像素、两个绿色子像素以及一个红外子像素。或者如图4b所示,第一像素单元中包括一个第一像素以及一个第二像素,第二像素中包括一个红色子像素、一个绿色子像素、一个蓝色子像素以及一个红外子像素。上述两种情况红外子像素在第一像素单元中的占比为1/8,也就是说红外子像素的取点密度为1/8。
当第一像素单元中像素的数量为三个时,包括两个第一像素以及一个第二像素,此时红外子像素在第一像素单元中的密度为1/12。例如,如图4c所示,第一像素单元中包括两个第一像素以及一个第二像素,其中第二像素中包括一个蓝色子像素、两个绿色子像素以及一个红外子像素,则红外子像素在第一像素单元中的占比为1/12,也就是说红外子像素的取点密度为1/12。
当第一像素单元中像素的数量为四个时,包括三个第一像素以及一个第二像素,此时红外子像素在第一像素单元中的密度为1/16。例如,如图4d所示,第一像素单元中包括三个第一像素以及一个第二像素,第二像素中包括蓝色子像素、绿色子像素、红色子像素以及红外子像素,此时可以在第一像素的基础上取2PD子像素的1/2红色子像素和1/2绿色子像素为红外子像素。或者如图4e所示,第一像素单元中包括三个第一像素以及一个第二像素,其中第二像素中包括蓝色子像素、绿色子像素、红色子像素以及红外子像素,可以在第一像素的基础上取2PD子像素的1/2蓝色子像素和1/2绿色子像素为红外子像素。上述两种情况红外子像素在第一像素单元中的占比为1/16,也就是说红外子像素的取点密度为1/16。
上述图4a至图4e对应的几种红外子像素的取点方式仅用于举例说明,还可以是其他的取点方式,其对应的多种实现方式这里不再一一介绍。红外子像素在第一像素单元中的取点位置(第二像素的位置)在本公开的一些实施例中不做限制。其中红外子像素在第一像素单元中的密度为1/(4n),n为大于或者等于2的整数(即第一像素单元中像素的数量),且红外子像素点所适用的第一像素阵列大小不限。
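红外子像素取点密度与像素单元大小的关系可用如下Python草图验证(示意性示例,函数名为本示例的假设):

```python
def ir_density(n: int) -> float:
    """第一像素单元含 n 个像素(n >= 2),每个像素含 4 个 2PD 子像素,
    整个单元中仅 1 个红外子像素,故取点密度为 1/(4n)。"""
    if n < 2:
        raise ValueError("n 应为大于或等于 2 的整数")
    return 1 / (4 * n)

# 对应正文中的三种情形
assert ir_density(2) == 1 / 8    # 两个像素:密度 1/8
assert ir_density(3) == 1 / 12   # 三个像素:密度 1/12
assert ir_density(4) == 1 / 16   # 四个像素:密度 1/16
```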
在本公开的一些实施例中,如图5a至图5c所示,1/2红外子像素在第二像素中的位置与1/2红色子像素、1/2绿色子像素或1/2蓝色子像素在第一像素中的位置相同,相邻两个第二像素中的1/2红外子像素组成红外子像素。
在第二像素中可以仅包含1/2红外子像素,通过两个第二像素进行组合可以得到一个完整的红外子像素。当第二像素中包含1/2红外子像素时,1/2红外子像素在第二像素中的位置可以与1/2红色子像素在第一像素中的位置相同,还可以与1/2绿色子像素在第一像素中的位置相同,也可以与1/2蓝色子像素在第一像素中的位置相同。
当1/2红外子像素在一第二像素中的位置与1/2红色子像素在第一像素中的位置相同时,则在另一第二像素中1/2红外子像素的位置与1/2绿色子像素在第一像素中的位置相同。当1/2红外子像素在一第二像素中的位置与1/2绿色子像素在第一像素中的位置相同时,则在另一第二像素中1/2红外子像素的位置与1/2蓝色子像素或1/2红色子像素在第一像素中的位置相同。
在上述实施例的基础上,在第一像素单元中,第二像素的数量为两个,第一像素的数量大于或者等于零。
第一像素单元中像素的数量至少为两个,则第一像素单元中包括两个第二像素以及数量大于或者等于零的第一像素。当第一像素单元中像素的数量为两个时,包括两个第二像素,此时红外子像素在第一像素单元中的取点密度为1/8。例如,如图5a所示,第一像素单元中包括两个第二像素,其中每一第二像素中均包括红色子像素、绿色子像素、蓝色子像素以及1/2红外子像素,此时1/2红外子像素在一第二像素中的位置与1/2绿色子像素在第一像素中的位置相同时,在另一第二像素中1/2红外子像素的位置与1/2蓝色子像素在第一像素中的位置相同。红外子像素在第一像素单元中的占比为1/8,也就是说红外子像素的取点密度为1/8。
当第一像素单元中像素的数量为三个时,包括两个第二像素以及一个第一像素,此时红外子像素在第一像素单元中的取点密度为1/12。例如,如图5b所示,第一像素单元中包括两个第二像素以及一个第一像素,其中每一第二像素中均包括红色子像素、绿色子像素、蓝色子像素以及1/2红外子像素,此时1/2红外子像素在一第二像素中的位置与1/2红色子像素在第一像素中的位置相同时,在另一第二像素中1/2红外子像素的位置与1/2绿色子像素在第一像素中的位置相同。红外子像素在第一像素单元中的占比为1/12,也就是说红外子像素的取点密度为1/12。
当第一像素单元中像素的数量为四个时,包括两个第二像素以及两个第一像素,此时红外子像素在第一像素单元中的取点密度为1/16。例如,如图5c所示,第一像素单元中包括两个第二像素以及两个第一像素,其中每一第二像素中均包括红色子像素、绿色子像素、蓝色子像素以及1/2红外子像素,此时1/2红外子像素在一第二像素中的位置与1/2绿色子像素在第一像素中的位置相同时,在另一第二像素中1/2红外子像素的位置与1/2红色子像素在第一像素中的位置相同。红外子像素在第一像素单元中的占比为1/16,也就是说红外子像素的取点密度为1/16。
第一像素阵列可以是由1/8密度的RGB+IR像素单元、1/12密度的RGB+IR像素单元或1/16密度的RGB+IR像素单元作为一个像素单位阵列,像素单位阵列再进行周期性阵列排布组成。当然第一像素阵列还可以是其他形式,这里不再列举。
上述的图5a至图5c仅仅为几种对应的实施方式,还可以在此基础上进行变形,这里不再一一阐述。其中红外子像素在第一像素单元中的密度为1/(4n),n为大于或者等于2的整数,且红外子像素点所适用的第一像素阵列大小不限。
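像素单位阵列的周期性排布可用如下示意性代码表达(仅为说明性草图,函数名为本示例的假设):

```python
def tile(unit, rows, cols):
    """将像素单位阵列 unit 沿行、列方向各重复 rows、cols 次,
    得到周期性排布的更大像素阵列(只读示意,未做深拷贝)。"""
    return [row * cols for row in unit] * rows

# 以一个 2x2 的单位阵列为例,平铺成 4 行 6 列的阵列
unit = [["R", "G"],
        ["G", "B"]]
tiled = tile(unit, 2, 3)
assert len(tiled) == 4 and len(tiled[0]) == 6
```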
在本公开的一些实施例中,红色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、红色滤光片以及微镜;绿色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、绿色滤光片以及微镜;蓝色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、蓝色滤光片以及微镜;红外子像素包括依次堆叠设置的半导体层、金属层、光电二极管、红外滤光片以及微镜。
红色子像素所包含的半导体层、金属层、光电二极管、红色滤光片以及微镜,由下至上依次排列。相应的绿色子像素所包含的半导体层、金属层、光电二极管、绿色滤光片以及微镜,由下至上依次排列。蓝色子像素所包含的半导体层、金属层、光电二极管、蓝色滤光片以及微镜,由下至上依次排列。红外子像素所包含的半导体层、金属层、光电二极管、红外滤光片以及微镜,由下至上依次排列。这里的半导体层可以为硅基板,但并不局限于此。红色、绿色、蓝色以及红外子像素的结构可参见图6所示,图6中虽然只示出了蓝色以及红外子像素,但在此基础上可以获知红色及绿色子像素的结构:可将蓝色滤光片替换为红色或者绿色滤光片,即可获得红色子像素或绿色子像素的结构。
红色、绿色以及蓝色子像素用于获取合成图像的像素的色彩信息,其阻挡红外线的进入,例如仅使波长在380~700nm的可见光进入,可在高照度下直接生成色彩完整逼真的图像。红外光波长为750~1100nm,红外滤光片用于使红外波段通过,可提升暗态成像效果,实现红外测距功能。
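按正文给出的波段范围,可用如下草图区分两类滤光片的通带(示意性示例,函数名与返回值均为本示例的假设):

```python
def band_of(wavelength_nm: float) -> str:
    """按正文波段划分:可见光约 380~700nm,红外光约 750~1100nm。"""
    if 380 <= wavelength_nm <= 700:
        return "visible"   # RGB 滤光片通带
    if 750 <= wavelength_nm <= 1100:
        return "infrared"  # 红外滤光片通带
    return "blocked"

assert band_of(550) == "visible"   # 绿光
assert band_of(940) == "infrared"  # 常见的红外补光波长
```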
由以上说明可见,RGB子像素点是对应于每种RGB颜色的波长光的光接收元件,IR子像素点是对应红外光的光接收元件。
在本公开的一些实施例中,第一图像传感器和第二图像传感器均为互补金属氧化物半导体CMOS图像传感器、电荷耦合元件CCD图像传感器或量子薄膜图像传感器。
本公开RGB-IR的像素阵列排布方式,适用的图像传感器类型不限,可以是基于CMOS的图像传感器,可以是基于电荷耦合元件(Charge-coupled Device,CCD)的图像传感器,也可以是基于量子薄膜的图像传感器,当然还可以是其他类型的图像传感器。且本公开的一些实施例的图像传感器可适用于任何包含摄像头模组的电子产品中。
在本公开的一些实施例中,如图2和图7所示,第一摄像头模组11与第二摄像头模组12通过同步模块14连接,移动终端1还包括:设置于第一摄像头模组11周缘的红外发射模块15;与第一摄像头模组11、第二摄像头模组12以及红外发射模块15连接的供电模块16;与第一摄像头模组11、第二摄像头模组12连接的图像处理器17,图像处理器17与供电模块16均集成于移动终端的主板上,红外发射模块15与移动终端的主板连接;以及与图像处理器17连接的显示模块18。
在第一摄像头模组11与第二摄像头模组12之间连接有同步模块14,通过设置同步模块14,可以使得移动终端1控制实现第一摄像头模组11和第二摄像头模组12的帧同步数据输出。
在第一摄像头模组11的周缘设置有红外发射模块15,其中第一摄像头模组11、第二摄像头模组12以及红外发射模块15均连接至供电模块16,用于根据供电模块16提供的电量进行工作。
第一摄像头模组11、第二摄像头模组12同时连接至图像处理器17,通过图像处理器17与显示模块18连接,在光线聚焦在第一摄像头模组11以及第二摄像头模组12上进行光电转换后,可以将数据传输给图像处理器17,图像处理器17对数据进行处理后在显示模块18上以图片的形式呈现。其中红外发射模块15与移动终端的主板连接,因此移动终端可以获取红外发射模块15发射红外光的时刻。由于图像处理器17集成于主板上,移动终端可以获取第一摄像头模组11接收红外光的时刻。
如图2和图8所示,第一摄像头模组11还包括:第一透镜模组112;用于驱动第一透镜模组112移动的第一驱动模块113;设置于第一透镜模组112与第一图像传感器111之间的第一滤波模块114,第一滤波模块114可通过380nm至1100nm的光波长。
第一透镜模组112用于对光线聚焦,第一透镜模组112与第一驱动模块113连接,第一驱动模块113用于根据待拍摄对象的远近调整第一透镜模组112的位置。
在第一透镜模组112与第一图像传感器111之间设置有第一滤波模块114,其中在光线通过第一透镜模组112聚焦,经过第一滤波模块114后,可以聚焦在第一图像传感器111的像素阵列上。第一图像传感器111与图像处理器17连接。
本公开的一些实施例中的第一滤波模块114为自然光以及红外光均可通过的双通滤波模块。此时在光线通过第一透镜模组112聚焦后,可通过第一滤波模块114进行滤波,其中第一滤波模块114可用于自然光和红外光的通过,保证成像效果。
红外发射模块15可设置于第一透镜模组112的周缘。红外发射模块15发出红外线,红外线在遇到障碍物后会发生反射;当捕捉到反射回来的红外光线后,经红外子像素进行光电转换,可获取红外线从发射到被接收的时间差。由于光的传播速度固定,据此可以计算出障碍物与移动终端之间的距离,最终可获取障碍物上每个极小单位到移动终端的距离,实现第一摄像头模组的立体成像记录功能。当然,还可以通过获取红外光相位差的方式,获取障碍物上各个红外光反射点与移动终端之间的距离,具体可参见飞行时间(Time of Flight,TOF)技术,这里不再详细阐述。
如图2和图9所示,第二摄像头模组12还包括:第二透镜模组122;用于驱动第二透镜模组122移动的第二驱动模块123;设置于第二透镜模组122与第二图像传感器121之间的第二滤波模块124,第二滤波模块124可通过380nm至700nm的光波长。
第二透镜模组122同样用于对光线聚焦,第二透镜模组122与第二驱动模块123连接,第二驱动模块123用于根据待拍摄对象的远近调整第二透镜模组122的位置。
在第二透镜模组122与第二图像传感器121之间设置有第二滤波模块124,其中在光线通过第二透镜模组122聚焦,经过第二滤波模块124后,可以聚焦在第二图像传感器121的像素阵列上。第二图像传感器121与图像处理器17连接。
本公开的一些实施例中的第二滤波模块124用于自然光通过。此时在光线通过第二透镜模组122聚焦后,可通过第二滤波模块124进行滤波。
其中,在调整第一透镜模组112和第二透镜模组122的位置之后,可以利用第一图像传感器111和第二图像传感器121中的2PD像素获取相位差,从而获取物体与成像面的距离,进而实现快速对焦。
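2PD对焦中左右子像素信号相位差的估计,可用如下一维示意代码表达(仅为原理性草图,采用简单的最小绝对差搜索,函数名为本示例的假设,非本公开方案的实际实现):

```python
def pd_shift(left, right, max_shift=3):
    """在 [-max_shift, max_shift] 内搜索使左右子像素信号
    平均绝对差最小的位移,作为相位差的估计(要求 max_shift 小于信号长度)。"""
    best_s, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        pairs = [(l, right[i + s]) for i, l in enumerate(left)
                 if 0 <= i + s < len(right)]
        err = sum(abs(l - r) for l, r in pairs) / len(pairs)
        if err < best_err:
            best_s, best_err = s, err
    return best_s

# 右路信号相对左路右移 2 个采样点时,估计的相位差为 2
left = [0, 1, 4, 9, 4, 1, 0, 0]
right = [0, 0] + left[:-2]
assert pd_shift(left, right) == 2
```

估计出的相位差可进一步换算为离焦量,用于驱动透镜模组一次到位地完成对焦。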
本公开的一些实施例提供的移动终端,通过利用第一图像传感器制成第一摄像头模组、利用第二图像传感器制成第二摄像头模组,两个摄像头模组结合后形成双摄,此种组合方式,不仅能保证快速对焦,且可利用红外光线进行移动终端与待拍摄对象之间的距离检测,提升图像成像效果,实现拍照立体相关应用功能,还能实现背景虚化功能,保证移动终端的功能多样化,提升用户使用体验,满足用户需求。
本公开的一些实施例还提供一种图像拍摄方法,应用于上述的移动终端,如图10所示,该方法包括:
步骤1001、通过第一摄像头模组和第二摄像头模组获取景深信息。
本公开的一些实施例通过第一摄像头模组和第二摄像头模组,可以确定景深信息,即能够取得清晰成像时所测定的待拍摄物体前后距离范围。
步骤1002、根据景深信息,获取第一摄像头模组采集的第一图像数据,以及第二摄像头模组采集的第二图像数据,第一图像数据和第二图像数据为相同帧数据。
在确定景深信息之后,可以根据景深信息获取相同帧的第一图像数据和第二图像数据。在获取第一图像数据和第二图像数据之后,可以执行步骤1003。
步骤1003、根据第一图像数据和第二图像数据,通过三角测距生成背景虚化图像。
在获取第一图像数据和第二图像数据之后,由于两个摄像头模组之间的距离已知,可以根据两个摄像头模组输出的相同帧数据结合三角测距原理进行数据处理,获取背景虚化图像。
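三角测距原理可概括为:深度 Z = 焦距 f × 基线 B ÷ 视差 d。下面给出一个示意性草图(数值与函数名均为本示例的假设):

```python
def depth_from_disparity(baseline_mm: float, focal_px: float,
                         disparity_px: float) -> float:
    """三角测距:两摄像头模组的基线 B(毫米)、等效焦距 f(像素)、
    同一点在两路相同帧图像中的视差 d(像素),满足 Z = f * B / d。
    视差越大,物体越近。"""
    if disparity_px <= 0:
        raise ValueError("视差应为正值")
    return focal_px * baseline_mm / disparity_px

# 假设基线 20mm、等效焦距 1400 像素:视差 14 像素对应约 2m 的深度
assert depth_from_disparity(20, 1400, 14) == 2000.0
```

据此可对相同帧数据逐点求深度,再按深度阈值对远处区域施加模糊,即得到背景虚化效果。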
本公开的一些实施例的图像拍摄方法,可以利用第一图像传感器制成第一摄像头模组、利用第二图像传感器制成第二摄像头模组,两个摄像头模组结合后形成双摄,并保证两者数据输出的同步,此种组合方式,能够实现背景虚化功能,保证移动终端的功能多样化,提升用户使用体验,满足用户需求。
本公开的一些实施例还提供一种图像拍摄方法,应用于上述的移动终端,其中移动终端包括设置于第一摄像头模组周缘的红外发射模块;如图11所示,该方法包括:
步骤1101、通过红外发射模块发射红外光。
移动终端上的红外发射模块可以发出红外线,红外线在遇到待拍摄对象后会发生反射,其反射的红外光会被移动终端的第一摄像头模组所接收。其中第一摄像头模组的第一图像传感器形成RGB-IR的像素阵列,因此可通过红外子像素进行光电转换。
步骤1102、根据待拍摄对象所反射的红外光获取待拍摄对象上各个红外光反射点与第一摄像头模组之间的距离。
在获取待拍摄对象与第一摄像头模组之间的距离时,实际为获取待拍摄对象与成像面之间的距离。当第一摄像头模组捕捉到反射回来的红外光线后,经红外子像素进行光电转换,可获取红外线从发射到被接收的时间差;由于光的传播速度固定,将时间差与传播速度的乘积取1/2,即可计算出待拍摄对象与第一摄像头模组之间的距离。其中第一摄像头模组接收到各个红外光反射点反射红外光的时间有所区别,因此针对每一个红外光反射点可以对应计算一距离,进而可以获得各个红外光反射点与第一摄像头模组之间的距离。当然还可以通过获取红外光的相位差获取各个红外光反射点与第一摄像头模组之间的距离,具体可参见飞行时间(Time of Flight,TOF)技术,这里不再详细阐述。
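上述“时间差与传播速度乘积的1/2”的计算可示意如下(示意性草图,函数名为本示例的假设):

```python
C = 299_792_458  # 光速,单位:米/秒

def tof_distance(delta_t_ns: float) -> float:
    """红外线往返的时间差 delta_t_ns(纳秒)乘以光速后取一半,
    即为反射点与第一摄像头模组之间的距离(米)。"""
    return C * delta_t_ns * 1e-9 / 2

# 往返时间差为 10ns 时,距离约为 1.5m
assert abs(tof_distance(10) - 1.499) < 0.01
```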
步骤1103、根据待拍摄对象上各个红外光反射点与第一摄像头模组之间的距离,对待拍摄对象进行立体信息获取。
在获取待拍摄对象与移动终端之间的距离时,具体为获取待拍摄对象上每个极小单位到第一摄像头模组的距离,然后执行对待拍摄对象拍摄的流程,实现第一摄像头模组的立体成像记录功能。
本公开的一些实施例的图像拍摄方法,可以利用第一图像传感器制成第一摄像头模组,配合红外发射模块实现利用红外光线进行第一摄像头模组与待拍摄对象之间的距离检测,提升图像成像效果,实现拍照立体相关应用功能。
图12为实现本公开各个实施例的一种移动终端的硬件结构示意图,该移动终端1200包括但不限于:射频单元1201、网络模块1202、音频输出单元1203、输入单元1204、传感器1205、显示单元1206、用户输入单元1207、接口单元1208、存储器1209、处理器1210、以及电源1211等部件。
移动终端1200,还包括第一摄像头模组和与第一摄像头模组相邻的第二摄像头模组,第一摄像头模组包括第一图像传感器,第二摄像头模组包括第二图像传感器;
第一图像传感器对应的第一像素阵列包括按照第一预定方式排布的预设数目个第一像素单元,第一像素单元包括第一像素和与第一像素位置相邻的第二像素;
第二图像传感器对应的第二像素阵列包括按照第二预定方式排布的预设数目个第二像素单元,第二像素单元包括第一像素;
第一像素包括红色子像素、绿色子像素和蓝色子像素,第二像素包括红色子像素和蓝色子像素中的至少一种以及绿色子像素和红外子像素,且第一像素和第二像素均为全像素双核对焦像素,第一像素和第二像素中各包括四个全像素双核对焦子像素。
其中,红外子像素在第二像素中的位置,与红色子像素、绿色子像素或蓝色子像素在第一像素中的位置相同;或者
红外子像素在第二像素中的位置,与第一组合子像素在第一像素中的位置相同,或者与第二组合子像素在第一像素中的位置相同;
其中,第一组合子像素是位置相邻的1/2红色子像素和1/2绿色子像素的组合;第二组合子像素是位置相邻的1/2绿色子像素和1/2蓝色子像素的组合。
其中,第一像素单元中包括一个第二像素以及至少一个第一像素。
其中,1/2红外子像素在第二像素中的位置与1/2红色子像素、1/2绿色子像素或1/2蓝色子像素在第一像素中的位置相同,相邻两个第二像素中的1/2红外子像素组成红外子像素。
其中,在第一像素单元中,第二像素的数量为两个,第一像素的数量大于或者等于零。
其中,红色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、红色滤光片以及微镜;绿色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、绿色滤光片以及微镜;蓝色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、蓝色滤光片以及微镜;红外子像素包括依次堆叠设置的半导体层、金属层、光电二极管、红外滤光片以及微镜。
其中,第一摄像头模组与第二摄像头模组通过同步模块连接。
其中,移动终端还包括:
设置于第一摄像头模组周缘的红外发射模块;
与第一摄像头模组、第二摄像头模组以及红外发射模块连接的供电模块;
与第一摄像头模组、第二摄像头模组连接的图像处理器,图像处理器与供电模块均集成于移动终端的主板上,红外发射模块与移动终端的主板连接;以及
与图像处理器连接的显示模块。
其中,第一摄像头模组还包括:
第一透镜模组;
用于驱动第一透镜模组移动的第一驱动模块;
设置于第一透镜模组与第一图像传感器之间的第一滤波模块,第一滤波模块可通过380nm至1100nm的光波长。
其中,第二摄像头模组还包括:
第二透镜模组;
用于驱动第二透镜模组移动的第二驱动模块;
设置于第二透镜模组与第二图像传感器之间的第二滤波模块,第二滤波模块可通过380nm至700nm的光波长。
其中,第一图像传感器和第二图像传感器均为互补金属氧化物半导体CMOS图像传感器、电荷耦合元件CCD图像传感器或量子薄膜图像传感器。
本领域技术人员可以理解,图12中示出的移动终端结构并不构成对移动终端的限定,移动终端可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。在本公开的一些实施例中,移动终端包括但不限于手机、平板电脑、笔记本电脑、掌上电脑、车载终端、可穿戴设备、以及计步器等。
其中处理器1210用于:通过第一摄像头模组和第二摄像头模组获取景深信息;根据景深信息,获取第一摄像头模组采集的第一图像数据,以及第二摄像头模组采集的第二图像数据,第一图像数据和第二图像数据为相同帧数据;根据第一图像数据和第二图像数据,通过三角测距生成背景虚化图像。
处理器1210还用于:通过红外发射模块发射红外光;根据待拍摄对象所反射的红外光获取待拍摄对象上各个红外光反射点与第一摄像头模组之间的距离;根据待拍摄对象上各个红外光反射点与第一摄像头模组之间的距离,对待拍摄对象进行立体信息获取。
这样,通过利用第一图像传感器制成第一摄像头模组、利用第二图像传感器制成第二摄像头模组,两个摄像头模组结合后形成双摄,并通过同步模块实现两者数据输出的同步,此种组合方式,不仅能保证快速对焦,且可利用红外光线进行移动终端与待拍摄对象之间的距离检测,提升图像成像效果,实现拍照立体相关应用功能,还能实现背景虚化功能,保证移动终端的功能多样化,提升用户使用体验,满足用户需求。
应理解的是,本公开的一些实施例中,射频单元1201可用于收发信息或通话过程中,信号的接收和发送,具体的,将来自基站的下行数据接收后,给处理器1210处理;另外,将上行的数据发送给基站。通常,射频单元1201包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频单元1201还可以通过无线通信系统与网络和其他设备通信。
移动终端通过网络模块1202为用户提供了无线的宽带互联网访问,如帮助用户收发电子邮件、浏览网页和访问流式媒体等。
音频输出单元1203可以将射频单元1201或网络模块1202接收的或者在存储器1209中存储的音频数据转换成音频信号并且输出为声音。而且,音频输出单元1203还可以提供与移动终端1200执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出单元1203包括扬声器、蜂鸣器以及受话器等。
输入单元1204用于接收音频或视频信号。输入单元1204可以包括图形处理器(Graphics Processing Unit,GPU)12041和麦克风12042,图形处理器12041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元1206上,这里的显示单元即为上述的显示模块。经图形处理器12041处理后的图像帧可以存储在存储器1209(或其它存储介质)中或者经由射频单元1201或网络模块1202进行发送。其中图形处理器12041即为上述的图像数据处理模块。麦克风12042可以接收声音,并且能够将这样的声音处理为音频数据。处理后的音频数据可以在电话通话模式的情况下转换为可经由射频单元1201发送到移动通信基站的格式输出。
移动终端1200还包括至少一种传感器1205,比如光传感器、运动传感器以及其他传感器。具体地,光传感器包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板12061的亮度,接近传感器可在移动终端1200移动到耳边时,关闭显示面板12061和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别移动终端姿态(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;传感器1205还可以包括指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等,在此不再赘述。
显示单元1206用于显示由用户输入的信息或提供给用户的信息。显示单元1206可包括显示面板12061,可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板12061。
用户输入单元1207可用于接收输入的数字或字符信息,以及产生与移动终端的用户设置以及功能控制有关的键信号输入。具体地,用户输入单元1207包括触控面板12071以及其他输入设备12072。触控面板12071,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板12071上或在触控面板12071附近的操作)。触控面板12071可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器1210,接收处理器1210发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板12071。除了触控面板12071,用户输入单元1207还可以包括其他输入设备12072。具体地,其他输入设备12072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
进一步的,触控面板12071可覆盖在显示面板12061上,当触控面板12071检测到在其上或附近的触摸操作后,传送给处理器1210以确定触摸事件的类型,随后处理器1210根据触摸事件的类型在显示面板12061上提供相应的视觉输出。虽然在图12中,触控面板12071与显示面板12061是作为两个独立的部件来实现移动终端的输入和输出功能,但是在某些实施例中,可以将触控面板12071与显示面板12061集成而实现移动终端的输入和输出功能,具体此处不做限定。
接口单元1208为外部装置与移动终端1200连接的接口。例如,外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。接口单元1208可以用于接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到移动终端1200内的一个或多个元件或者可以用于在移动终端1200和外部装置之间传输数据。
存储器1209可用于存储软件程序以及各种数据。存储器1209可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器1209可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
处理器1210是移动终端的控制中心,利用各种接口和线路连接整个移动终端的各个部分,通过运行或执行存储在存储器1209内的软件程序和/或模块,以及调用存储在存储器1209内的数据,执行移动终端的各种功能和处理数据,从而对移动终端进行整体监控。处理器1210可包括一个或多个处理单元;可选的,处理器1210可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器1210中。
移动终端1200还可以包括给各个部件供电的电源1211(比如电池),可选的,电源1211可以通过电源管理系统与处理器1210逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
另外,移动终端1200包括一些未示出的功能模块,在此不再赘述。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本公开的技术方案本质上或者说对相关技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端(可以是手机,计算机,服务器,空调器,或者网络设备等)执行本公开各个实施例所述的方法。
上面结合附图对本公开的实施例进行了描述,但是本公开并不局限于上述的具体实施方式,上述的具体实施方式仅仅是示意性的,而不是限制性的,本领域的普通技术人员在本公开的启示下,在不脱离本公开宗旨和权利要求所保护的范围情况下,还可做出很多形式,均属于本公开的保护之内。

Claims (15)

  1. 一种移动终端,包括第一摄像头模组和与所述第一摄像头模组相邻的第二摄像头模组,所述第一摄像头模组包括第一图像传感器,所述第二摄像头模组包括第二图像传感器;
    所述第一图像传感器对应的第一像素阵列包括按照第一预定方式排布的预设数目个第一像素单元,所述第一像素单元包括第一像素和与所述第一像素位置相邻的第二像素;
    所述第二图像传感器对应的第二像素阵列包括按照第二预定方式排布的预设数目个第二像素单元,所述第二像素单元包括第一像素;
    所述第一像素包括红色子像素、绿色子像素和蓝色子像素,所述第二像素包括所述红色子像素和所述蓝色子像素中的至少一种以及所述绿色子像素和红外子像素,且所述第一像素和所述第二像素均为全像素双核对焦像素,所述第一像素和所述第二像素中各包括四个全像素双核对焦子像素。
  2. 根据权利要求1所述的移动终端,其中,
    所述红外子像素在所述第二像素中的位置,与所述红色子像素、所述绿色子像素或所述蓝色子像素在所述第一像素中的位置相同;或者
    所述红外子像素在所述第二像素中的位置,与第一组合子像素在所述第一像素中的位置相同,或者与第二组合子像素在所述第一像素中的位置相同;
    其中,所述第一组合子像素是位置相邻的1/2红色子像素和1/2绿色子像素的组合;所述第二组合子像素是位置相邻的1/2绿色子像素和1/2蓝色子像素的组合。
  3. 根据权利要求2所述的移动终端,其中,
    所述第一像素单元中包括一个所述第二像素以及至少一个所述第一像素。
  4. 根据权利要求1所述的移动终端,其中,
    1/2红外子像素在所述第二像素中的位置与1/2红色子像素、1/2绿色子像素或1/2蓝色子像素在所述第一像素中的位置相同,相邻两个所述第二像素中的1/2红外子像素组成所述红外子像素。
  5. 根据权利要求4所述的移动终端,其中,在所述第一像素单元中,所述第二像素的数量为两个,所述第一像素的数量大于或者等于零。
  6. 根据权利要求1所述的移动终端,其中,
    所述红色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、红色滤光片以及微镜;
    所述绿色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、绿色滤光片以及微镜;
    所述蓝色子像素包括依次堆叠设置的半导体层、金属层、光电二极管、蓝色滤光片以及微镜;
    所述红外子像素包括依次堆叠设置的半导体层、金属层、光电二极管、红外滤光片以及微镜。
  7. 根据权利要求1所述的移动终端,其中,所述第一摄像头模组与所述第二摄像头模组通过同步模块连接。
  8. 根据权利要求1所述的移动终端,还包括:
    设置于所述第一摄像头模组周缘的红外发射模块;
    与所述第一摄像头模组、所述第二摄像头模组以及所述红外发射模块连接的供电模块;
    与所述第一摄像头模组、所述第二摄像头模组连接的图像处理器,所述图像处理器与所述供电模块均集成于所述移动终端的主板上,所述红外发射模块与所述移动终端的主板连接;以及
    与所述图像处理器连接的显示模块。
  9. 根据权利要求1所述的移动终端,其中,所述第一摄像头模组还包括:
    第一透镜模组;
    用于驱动所述第一透镜模组移动的第一驱动模块;
    设置于所述第一透镜模组与所述第一图像传感器之间的第一滤波模块,所述第一滤波模块可通过380nm至1100nm的光波长。
  10. 根据权利要求1所述的移动终端,其中,所述第二摄像头模组还包括:
    第二透镜模组;
    用于驱动所述第二透镜模组移动的第二驱动模块;
    设置于所述第二透镜模组与所述第二图像传感器之间的第二滤波模块,所述第二滤波模块可通过380nm至700nm的光波长。
  11. 根据权利要求1所述的移动终端,其中,所述第一图像传感器和所述第二图像传感器均为互补金属氧化物半导体CMOS图像传感器、电荷耦合元件CCD图像传感器或量子薄膜图像传感器。
  12. 一种图像拍摄方法,应用于如权利要求1所述的移动终端,所述方法包括:
    通过所述第一摄像头模组和所述第二摄像头模组获取景深信息;
    根据所述景深信息,获取所述第一摄像头模组采集的第一图像数据,以及所述第二摄像头模组采集的第二图像数据,所述第一图像数据和所述第二图像数据为相同帧数据;
    根据所述第一图像数据和所述第二图像数据,通过三角测距生成背景虚化图像。
  13. 一种图像拍摄方法,应用于如权利要求1所述的移动终端,所述移动终端还包括设置于第一摄像头模组周缘的红外发射模块,所述方法包括:
    通过所述红外发射模块发射红外光;
    根据待拍摄对象所反射的红外光获取所述待拍摄对象上各个红外光反射点与所述第一摄像头模组之间的距离;
    根据所述待拍摄对象上各个红外光反射点与所述第一摄像头模组之间的距离,对所述待拍摄对象进行立体信息获取。
  14. 一种移动终端,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,其中,所述处理器执行所述计算机程序时实现如权利要求12-13任一项所述图像拍摄方法中的步骤。
  15. 一种计算机可读存储介质,其上存储有计算机程序,其中,该计算机程序被处理器执行时实现如权利要求12-13任一项所述图像拍摄方法中的步骤。
PCT/CN2019/096126 2018-07-19 2019-07-16 移动终端及图像拍摄方法 WO2020015626A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19838874.6A EP3826292A4 (en) 2018-07-19 2019-07-16 MOBILE TERMINAL AND IMAGE CAPTURE PROCEDURE
US17/151,767 US20240014236A9 (en) 2018-07-19 2021-01-19 Mobile terminal and image photographing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810798027.8A CN108900772A (zh) 2018-07-19 2018-07-19 一种移动终端及图像拍摄方法
CN201810798027.8 2018-07-19

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/151,767 Continuation US20240014236A9 (en) 2018-07-19 2021-01-19 Mobile terminal and image photographing method

Publications (1)

Publication Number Publication Date
WO2020015626A1 true WO2020015626A1 (zh) 2020-01-23

Family

ID=64351070

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/096126 WO2020015626A1 (zh) 2018-07-19 2019-07-16 移动终端及图像拍摄方法

Country Status (4)

Country Link
US (1) US20240014236A9 (zh)
EP (1) EP3826292A4 (zh)
CN (1) CN108900772A (zh)
WO (1) WO2020015626A1 (zh)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108900772A (zh) * 2018-07-19 2018-11-27 维沃移动通信有限公司 一种移动终端及图像拍摄方法
CN112738386A (zh) * 2021-03-30 2021-04-30 北京芯海视界三维科技有限公司 一种传感器、拍摄模组、图像获取方法
CN114125240A (zh) * 2021-11-30 2022-03-01 维沃移动通信有限公司 图像传感器、摄像模组、电子设备及拍摄方法


Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8848047B2 (en) * 2006-09-28 2014-09-30 Fujifilm Corporation Imaging device and endoscopic apparatus
KR101610705B1 (ko) * 2008-12-10 2016-04-11 삼성전자주식회사 카메라를 구비한 단말기 및 그 단말기에서 이미지 처리 방법
KR20110040468A (ko) * 2009-10-14 2011-04-20 삼성전자주식회사 휴대 단말기의 카메라 운용 방법 및 장치
WO2013054160A1 (en) * 2011-10-11 2013-04-18 Sony Ericsson Mobile Communications Ab Light sensitive, low height, and high dynamic range camera
CN105556944B (zh) * 2012-11-28 2019-03-08 核心光电有限公司 多孔径成像系统和方法
US9667933B2 (en) * 2013-07-01 2017-05-30 Omnivision Technologies, Inc. Color and infrared filter array patterns to reduce color aliasing
CN109246339B (zh) * 2013-08-01 2020-10-23 核心光电有限公司 用于对对象或场景进行成像的双孔径数字摄影机
US11265534B2 (en) * 2014-02-08 2022-03-01 Microsoft Technology Licensing, Llc Environment-dependent active illumination for stereo matching
CN103986877B (zh) * 2014-05-29 2017-09-26 宇龙计算机通信科技(深圳)有限公司 一种图像获取终端和图像获取方法
US9516295B2 (en) * 2014-06-30 2016-12-06 Aquifi, Inc. Systems and methods for multi-channel imaging based on multiple exposure settings
US10761292B2 (en) * 2015-06-29 2020-09-01 Lg Innotek Co., Ltd. Dual camera module and optical device
US10044959B2 (en) * 2015-09-24 2018-08-07 Qualcomm Incorporated Mask-less phase detection autofocus
US20160373664A1 (en) * 2015-10-27 2016-12-22 Mediatek Inc. Methods And Apparatus of Processing Image And Additional Information From Image Sensor
WO2017100309A1 (en) * 2015-12-07 2017-06-15 Delta Id, Inc. Image sensor configured for dual mode operation
US9906706B2 (en) * 2015-12-23 2018-02-27 Visera Technologies Company Limited Image sensor and imaging device
KR101751140B1 (ko) * 2015-12-24 2017-06-26 삼성전기주식회사 이미지 센서 및 카메라 모듈
EP3451055B1 (en) * 2016-04-28 2023-07-05 LG Innotek Co., Ltd. Lens driving mechanism, camera module, and optical device
KR102470945B1 (ko) * 2016-06-15 2022-11-25 엘지전자 주식회사 이동 단말기
US10547829B2 (en) * 2016-06-16 2020-01-28 Samsung Electronics Co., Ltd. Image detecting device and image detecting method using the same
CN205861983U (zh) * 2016-08-01 2017-01-04 新科实业有限公司 相机模组
CN107968103B (zh) * 2016-10-20 2020-03-17 昆山国显光电有限公司 像素结构及其制造方法、显示装置
CN106488137A (zh) * 2016-11-29 2017-03-08 广东欧珀移动通信有限公司 双摄像头对焦方法、装置和终端设备
US20180213217A1 (en) * 2017-01-23 2018-07-26 Multimedia Image Solution Limited Equipment and method for promptly performing calibration and verification of intrinsic and extrinsic parameters of a plurality of image capturing elements installed on electronic device
US10444415B2 (en) * 2017-02-14 2019-10-15 Cista System Corp. Multispectral sensing system and method
CN106982328B (zh) * 2017-04-28 2020-01-10 Oppo广东移动通信有限公司 双核对焦图像传感器及其对焦控制方法和成像装置
CN107105141B (zh) * 2017-04-28 2019-06-28 Oppo广东移动通信有限公司 图像传感器、图像处理方法、成像装置和移动终端
JP7144458B2 (ja) * 2017-06-30 2022-09-29 ポライト アーエスアー モバイル機器に組み込むための複数のカメラを有するモジュール
CN107483717A (zh) * 2017-07-19 2017-12-15 广东欧珀移动通信有限公司 红外补光灯的设置方法及相关产品
US10397465B2 (en) * 2017-09-01 2019-08-27 Qualcomm Incorporated Extended or full-density phase-detection autofocus control
KR102385844B1 (ko) * 2017-09-15 2022-04-14 삼성전자 주식회사 제 1 이미지 센서에서 제공된 신호를 이용하여 제 2 이미지 센서에서 데이터를 획득하는 방법 및 전자 장치
CN109756713B (zh) * 2017-11-08 2021-12-21 超威半导体公司 图像捕获装置、执行处理的方法以及计算机可读介质
JP2019175912A (ja) * 2018-03-27 2019-10-10 ソニーセミコンダクタソリューションズ株式会社 撮像装置、及び、画像処理システム
CN108900751A (zh) * 2018-07-19 2018-11-27 维沃移动通信有限公司 一种图像传感器、移动终端及拍摄方法
US11405535B2 (en) * 2019-02-28 2022-08-02 Qualcomm Incorporated Quad color filter array camera sensor configurations
CN210093262U (zh) * 2019-06-12 2020-02-18 Oppo广东移动通信有限公司 移动终端
KR20230045461A (ko) * 2021-09-28 2023-04-04 삼성전자주식회사 화이트 밸런스를 수행하는 영상 획득 장치 및 이를 포함하는 전자 장치

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017171412A2 (ko) * 2016-03-30 2017-10-05 엘지전자 주식회사 이미지 처리 장치 및 이동 단말기
CN107040724A (zh) * 2017-04-28 2017-08-11 广东欧珀移动通信有限公司 双核对焦图像传感器及其对焦控制方法和成像装置
CN207354459U (zh) * 2017-10-19 2018-05-11 维沃移动通信有限公司 一种摄像头模组和移动终端
CN108271012A (zh) * 2017-12-29 2018-07-10 维沃移动通信有限公司 一种深度信息的获取方法、装置以及移动终端
CN108900772A (zh) * 2018-07-19 2018-11-27 维沃移动通信有限公司 一种移动终端及图像拍摄方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3826292A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112616009A (zh) * 2020-12-31 2021-04-06 维沃移动通信有限公司 电子设备及其摄像模组
CN112616009B (zh) * 2020-12-31 2022-08-02 维沃移动通信有限公司 电子设备及其摄像模组

Also Published As

Publication number Publication date
EP3826292A1 (en) 2021-05-26
US20220367550A1 (en) 2022-11-17
US20240014236A9 (en) 2024-01-11
EP3826292A4 (en) 2021-08-04
CN108900772A (zh) 2018-11-27

Similar Documents

Publication Publication Date Title
WO2020015627A1 (zh) 图像传感器、移动终端及图像拍摄方法
WO2020015626A1 (zh) 移动终端及图像拍摄方法
CN108900750B (zh) 一种图像传感器及移动终端
WO2020015532A1 (zh) 图像传感器、移动终端及拍摄方法
CN108965666B (zh) 一种移动终端及图像拍摄方法
JP5542247B2 (ja) 撮像素子及び撮像装置
WO2020015560A1 (zh) 图像传感器及移动终端
WO2019091426A1 (zh) 摄像头组件、图像获取方法及移动终端
WO2019144956A1 (zh) 图像传感器、镜头模组、移动终端、人脸识别方法及装置
CN110913144B (zh) 图像处理方法及摄像装置
US11996421B2 (en) Image sensor, mobile terminal, and image capturing method
CN108965703A (zh) 一种图像传感器、移动终端及图像拍摄方法
CN110248050B (zh) 一种摄像头模组及移动终端
CN115278000A (zh) 图像传感器、图像生成方法、摄像头模组及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19838874

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2019838874

Country of ref document: EP