WO2017101555A1 - Imaging method for image sensor, imaging device and electronic device - Google Patents

Imaging method for image sensor, imaging device and electronic device

Info

Publication number
WO2017101555A1
Authority
WO
WIPO (PCT)
Prior art keywords
photosensitive
pixels
image
frame
merged
Prior art date
Application number
PCT/CN2016/100649
Other languages
English (en)
French (fr)
Inventor
雷辉
Original Assignee
广东欧珀移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广东欧珀移动通信有限公司
Priority to MYPI2017702287A (MY189048A)
Priority to AU2016370337A (AU2016370337B2)
Priority to EP16874615.4A (EP3226538A4)
Priority to JP2017534320A (JP6564462B2)
Priority to KR1020177017185A (KR20170094242A)
Priority to US15/546,490 (US10225465B2)
Priority to SG11201705233RA
Priority to KR1020187036258A (KR102041068B1)
Publication of WO2017101555A1
Priority to ZA2017/05170A (ZA201705170B)
Priority to US15/823,358 (US9979883B2)

Classifications

    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14643Photodiode arrays; MOS imagers
    • H01L27/14645Colour imagers
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/1462Coatings
    • H01L27/14621Colour filter arrangements
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/14625Optical elements or arrangements associated with the device
    • H01L27/14627Microlenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0264Details of the structure or mounting of specific components for a camera module assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/12Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • H04N25/44Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
    • H04N25/447Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by preserving the colour pattern with or without loss of information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure
    • H04N25/57Control of the dynamic range
    • H04N25/58Control of the dynamic range involving two or more exposures
    • H04N25/581Control of the dynamic range involving two or more exposures acquired simultaneously
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/14618Containers
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14643Photodiode arrays; MOS imagers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2209/00Details of colour television systems
    • H04N2209/04Picture signal generators
    • H04N2209/041Picture signal generators using solid-state devices
    • H04N2209/042Picture signal generators using solid-state devices having a single pick-up sensor
    • H04N2209/045Picture signal generators using solid-state devices having a single pick-up sensor using mosaic colour filter
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2209/00Details of colour television systems
    • H04N2209/04Picture signal generators
    • H04N2209/041Picture signal generators using solid-state devices
    • H04N2209/042Picture signal generators using solid-state devices having a single pick-up sensor
    • H04N2209/045Picture signal generators using solid-state devices having a single pick-up sensor using mosaic colour filter
    • H04N2209/046Colour interpolation to calculate the missing colour values

Definitions

  • The present invention relates to the field of imaging technologies, and in particular to an imaging method for an image sensor, an imaging device, and an electronic device.
  • The multi-frame synthesis technique adopted by mobile phones in the related art requires multiple frames of data to be captured, so a long time is spent waiting for those frames.
  • In addition, if an object in the scene moves while the frames are being captured, ghosting is easily produced in the synthesized image.
  • The present invention aims to solve at least one of the technical problems in the related art, at least to some extent.
  • To this end, a first object of the present invention is to provide an imaging method for an image sensor that needs only a single frame output by the image sensor to obtain a high dynamic range image by synthesis. This greatly reduces the time spent waiting for data frames in multi-frame synthesis, and because the data used for synthesis comes from the same frame of the image sensor, ghosting can be prevented, which greatly improves the user experience.
  • A second object of the present invention is to provide an imaging device.
  • A third object of the present invention is to provide an electronic device.
  • A fourth object of the present invention is to provide a mobile terminal.
  • A fifth object of the present invention is to provide a non-volatile computer storage medium.
  • An embodiment of the first aspect of the present invention provides an imaging method for an image sensor. The image sensor includes a photosensitive pixel array and a filter disposed on the photosensitive pixel array; the filter includes an array of filter units, and each filter unit, together with the plurality of adjacent photosensitive pixels of the photosensitive pixel array that it covers, forms a merged pixel. The imaging method includes the following steps: reading the output of the photosensitive pixel array, and extracting and combining, from the read single-frame high-resolution image, the pixel values of photosensitive pixels belonging to different merged pixels to obtain multiple frames of low-resolution images; and synthesizing the multiple frames of low-resolution images.
  • An embodiment of the second aspect of the present invention provides an imaging device, including an image sensor and an image processing module connected to the image sensor. The image sensor includes a photosensitive pixel array and a filter disposed on the photosensitive pixel array; the filter includes an array of filter units, and each filter unit, together with the plurality of adjacent photosensitive pixels of the photosensitive pixel array that it covers, forms a merged pixel. The image processing module is configured to read the output of the photosensitive pixel array, extract and combine, from the read single-frame high-resolution image, the pixel values of photosensitive pixels belonging to different merged pixels to obtain multiple frames of low-resolution images, and synthesize the multiple frames of low-resolution images.
  • An embodiment of the third aspect of the present invention provides an electronic device comprising the imaging device of the embodiment of the second aspect of the present invention.
  • An embodiment of the fourth aspect of the present invention provides a mobile terminal, including a housing, a processor, a memory, a circuit board, a power supply circuit, and an image sensor. The circuit board is disposed inside the space enclosed by the housing, and the processor, the memory, and the image sensor are disposed on the circuit board. The power supply circuit is configured to supply power to the circuits or components of the mobile terminal. The image sensor includes a photosensitive pixel array and a filter disposed on the photosensitive pixel array; the filter includes an array of filter units, and each filter unit, together with the plurality of adjacent photosensitive pixels of the photosensitive pixel array that it covers, forms a merged pixel. The memory is configured to store executable program code, and the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform the following steps: reading the output of the photosensitive pixel array, and extracting and combining, from the read single-frame high-resolution image, the pixel values of photosensitive pixels belonging to different merged pixels to obtain multiple frames of low-resolution images; and synthesizing the multiple frames of low-resolution images.
  • An embodiment of the fifth aspect of the present invention provides a non-volatile computer storage medium storing one or more programs which, when executed by a device, cause the device to perform the imaging method of the image sensor according to the embodiment of the first aspect of the present invention.
  • With the present invention, only a single frame output by the image sensor is needed to obtain a high dynamic range image by synthesis, which greatly reduces the time spent waiting for data frames in multi-frame synthesis; and because the data used for multi-frame synthesis comes from the same frame of the image sensor, ghosting can be prevented, which greatly improves the user experience.
  • FIG. 1A is a flow chart of an imaging method for an image sensor according to an embodiment of the present invention.
  • FIG. 1B is a flow chart of an imaging method for an image sensor according to a specific embodiment of the present invention.
  • FIG. 1C is a flow chart of an imaging method for an image sensor according to a specific embodiment of the present invention.
  • FIG. 2 is a schematic diagram of obtaining multiple frames of low-resolution images according to an embodiment of the present invention.
  • FIG. 3 is a block diagram of an imaging device according to an embodiment of the present invention.
  • FIG. 4A is a schematic diagram of a filter unit array according to an embodiment of the present invention.
  • FIG. 4B is a schematic structural view of an image sensor according to an embodiment of the present invention.
  • FIG. 4C is a schematic structural view of an image sensor according to another embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a photosensitive pixel and its associated circuitry according to an embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
  • FIG. 1A is a flow chart of an imaging method for an image sensor according to an embodiment of the present invention. The image sensor used by the method is described first.
  • Specifically, the image sensor includes a photosensitive pixel array and a filter disposed on the photosensitive pixel array. The filter includes a filter unit array, and the filter unit array includes a plurality of filter units. Each filter unit, together with the plurality of adjacent photosensitive pixels of the photosensitive pixel array that it covers, forms a merged pixel.
  • In an embodiment of the present invention, four adjacent merged pixels form one merged pixel unit, and the four adjacently arranged filter units in each merged pixel unit include one red filter unit, one blue filter unit, and two green filter units.
  • Referring to FIG. 2 and FIG. 4B, each filter unit 1315, together with the four adjacent photosensitive pixels 111 of the photosensitive pixel array 11 that it covers, forms one merged pixel 14. Four adjacent merged pixels together form a merged pixel unit containing sixteen photosensitive pixels 111.
  • The four adjacent photosensitive pixels 111 share one filter unit 1315 of the same color. For example, the four adjacent photosensitive pixels Gr1, Gr2, Gr3, and Gr4 inside the dashed box in FIG. 2 correspond to a green filter unit 1315. In FIG. 2, Gr, R, B, and Gb identify the color of each filter unit 1315, and the numerals 1, 2, 3, and 4 identify the positions of the four adjacent photosensitive pixels 111 under that filter unit 1315. Specifically, R identifies a red filter unit 1315, B identifies a blue filter unit 1315, and Gr and Gb identify green filter units 1315.
  • The filter unit 1315 shared by the four adjacent photosensitive pixels 111 may be a unitary structure, or it may be assembled from four independent filters connected together. Preferably, the filter unit 1315 is a unitary structure in the embodiments of the present invention.
  • Referring again to FIG. 1A, the imaging method for an image sensor according to an embodiment of the present invention includes the following steps:
  • S1: read the output of the photosensitive pixel array, and extract and combine, from the read single-frame high-resolution image, the pixel values of photosensitive pixels belonging to different merged pixels to obtain multiple frames of low-resolution images.
  • In an embodiment of the present invention, this step specifically includes: extracting and combining, from the read single-frame high-resolution image, the pixel values of the photosensitive pixels located at the same position within different merged pixels to obtain the multiple frames of low-resolution images.
  • It can be understood that the positions of the pixels extracted from the single-frame high-resolution image may also be adjusted according to the requirements of the actual synthesized image; for example, the pixel values of photosensitive pixels at different positions within different merged pixels may be extracted from the read single-frame high-resolution image and combined to obtain the multiple frames of low-resolution images.
  • Referring to FIG. 2 and FIG. 4B, the pixel values of the photosensitive pixels 111 at the same position in four different merged pixels 14 are extracted from the single-frame high-resolution image and combined to obtain four frames of low-resolution images. For example, the four photosensitive pixels of the obtained first frame of the low-resolution image are all taken from the pixel values of the photosensitive pixels 111 corresponding to Gr1, R1, B1, and Gb1, i.e., the same position in the four filter units 1315 contained in four adjacent merged pixels 14 (a minimal illustrative sketch of this split appears below).
  • S2: synthesize the multiple frames of low-resolution images.
  • Specifically, the acquired multiple frames of low-resolution images are synthesized to generate a high dynamic range image.
  • The imaging method of the embodiments of the present invention needs only a single frame output by the image sensor to obtain a high dynamic range image by synthesis, which greatly reduces the time spent waiting for data frames in multi-frame synthesis; and because the data used for multi-frame synthesis comes from the same frame output of the image sensor, ghosting is prevented, which greatly improves the user experience.
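The extraction step can be made concrete with a minimal sketch, not taken from the patent: it assumes the raw readout is a plain 2D array, and the function name, shapes, and use of NumPy are purely illustrative.

```python
# Illustrative sketch only: split one single-frame high-resolution readout into
# n*n low-resolution sub-frames by taking, from every n*n merged pixel, the
# photosensitive pixel at the same position. Names and shapes are assumed.
import numpy as np

def split_merged_frame(raw, n=2):
    """raw: (H, W) array holding the single-frame high-resolution output;
    returns a list of n*n sub-frames, each of shape (H/n, W/n)."""
    h, w = raw.shape
    assert h % n == 0 and w % n == 0, "frame size must be a multiple of n"
    subs = []
    for dy in range(n):            # row position inside each merged pixel
        for dx in range(n):        # column position inside each merged pixel
            subs.append(raw[dy::n, dx::n])  # same position from every merged pixel
    return subs

# Example: a 16M-class frame (4608 x 3456) split with n = 2 gives four 4M frames.
raw = np.random.randint(0, 1024, size=(3456, 4608), dtype=np.uint16)
low_res = split_merged_frame(raw, n=2)
print(len(low_res), low_res[0].shape)   # -> 4 (1728, 2304)
```

Because each merged pixel unit contains one Gr, one R, one B, and one Gb filter unit, every sub-frame produced this way keeps a Bayer-like arrangement at merged-pixel resolution and can be processed like an ordinary low-resolution raw image.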
  • In a specific embodiment of the present invention, each filter unit, together with the adjacent n*n photosensitive pixels of the photosensitive pixel array that it covers, forms one merged pixel. As shown in FIG. 1B, the imaging method then specifically includes the following steps:
  • S101: read the output of the photosensitive pixel array, and extract and combine, from the read single-frame high-resolution image, the pixel values of the photosensitive pixels of adjacent merged pixels to obtain at least m frames of low-resolution images.
  • S102: synthesize the at least m frames of low-resolution images, where n and m are both natural numbers greater than 1 and m is less than or equal to n*n.
  • Specifically, since one merged pixel includes n*n photosensitive pixels, extracting and combining the pixel values of the photosensitive pixels of different merged pixels from the read single-frame high-resolution image yields at most n*n frames of low-resolution images, and at least m frames of low-resolution images can be acquired for multi-frame synthesis according to actual requirements.
  • In another specific embodiment of the present invention, each filter unit, together with the adjacent 2*2 photosensitive pixels of the photosensitive pixel array that it covers, forms one merged pixel. As shown in FIG. 1C, the imaging method then specifically includes the following steps:
  • S201: read the output of the photosensitive pixel array, and extract and combine, from the read single-frame high-resolution image, the pixel values of the photosensitive pixels of different merged pixels to obtain four frames of low-resolution images.
  • S202: synthesize the four frames of low-resolution images.
  • For example, assume a 16M image sensor has a frame rate of 8 frames per second in the dark. If 4 frames of data are used for multi-frame synthesis, the related-art approach requires the image sensor to output 4 frames, so the time spent waiting for data frames is 0.5 s. The imaging method of the embodiments of the present invention, by contrast, only needs the image sensor to output 1 frame of data: by extracting and combining the pixel values of the photosensitive pixels of different merged pixels from that single high-resolution frame, the frame can be divided into four 4M images, so the time spent waiting for data frames in multi-frame synthesis is only 0.125 s. This greatly reduces the waiting time and gives the user a better photographing experience (a small worked example of this timing appears below).
  • In addition, when the four frames of low-resolution images are synthesized, the four 4M images are separated from the same frame of the image sensor, so the differences between them are very small and ghosting can be reduced.
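As a quick check of the numbers in that example, a few lines of purely illustrative Python reproduce the waiting-time comparison (the frame rate and frame counts are the example values quoted above, not measured figures):

```python
# Purely illustrative arithmetic for the example above: time spent waiting for
# data frames at a given sensor frame rate.
frame_rate = 8.0           # frames per second in the dark (example value)
frames_related_art = 4     # related art: the sensor must output 4 frames
frames_this_method = 1     # this method: one frame is split into 4 images

print(frames_related_art / frame_rate)  # 0.5   seconds of waiting
print(frames_this_method / frame_rate)  # 0.125 seconds of waiting
```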
  • It can be understood that, besides n*n structures (for example 2*2, 3*3, or 4*4), the arrangement in which each filter unit covers a plurality of photosensitive pixels may even be any n*m structure (with n and m natural numbers). Because the number of photosensitive pixels that can be arranged on the photosensitive pixel array is limited, covering too many photosensitive pixels with each filter unit limits the resolution of the resulting low-resolution images. For example, if the photosensitive pixel array has 16M pixels, a 2*2 structure yields four low-resolution images of 4M resolution, whereas a 4*4 structure yields only sixteen low-resolution images of 1M resolution. The 2*2 structure is therefore a preferred arrangement, improving image brightness and sharpness while sacrificing as little resolution as possible (see the short sketch after this paragraph).
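The trade-off between the merge factor and the per-frame resolution can be tabulated with a short, purely illustrative snippet; the 16M figure is the example value used above, not a requirement of the method.

```python
# Purely illustrative: number of sub-frames and their resolution for different
# n*n merge structures on a 16M photosensitive pixel array.
total_pixels = 16_000_000
for n in (2, 3, 4):
    sub_frames = n * n
    per_frame = total_pixels / sub_frames
    print(f"{n}*{n}: {sub_frames} sub-frames of about {per_frame / 1e6:.1f}M each")
# 2*2: 4 sub-frames of about 4.0M each
# 3*3: 9 sub-frames of about 1.8M each
# 4*4: 16 sub-frames of about 1.0M each
```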
  • In an embodiment of the present invention, the image sensor further includes a lens array disposed above the filter units. The lens array includes a plurality of microlenses, each disposed to correspond to one photosensitive pixel, and is used to converge light onto the photosensitive portion of the photosensitive pixel below the filter, thereby increasing the light intensity received by the photosensitive pixel and improving image quality.
  • To implement the above embodiments, the present invention also provides an imaging device.
  • As shown in FIG. 3, an imaging device 100 according to an embodiment of the present invention includes an image sensor 10 and an image processing module 20 connected to the image sensor 10.
  • Referring to FIG. 4A and FIG. 4B, the image sensor 10 includes a photosensitive pixel array 11 and a filter 13 disposed above the photosensitive pixel array 11. The filter 13 includes a filter unit array 131, and the filter unit array 131 includes a plurality of filter units 1315. Each filter unit 1315, together with the plurality of adjacently arranged photosensitive pixels 111 located under it, forms one merged pixel 14. In an embodiment of the present invention, four adjacent merged pixels 14 form one merged pixel unit (not shown), and the adjacently arranged filter units 1315 in each merged pixel unit include one red filter unit 1315, one blue filter unit 1315, and two green filter units 1315.
  • Taking as an example the case in which each filter unit 1315 covers the four adjacent photosensitive pixels 111 numbered 1, 2, 3, and 4 in the photosensitive pixel array 11, as shown in FIG. 4B, each filter unit 1315 and the four adjacently arranged photosensitive pixels 111 located under it together form one merged pixel 14, and four adjacent merged pixels 14 form a merged pixel unit containing sixteen photosensitive pixels 111.
  • The four adjacent photosensitive pixels 111 share one filter unit 1315 of the same color, and four adjacent filter units 1315 (one red filter unit 1315, one blue filter unit 1315, and two green filter units 1315) together form a set of filter structures 1313.
  • The filter unit 1315 shared by the four adjacent photosensitive pixels 111 may be a unitary structure, or it may be assembled from four independent filters connected together. Preferably, the filter unit 1315 shared by the four adjacent photosensitive pixels 111 is a unitary structure (see FIG. 4B).
  • The image processing module 20 is configured to read the output of the photosensitive pixel array 11, extract and combine, from the read single-frame high-resolution image, the pixel values of the photosensitive pixels 111 of different merged pixels to obtain multiple frames of low-resolution images, and synthesize the multiple frames of low-resolution images.
  • In an embodiment of the present invention, the image processing module 20 is specifically configured to extract and combine, from the read single-frame high-resolution image, the pixel values of the photosensitive pixels 111 at the same position within different merged pixels to obtain the multiple frames of low-resolution images.
  • It can be understood that the positions of the pixels extracted from the single-frame high-resolution image may also be adjusted according to the requirements of the actual synthesized image; for example, the image processing module 20 may extract and combine the pixel values of photosensitive pixels at different positions within different merged pixels to obtain the multiple frames of low-resolution images.
  • Referring to FIG. 2 and FIG. 4B, the image processing module 20 extracts the pixel values of the photosensitive pixels 111 at the same position in four different merged pixels 14 from the single-frame high-resolution image and combines them to obtain four frames of low-resolution images. For example, the four photosensitive pixels of the obtained first frame of the low-resolution image are all taken from the pixel values of the photosensitive pixels 111 corresponding to Gr1, R1, B1, and Gb1 at the same position in the four filter units 1315 contained in four adjacent merged pixels 14.
  • Further, the image processing module 20 synthesizes the acquired multiple frames of low-resolution images to generate a high dynamic range image (an illustrative synthesis sketch follows below).
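The text describes synthesizing the low-resolution frames into a high dynamic range image without fixing a particular algorithm, so the following is only one generic possibility: a minimal, exposure-fusion-style weighted average of the sub-frames, with the function name, weighting choices, and value range all assumed for illustration.

```python
# Illustrative sketch only: one simple way to fuse several low-resolution frames
# taken from the same sensor readout into a single result, weighting each pixel
# by how well exposed it is. This is not the algorithm specified by the text.
import numpy as np

def fuse_frames(frames, max_value=1023.0, eps=1e-6):
    """frames: list of (H, W) arrays with identical shape; returns a float image in [0, 1]."""
    stack = np.stack([f.astype(np.float64) / max_value for f in frames])  # normalize to [0, 1]
    # Gaussian "well-exposedness" weight: favors mid-range values and down-weights
    # pixels that are nearly black or nearly saturated.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2)) + eps
    return (weights * stack).sum(axis=0) / weights.sum(axis=0)

# Example with the four sub-frames produced by split_merged_frame() above:
# hdr = fuse_frames(low_res)
```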
  • In the imaging device of the embodiments of the present invention, the image processing module needs only a single frame output by the image sensor to obtain a high dynamic range image by synthesis, which greatly reduces the time spent waiting for data frames in multi-frame synthesis; and because the data used for multi-frame synthesis comes from the same frame output of the image sensor, ghosting is prevented, which greatly improves the user experience.
  • In an embodiment of the present invention, each filter unit 1315, together with the adjacent n*n photosensitive pixels 111 of the photosensitive pixel array 11 that it covers, forms one merged pixel. The image processing module 20 is then specifically configured to read the output of the photosensitive pixel array 11, extract and combine, from the read single-frame high-resolution image, the pixel values of the photosensitive pixels 111 of adjacent merged pixels to obtain at least m frames of low-resolution images, and synthesize the at least m frames of low-resolution images, where n and m are both natural numbers greater than 1 and m is less than or equal to n*n.
  • In an embodiment of the present invention, each filter unit 1315, together with the adjacent 2*2 photosensitive pixels 111 of the photosensitive pixel array 11 that it covers, forms one merged pixel 14. The image processing module 20 is then specifically configured to read the output of the photosensitive pixel array 11, extract and combine, from the read single-frame high-resolution image, the pixel values of the photosensitive pixels 111 of different merged pixels to obtain four frames of low-resolution images, and synthesize the four frames of low-resolution images.
  • For example, assume again that a 16M image sensor has a frame rate of 8 frames per second in the dark. If 4 frames of data are used for multi-frame synthesis, the related-art approach requires the image sensor to output 4 frames, so the time spent waiting for data frames is 0.5 s. With the imaging device of the embodiments of the present invention, the image sensor only needs to output 1 frame of data; the image processing module 20 extracts and combines the pixel values of the photosensitive pixels 111 of different merged pixels from that single high-resolution frame, dividing the frame into four 4M images, so the time spent waiting for data frames in multi-frame synthesis is only 0.125 s. This greatly reduces the waiting time and gives the user a better photographing experience.
  • In addition, when the four frames of low-resolution images are synthesized, the four 4M images are separated from the same frame of the image sensor, so the differences between them are very small and ghosting can be reduced.
  • It can be understood that, besides n*n structures (for example 2*2, 3*3, or 4*4), the arrangement in which each filter unit 1315 covers a plurality of photosensitive pixels 111 may even be any n*m structure (with n and m natural numbers). Because the number of photosensitive pixels 111 that can be arranged on the photosensitive pixel array 11 is limited, covering too many photosensitive pixels 111 with each filter unit 1315 limits the resolution of the resulting low-resolution images. For example, if the photosensitive pixel array 11 has 16M pixels, a 2*2 structure yields four low-resolution images of 4M resolution, whereas a 4*4 structure yields only sixteen low-resolution images of 1M resolution. The 2*2 structure is therefore a preferred arrangement, improving image brightness and sharpness while sacrificing as little resolution as possible.
  • Referring to FIG. 4C, in an embodiment of the present invention, each merged pixel 14 of the image sensor 10 further includes a lens array 15 disposed above the filter unit 1315. Each microlens 151 of the lens array 15 corresponds to one photosensitive pixel 111 in shape, size, and position. The microlens 151 is used to converge light onto the photosensitive portion 112 of the photosensitive pixel 111, increasing the light intensity received by the photosensitive pixel 111 and thereby improving image quality. In some implementations, each filter unit 1315 corresponds to 2*2 photosensitive pixels 111 and 2*2 microlenses 151.
  • Referring to FIG. 5, which is a schematic diagram of a photosensitive pixel and its associated circuitry, in an embodiment of the present invention the photosensitive pixel 111 includes a photodiode 1113. The connections between the photosensitive pixel 111 and the switch transistor 1115, the source follower 1117, and the analog-to-digital converter 17 are as shown in FIG. 5; that is, each photosensitive pixel 111 uses one source follower 1117 and one analog-to-digital converter 17.
  • The photodiode 1113 converts illumination into charge, and the generated charge is proportional to the illumination intensity. The switch transistor 1115 controls whether the circuit conducts according to control signals from the row selection logic unit 41 and the column selection logic unit 43. When the circuit conducts, the source follower 1117 converts the charge signal generated by the photodiode 1113 under illumination into a voltage signal, and the analog-to-digital converter 17 converts the voltage signal into a digital signal and transmits it to the image processing module 20 for processing (a toy numerical model of this readout chain is sketched below).
  • The row selection logic unit 41 and the column selection logic unit 43 are connected to, and controlled by, a control module (not shown) of the imaging device 100.
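As a purely numerical illustration of that charge-to-digital chain: all constants, gains, and the bit depth below are invented example values, not figures taken from the text.

```python
# Toy numerical model of the readout chain described above: photodiode charge,
# proportional to illumination, is converted to a voltage by the source follower
# and then quantized by the analog-to-digital converter. All constants are
# invented example values.
def read_pixel(illumination, exposure_s=0.01, charge_per_lux_s=5_000.0,
               volts_per_electron=5e-6, vref=1.0, bits=10):
    charge = illumination * exposure_s * charge_per_lux_s      # electrons, proportional to illumination
    voltage = charge * volts_per_electron                      # source-follower output
    code = round(min(voltage, vref) / vref * (2 ** bits - 1))  # ADC quantization with clipping
    return code

print(read_pixel(10.0))    # dim pixel    -> small digital code
print(read_pixel(5000.0))  # bright pixel -> clipped at 1023
```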
  • In the imaging device of the embodiments of the present invention, the image processing module needs only a single frame output by the image sensor to obtain a high dynamic range image by synthesis, which greatly reduces the time spent waiting for data frames in multi-frame synthesis; and because the data used for multi-frame synthesis comes from the same frame of the image sensor, ghosting is prevented, which greatly improves the user experience.
  • To implement the above embodiments, the present invention also provides an electronic device. The electronic device includes the imaging device of the embodiments of the present invention.
  • Because the electronic device of the embodiments of the present invention has this imaging device, only a single frame output by the image sensor is needed when shooting to obtain a high dynamic range image by synthesis, which greatly reduces the time spent waiting for data frames in multi-frame synthesis; and because the data used for multi-frame synthesis comes from the same frame of the image sensor, ghosting is prevented, which greatly improves the user experience.
  • To implement the above embodiments, the present invention also provides a mobile terminal. FIG. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
  • As shown in FIG. 6, the mobile terminal 60 of this embodiment includes a housing 601, a processor 602, a memory 603, a circuit board 604, a power supply circuit 605, and an image sensor 606. The circuit board 604 is disposed inside the space enclosed by the housing 601, and the processor 602, the memory 603, and the image sensor 606 are disposed on the circuit board 604. The power supply circuit 605 is used to supply power to the circuits or components of the mobile terminal 60. The image sensor 606 includes a photosensitive pixel array and a filter disposed on the photosensitive pixel array; the filter includes an array of filter units, and each filter unit, together with the plurality of adjacent photosensitive pixels of the photosensitive pixel array that it covers, forms a merged pixel. The memory 603 stores executable program code, and the processor 602 runs a program corresponding to the executable program code by reading the executable program code stored in the memory 603, so as to perform the following steps: reading the output of the photosensitive pixel array, and extracting and combining, from the read single-frame high-resolution image, the pixel values of photosensitive pixels belonging to different merged pixels to obtain multiple frames of low-resolution images; and synthesizing the multiple frames of low-resolution images.
  • In this way, the mobile terminal of the embodiments of the present invention needs only a single frame output by the image sensor to obtain a high dynamic range image by synthesis, which greatly reduces the time spent waiting for data frames in multi-frame synthesis; and because the data used for multi-frame synthesis comes from the same frame of the image sensor, ghosting can be prevented, which greatly improves the user experience.
  • To implement the above embodiments, the present invention also provides a non-volatile computer storage medium storing one or more programs which, when executed by a device, cause the device to perform the imaging method of the image sensor according to the embodiments of the first aspect of the present invention.
  • In the description of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features referred to. A feature defined as "first" or "second" may therefore explicitly or implicitly include at least one such feature. Unless specifically defined otherwise, "a plurality" means at least two, for example two or three.
  • Unless otherwise explicitly specified and defined, the terms "mounted", "connected", "coupled", "fixed", and the like shall be understood broadly; for example, a connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, internal to two elements, or an interaction between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific circumstances.
  • In the present invention, unless otherwise explicitly specified and defined, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are in indirect contact through an intermediate medium. Moreover, a first feature being "on", "above", or "over" a second feature may mean that the first feature is directly above or obliquely above the second feature, or merely that the first feature is at a higher level than the second feature; and a first feature being "under", "below", or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or merely that the first feature is at a lower level than the second feature.
  • For the purposes of this specification, a "computer-readable medium" can be any apparatus that can contain, store, communicate, propagate, or transport a program for use by, or in connection with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of computer-readable media include: an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium may even be paper or another suitable medium on which the program is printed, because the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
  • It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented with software or firmware that is stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented by any one of, or a combination of, the following techniques known in the art: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and so on.
  • In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist physically on its own, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

Disclosed are an imaging method for an image sensor, an imaging device, and an electronic device. The image sensor includes a photosensitive pixel array and a filter disposed on the photosensitive pixel array; the filter includes an array of filter units, and each filter unit, together with the plurality of adjacent photosensitive pixels of the photosensitive pixel array that it covers, forms a merged pixel. The imaging method includes: reading the output of the photosensitive pixel array, and extracting and combining, from the read single-frame high-resolution image, the pixel values of photosensitive pixels belonging to different merged pixels to obtain multiple frames of low-resolution images; and synthesizing the multiple frames of low-resolution images. This imaging method needs only a single frame output by the image sensor to obtain a high dynamic range image by synthesis, which both reduces the time spent waiting for data frames and prevents ghosting.

Description

图像传感器的成像方法、成像装置和电子装置
相关申请的交叉引用
本申请要求广东欧珀移动通信有限公司于2015年12月18日提交的、发明名称为“图像传感器的成像方法、成像装置和电子装置”的、中国专利申请号“201510963341.3”的优先权。
技术领域
本发明涉及成像技术领域,尤其涉及一种图像传感器的成像方法、成像装置和电子装置。
背景技术
目前,手机拍照功能的多样化赢得了广大用户的喜爱,很多手机在拍照时采用了多帧合成技术,即让手机的图像传感器连续出几张图像,再由软件来合成,以达到不同的拍摄效果(例如HDR、夜景效果等),以丰富使用体验。
但是,相关技术中的手机所采用的多帧合成技术,由于需要获取多帧数据,存在等待多帧数据所需时间较长的问题。另外,如果在拍摄多帧数据时画面中有物体在移动,那么合成之后很容易产生鬼影。
发明内容
本发明旨在至少在一定程度上解决相关技术中的技术问题之一。为此,本发明的一个目的在于提出一种图像传感器的成像方法,该成像方法只需要图像传感器的一帧输出,就能通过合成的方式获得高动态范围的图像,从而大大减少了多帧合成中等待数据帧的时间,又由于用于多帧合成的数据来自图像传感器的同一帧,从而能够防止鬼影的产生,进而大大提升了用户体验。
本发明的第二个目的在于提出一种成像装置。
本发明的第三个目的在于提出一种电子装置。
本发明的第四个目的在于提出一种移动终端。
本发明的第五个目的在于提出一种非易失性计算机存储介质。
为了实现上述目的,本发明第一方面实施例提出了一种图像传感器的成像方法,所述图像传感器包括感光像素阵列及设置在所述感光像素阵列上的滤光片,所述滤光片包括滤光单元阵列,每个所述滤光单元和该滤光单元所覆盖的感光像素阵列中相邻的多个感光像素共同构成一合并像素,所述成像方法包括以下步骤:读取所述感光像素阵列的输出,从读取的单帧高分辨率图像中抽取不同合并像素的感光像素的像素值进行组合,以获得多帧低分辨率的图像;对所述多帧低分辨率的图像进行合成。
本发明第二方面实施例提出了一种成像装置,图像传感器,所述图像传感器包括:感光像素阵列;设置于所述感光像素阵列上的滤光片,所述滤光片包括滤光单元阵列,每个滤光单元和该滤光单元所覆盖的感光像素阵列中相邻的多个所述感光像素共同构成一合并像素;以及与所述图像传感器相连的图像处理模块,所述图像处理模块用于读取所述感光像素阵列的输出,并从读取的单帧高分辨率图像中抽取不同合并像素的感光像素的像素值进行组合,以获得多帧低分辨率的图像,以及对所述多帧低分辨率的图像进行合成。
本发明第三方面实施例提出了一种电子装置,包括本发明第二方面实施例的成像装置。
本发明第四方面实施例提出了一种移动终端,包括壳体、处理器、存储器、电路板、电源电路和图像传感器,其中,所述电路板安置在所述壳体围成的空间内部,所述处理器、所述存储器和所述图像传感器设置在所述电路板上;所述电源电路,用于为所述移动终端的各个电路或器件供电;所述图像传感器包括感光像素阵列及设置在所述感光像素阵列上的滤光片,所述滤光片包括滤光单元阵列,每个所述滤光单元和该滤光单元所覆盖的感光像素阵列中相邻的多个感光像素共同构成一合并像素;所述存储器用于存储可执行程序代码;所述处理器通过读取所述存储器中存储的可执行程序代码来运行与所述可执行程序代码对应的程序,以用于执行以下步骤:
读取所述感光像素阵列的输出,从读取的单帧高分辨率图像中抽取不同合并像素的感光像素的像素值进行组合,以获得多帧低分辨率的图像;
对所述多帧低分辨率的图像进行合成。
本发明第五方面实施例提供了一种非易失性计算机存储介质,所述计算机存储介质存储有一个或者多个程序,当所述一个或者多个程序被一个设备执行时,使得所述设备执行以本发明第一方面实施例的图像传感器的成像方法。
本发明实现了只需要图像传感器的一帧输出,就能通过合成的方式获得高动态范围的图像,从而大大减少了多帧合成中等待数据帧的时间,又由于用于多帧合成的数据来自图 像传感器的同一帧,从而可以防止鬼影的产生,进而大大提升了用户体验。
附图说明
图1A是根据本发明一个实施例的图像传感器的成像方法的流程图;
图1B是根据本发明一个具体实施例的图像传感器的成像方法的流程图;
图1C是根据本发明一个具体实施例的图像传感器的成像方法的流程图;
图2是根据本发明一个具体实施例获得多帧低分辨率的图像的原理图;
图3是根据本发明一个实施例的成像装置的方框示意图;
图4A是根据本发明一个实施例的滤光单元阵列的示意图;
图4B是根据本发明一个实施例的图像传感器的结构示意图;
图4C是根据本发明另一个实施例的图像传感器的结构示意图;
图5是根据本发明一个实施例的感光像素及相关电路的示意图;
图6是根据本发明一个实施例的终端设备的结构示意图。
具体实施方式
下面详细描述本发明的实施例,所述实施例的示例在附图中示出,其中自始至终相同或类似的标号表示相同或类似的元件或具有相同或类似功能的元件。下面通过参考附图描述的实施例是示例性的,旨在用于解释本发明,而不能理解为对本发明的限制。
下面参考附图描述本发明实施例的图像传感器的成像方法、成像装置和电子装置。
图1A是根据本发明一个实施例的图像传感器的成像方法的流程图。
首先对本发明实施例的方法所采用的图像传感器进行说明。
具体地,图像传感器包括感光像素阵列及设置在感光像素阵列上的滤光片,滤光片包括滤光单元阵列,滤光单元阵列包括多个滤光单元,每个滤光单元和该滤光单元所覆盖感光像素阵列中相邻的多个感光像素共同构成一合并像素。
在本发明的一个实施例中,相邻的四个合并像素构成一个合并像素单元,每个合并像素单元中相邻排布的四个滤光单元包括一个红色滤光单元、一个蓝色滤光单元和两个绿色滤光单元。
请同时参阅图2和图4B,每个滤光单元1315和该滤光单元1315所覆盖感光像素阵列11中相邻的四个感光像素111共同构成一个合并像素14。相邻的四个合并像素共同组成一个包括十六个感光像素111的合并像素单元。
相邻的四个感光像素111共用一个同颜色的滤光单元1315,例如图2中虚线框内相邻 的四个感光像素Gr1、Gr2、Gr3和Gr4对应绿色的滤光单元1315。在图2中,Gr、R、B、Gb分别用于标识滤光单元1315的颜色,数字1,2,3,4用于标识滤光单元1315下方相邻的四个感光像素111的位置。具体地,R用于标识红色的滤光单元1315,B用于标识蓝色的滤光单元1315,Gr、Gb用于标识绿色的滤光单元1315。
相邻的四个感光像素111所共用的滤光单元1315可以为一体构造,或者由四个独立的滤光片组装连接在一起。优选地,本发明实施例中滤光单元1315为一体构造。
请再次参阅图1A,本发明实施例的图像传感器的成像方法,包括以下步骤:
S1,读取感光像素阵列的输出,从读取的单帧高分辨率图像中抽取不同合并像素的感光像素的像素值进行组合,以获得多帧低分辨率的图像。
在本发明的一个实施例中,从读取的单帧高分辨率图像中抽取不同合并像素的感光像素的像素值进行组合,以获得多帧低分辨率的图像,具体包括:从读取的单帧高分辨率图像中抽取不同合并像素的相同位置的感光像素的像素值进行组合,以获得多帧低分辨率的图像。
可以理解地,本发明实施例中对单帧高分辨率图像中抽取像素位置也可以根据实际合成图像的需求作调整,比如:从读取的单帧高分辨率图像中抽取不同合并像素的不同位置的感光像素的像素值进行组合,以获得多帧低分辨率的图像。
请一并参阅图2和图4B,从单帧高分辨率图像中分别抽取4个不同合并像素14的相同位置的感光像素111的像素值进行组合,以获得4帧低分辨率的图像。举例来说,所获得的第一帧低分辨率的图像的四个感光像素111均抽取自相邻的四个合并像素14所包含的四个滤光单元1315相同位置处Gr1,R1,B1,Gb1对应的感光像素111的像素值。
S2,对多帧低分辨率的图像进行合成。
具体地,将获取到的多帧低分辨率的图像进行合成,以生成高动态范围的图像。
本发明实施例的成像方法,只需要图像传感器的一帧输出,就能通过合成的方式获得高动态范围的图像,从而大大减少多帧合成中等待数据帧的时间,又由于用于多帧合成的数据来自图像传感器的同一帧输出,从而防止鬼影的产生,进而大大提升了用户体验。
在本发明的一个具体实施例中,每个滤光单元和该滤光单元所覆盖的感光像素阵列中相邻的n*n个感光像素共同构成一个合并像素。如图1B所示,图像传感器的成像方法具体包括以下步骤:
S101,读取感光像素阵列的输出,从读取的单帧高分辨率图像中抽取相邻合并像素的感光像素的像素值进行组合,以获得至少m帧低分辨率的图像。
S102,对至少m帧低分辨率的图像进行合成;其中,n,m均为大于1的自然数,m取值小于等于n*n。
具体地,由于一个合并像素包括n*n个感光像素,那么从读取的单帧高分辨率图像中抽取不同合并像素的感光像素的像素值进行组合,最多可以获得n*n帧低分辨率的图像,根据实际需求,可以获取至少m帧低分辨率的图像用于多帧合成。
在本发明的一个具体实施例中,每个滤光单元和该滤光单元所覆盖的感光像素阵列中相邻的2*2个感光像素共同构成一个合并像素。如图1C所示,成像方法具体包括以下步骤:
S201,读取感光像素阵列的输出,从读取的单帧高分辨率图像中抽取不同合并像素的感光像素的像素值进行组合,以获得4帧低分辨率的图像。
S202,对4帧低分辨率的图像进行合成。
举例来讲,假设16M图像传感器在暗处的帧率为8帧,若采用4帧数据进行多帧合成,那么相关技术中的多帧合成方式需要图像传感器输出4帧数据,也就是说多帧合成中等待数据帧的时间为0.5s;而本发明实施例的图像传感器的成像方法,则只需要图像传感器输出1帧数据,经过从读取的该1帧高分辨率图像中抽取不同合并像素的感光像素的像素值进行组合,就能将该1帧数据分成4张4M的图像,也就是说多帧合成中等待数据帧的时间仅需要0.125s,从而大大减少了多帧合成中等待数据帧的时间,从而给用户带来更好的拍照体验。
另外,在对4帧低分辨率的图像进行合成时,由于4帧4M的图像是从图像传感器的同一帧图像中分离出来的,差异很小,从而可以减小鬼影的产生。
可以理解的是,每个滤光单元覆盖多个感光像素的结构除了n*n(例如2*2、3*3、4*4)结构外,甚至可以是任意n*m结构(n,m为自然数)。由于感光像素阵列上可排列的感光像素的数目是有限的,每个滤光单元所覆盖的感光像素过多的话,所获得的低分辨率的图像的分辨率大小会受到限制,如,若感光像素阵列的像素值为16M,采用2*2结构会获得4张分辨率为4M的低分辨率的图像,而采用4*4结构就只能得到16张分辨率为1M的低分辨率的图像。因此2*2结构是一个较佳排列方式,在尽量少牺牲分辨率的前提下提升图像亮度及清晰度。
在本发明的一个实施例中,图像传感器还包括设置在滤光单元上方的透镜阵列,透镜阵列包括多个微透镜,每个微透镜与一个感光像素对应设置,该透镜阵列用于将光线汇聚到滤光片下方的感光像素的感光部分,从而提升感光像素的受光强度以改善图像画质。
为了实现上述实施例,本发明还提出了一种成像装置。
图3是根据本发明一个实施例的成像装置的方框示意图。如图3所示,本发明实施例的成像装置100,包括:图像传感器10和与图像传感器10相连的图像处理模块20。
请一并参阅图4A和图4B,图像传感器10包括感光像素阵列11及设置于感光像素阵列11上方的滤光片13。该滤光片13包括滤光单元阵列131,该滤光单元阵列131中包括 多个滤光单元1315。每个滤光单元1315和位于该滤光单元1315下方的相邻排布的多个感光像素111共同构成一个合并像素14。在本发明的一个实施例中,四个相邻的合并像素14构成一个合并像素单元(图未示)。每个合并像素单元中相邻排布的多个滤光单元1315包括一个红色滤光单元1315、一个蓝色滤光单元1315和两个绿色滤光单元1315。
以每个滤光单元1315覆盖感光像素阵列11中相邻的编号为1,2,3,4的四个感光像素111为例,如图4B所示,每个滤光单元1315和位于该滤光单元1315下方的相邻排布的四个感光像素111共同构成一个合并像素14。相邻的四个合并像素14组成一个包含十六个感光像素111的合并像素单元。
相邻的四个感光像素111共用一个同颜色的滤光单元1315,而相邻的四个滤光单元1315(包括一个红色滤光单元1315、一个蓝色滤光单元1315和两个绿色滤光单元1315)共同构成一组滤光结构1313。
其中,相邻的四个感光像素111所共用的滤光单元1315可以为一体构造,或者由四个独立的滤光片组装连接在一起。优选地,四个相邻的感光像素111所共用的滤光单元1315为一体构造(请参阅图4B)。
图像处理模块20用于读取感光像素阵列11的输出,并从读取的单帧高分辨率图像中抽取不同合并像素的感光像素111的像素值进行组合,以获得多帧低分辨率的图像,以及对多帧低分辨率的图像进行合成。
在本发明的一个实施例中,图像处理模块20具体用于:从读取的单帧高分辨率图像中抽取不同合并像素的相同位置的感光像素111的像素值进行组合,以获得多帧低分辨率的图像。
可以理解地,本发明实施例中对单帧高分辨率图像中抽取像素位置也可以根据实际合成图像的需求作调整,比如:图像处理模块20从读取的单帧高分辨率图像中抽取不同合并像素的不同位置的感光像素的像素值进行组合,以获得多帧低分辨率的图像。
请一并参阅图2和图4B,图像处理模块20从单帧高分辨率图像中抽取4个不同合并像素14的相同位置的感光像素111的像素值进行组合,以获得4帧低分辨率的图像。举例来说,所获得的第一帧低分辨率的图像的四个感光像素111均抽取自相邻的四个合并像素14所包含的四个滤光单元1315相同位置处Gr1,R1,B1,Gb1对应的感光像素111的像素值。
进一步地,图像处理模块20将获取到的多帧低分辨率的图像进行合成,以生成高动态范围的图像。
本发明实施例的成像装置,图像处理模块只需要图像传感器的一帧输出,就能通过合成的方式获得高动态范围的图像,从而大大减少多帧合成中等待数据帧的时间,又由于用于多帧合成的数据来自图像传感器的同一帧输出,从而防止鬼影的产生,进而大大提升了 用户体验。
在本发明的一个实施例中,每个滤光单元1315和该滤光单元1315所覆盖的感光像素阵列11中相邻的n*n个感光像素111共同构成一个合并像素,图像处理模块20具体用于:读取感光像素阵列11的输出,从读取的单帧高分辨率图像中抽取相邻合并像素的感光像素111的像素值进行组合,以获得至少m帧低分辨率的图像,并对至少m帧低分辨率的图像进行合成;其中,n,m均为大于1的自然数,m取值小于等于n*n。
在本发明的一个实施例中,每个滤光单元1315和该滤光单元1315所覆盖的感光像素阵列11中相邻的2*2个感光像素111共同构成一个合并像素14,图像处理模块20具体用于:读取感光像素阵列11的输出,从读取的单帧高分辨率图像中抽取不同合并像素的感光像素111的像素值进行组合,以获得4帧低分辨率的图像,并对4帧低分辨率的图像进行合成。
举例来讲,假设16M图像传感器在暗处的帧率为8帧,若采用4帧数据进行多帧合成,那么相关技术中的多帧合成方式需要图像传感器输出4帧数据,也就是说多帧合成中等待数据帧的时间为0.5s;而本发明实施例的成像装置,则只需要图像传感器输出1帧数据,图像处理模块20从该1帧高分辨率图像中抽取不同合并像素的感光像素111的像素值进行组合,就能将该1帧数据分成4张4M的图像,也就是说多帧合成中等待数据帧的时间仅需要0.125s,从而大大减少了多帧合成中等待数据帧的时间,从而给用户带来更好的拍照体验。
另外,在对4帧低分辨率的图像进行合成时,由于4帧4M的图像是从图像传感器的同一帧图像中分离出来的,差异很小,从而可以减小鬼影的产生。
可以理解的是,每个滤光单元1315覆盖多个感光像素111的结构除了n*n(例如2*2、3*3、4*4)结构外,甚至可以是任意n*m结构(n,m为自然数)。由于感光像素阵列11上可排列的感光像素111的数目是有限的,每个滤光单元1315所覆盖的感光像素111过多的话,所获得的低分辨率的图像的分辨率大小会受到限制,如,若感光像素阵列11的像素值为16M,采用2*2结构会获得4张分辨率为4M的低分辨率的图像,而采用4*4结构就只能得到16张分辨率为1M的低分辨率的图像。因此2*2结构是一个较佳排列方式,在尽量少牺牲分辨率的前提下提升图像亮度及清晰度。
请参阅图4C,在本发明的一个实施例中,图像传感器10的每个合并像素14还包括设置在滤光单元1315上方的透镜阵列15。该透镜阵列15上的每个微透镜151与一个感光像素111对应,包括形状、大小、位置对应。微透镜151用于将光线汇聚到感光像素111的感光部分112上,以提升感光像素111的受光强度,从而改善成像画质。在某些实施方式中,每个滤光单元1315对应2*2个感光像素111及2*2个微透镜151。
请参阅图5,图5为感光像素及相关电路的示意图。在本发明的一个实施例中,感光像素111包括光电二极管1113。感光像素111与开关管1115、源极跟随器1117(source follower)和模数转换器17(anaolog-to-digital converter)的连接关系如图5所示。即一个感光像素111对应采用一个源极跟随器1117和一个模数转换器17。
其中,光电二极管1113用于将光照转化为电荷,且产生的电荷与光照强度成比例关系;开关管1115用于根据行选择逻辑单元41及列选择逻辑单元43的控制信号来控制电路的导通及断开,当电路导通时,源极跟随器1117用于将光电二极管1113经光照产生的电荷信号转化为电压信号。模数转换器17用于将电压信号转换为数字信号,并传输至图像处理模块20进行处理。其中,行选择逻辑单元41及列选择逻辑单元43与成像装置100的控制模块(图未示)相连,并由成像装置100的控制模块控制。
本发明实施例的成像装置,图像处理模块只需要获得图像传感器的一帧输出,就能通过合成的方式获得高动态范围的图像,从而大大减少了多帧合成中等待数据帧的时间,又由于用于多帧合成的数据来自图像传感器的同一帧,从而防止鬼影的产生,进而大大提升了用户体验。
为了实现上述实施例,本发明还提出了一种电子装置。该电子装置包括本发明实施例的成像装置。
本发明实施例的电子装置,由于具有了该成像装置,在拍摄时只需要图像传感器的一帧输出,就能通过合成的方式获得高动态范围的图像,从而大大减少了多帧合成中等待数据帧的时间,又由于用于多帧合成的数据来自图像传感器的同一帧,从而防止鬼影的产生,进而大大提升了用户体验。
为了实现上述实施例,本发明还提出一种移动终端。
参见图6,图6是根据本发明一个实施例的终端设备的结构示意图。
如图6所示,本实施例提出的移动终端60包括:壳体601、处理器602、存储器603、电路板604、电源电路605和图像传感器606,其中,电路板604安置在壳体601围成的空间内部,处理器602、存储器603和图像传感器606设置在电路板604上;电源电路605,用于为移动终端60的各个电路或器件供电;图像传感器606包括感光像素阵列及设置在感光像素阵列上的滤光片,滤光片包括滤光单元阵列,每个滤光单元和该滤光单元所覆盖的感光像素阵列中相邻的多个感光像素共同构成一合并像素;存储器603用于存储可执行程序代码;处理器602通过读取存储器603中存储的可执行程序代码来运行与可执行程序代码对应的程序,以用于执行以下步骤:
读取感光像素阵列的输出,从读取的单帧高分辨率图像中抽取不同合并像素的感光像素的像素值进行组合,以获得多帧低分辨率的图像;
对多帧低分辨率的图像进行合成。
需要说明的是,前述对图像传感器的成像方法实施例的解释说明也适用于该实施例的移动终端,其实现原理类似,此处不再赘述。
本发明实施例的移动终端,通过处理器读取存储器中存储的可执行程序代码来运行与可执行程序代码对应的程序,以用于执行以下步骤:读取感光像素阵列的输出,从读取的单帧高分辨率图像中抽取不同合并像素的感光像素的像素值进行组合,以获得多帧低分辨率的图像,对多帧低分辨率的图像进行合成。由此,实现了只需要图像传感器的一帧输出,就能通过合成的方式获得高动态范围的图像,从而大大减少了多帧合成中等待数据帧的时间,又由于用于多帧合成的数据来自图像传感器的同一帧,从而可以防止鬼影的产生,进而大大提升了用户体验。
为了实现上述实施例,本发明还提出一种非易失性计算机存储介质,计算机存储介质存储有一个或者多个程序,当一个或者多个程序被一个设备执行时,使得设备执行以本发明第一方面实施例的图像传感器的成像方法。
在本发明的描述中,需要理解的是,术语“中心”、“纵向”、“横向”、“长度”、“宽度”、“厚度”、“上”、“下”、“前”、“后”、“左”、“右”、“竖直”、“水平”、“顶”、“底”“内”、“外”、“顺时针”、“逆时针”、“轴向”、“径向”、“周向”等指示的方位或位置关系为基于附图所示的方位或位置关系,仅是为了便于描述本发明和简化描述,而不是指示或暗示所指的装置或元件必须具有特定的方位、以特定的方位构造和操作,因此不能理解为对本发明的限制。
此外,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括至少一个该特征。在本发明的描述中,“多个”的含义是至少两个,例如两个,三个等,除非另有明确具体的限定。
在本发明中,除非另有明确的规定和限定,术语“安装”、“相连”、“连接”、“固定”等术语应做广义理解,例如,可以是固定连接,也可以是可拆卸连接,或成一体;可以是机械连接,也可以是电连接;可以是直接相连,也可以通过中间媒介间接相连,可以是两个元件内部的连通或两个元件的相互作用关系,除非另有明确的限定。对于本领域的普通技术人员而言,可以根据具体情况理解上述术语在本发明中的具体含义。
在本发明中,除非另有明确的规定和限定,第一特征在第二特征“上”或“下”可以是第一和第二特征直接接触,或第一和第二特征通过中间媒介间接接触。而且,第一特征在第二特征“之上”、“上方”和“上面”可是第一特征在第二特征正上方或斜上方,或仅仅表示第一特征水平高度高于第二特征。第一特征在第二特征“之下”、“下方”和“下 面”可以是第一特征在第二特征正下方或斜下方,或仅仅表示第一特征水平高度小于第二特征。
在本说明书的描述中,参考术语“一个实施例”、“一些实施例”、“示例”、“具体示例”、或“一些示例”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或者特点包含于本发明的至少一个实施例或示例中。在本说明书中,对上述术语的示意性表述不必须针对的是相同的实施例或示例。而且,描述的具体特征、结构、材料或者特点可以在任一个或多个实施例或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现特定逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分,并且本发明的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被本发明的实施例所属技术领域的技术人员所理解。
在流程图中表示或在此以其他方式描述的逻辑和/或步骤,例如,可以被认为是用于实现逻辑功能的可执行指令的定序列表,可以具体实现在任何计算机可读介质中,以供指令执行系统、装置或设备(如基于计算机的系统、包括处理器的系统或其他可以从指令执行系统、装置或设备取指令并执行指令的系统)使用,或结合这些指令执行系统、装置或设备而使用。就本说明书而言,"计算机可读介质"可以是任何可以包含、存储、通信、传播或传输程序以供指令执行系统、装置或设备或结合这些指令执行系统、装置或设备而使用的装置。计算机可读介质的更具体的示例(非穷尽性列表)包括以下:具有一个或多个布线的电连接部(电子装置),便携式计算机盘盒(磁装置),随机存取存储器(RAM),只读存储器(ROM),可擦除可编辑只读存储器(EPROM或闪速存储器),光纤装置,以及便携式光盘只读存储器(CDROM)。另外,计算机可读介质甚至可以是可在其上打印所述程序的纸或其他合适的介质,因为可以例如通过对纸或其他介质进行光学扫描,接着进行编辑、解译或必要时以其他合适方式进行处理来以电子方式获得所述程序,然后将其存储在计算机存储器中。
应当理解,本发明的各部分可以用硬件、软件、固件或它们的组合来实现。在上述实施方式中,多个步骤或方法可以用存储在存储器中且由合适的指令执行系统执行的软件或固件来实现。例如,如果用硬件来实现,和在另一实施方式中一样,可用本领域公知的下列技术中的任一项或他们的组合来实现:具有用于对数据信号实现逻辑功能的逻辑门电路的离散逻辑电路,具有合适的组合逻辑门电路的专用集成电路,可编程门阵列(PGA), 现场可编程门阵列(FPGA)等。
本技术领域的普通技术人员可以理解实现上述实施例方法携带的全部或部分步骤是可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,该程序在执行时,包括方法实施例的步骤之一或其组合。
此外,在本发明各个实施例中的各功能单元可以集成在一个处理模块中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读取存储介质中。
上述提到的存储介质可以是只读存储器,磁盘或光盘等。尽管上面已经示出和描述了本发明的实施例,可以理解的是,上述实施例是示例性的,不能理解为对本发明的限制,本领域的普通技术人员在本发明的范围内可以对上述实施例进行变化、修改、替换和变型。

Claims (18)

  1. An imaging method for an image sensor, wherein the image sensor comprises a photosensitive pixel array and a filter disposed on the photosensitive pixel array, the filter comprises an array of filter units, and each filter unit, together with a plurality of adjacent photosensitive pixels of the photosensitive pixel array covered by the filter unit, forms a merged pixel, the imaging method comprising the following steps:
    reading an output of the photosensitive pixel array, and extracting and combining, from a read single-frame high-resolution image, pixel values of photosensitive pixels of different merged pixels to obtain multiple frames of low-resolution images;
    synthesizing the multiple frames of low-resolution images.
  2. The imaging method for an image sensor according to claim 1, wherein extracting and combining, from the read single-frame high-resolution image, the pixel values of photosensitive pixels of different merged pixels to obtain the multiple frames of low-resolution images comprises: extracting and combining, from the read single-frame high-resolution image, pixel values of photosensitive pixels at a same position within different merged pixels to obtain the multiple frames of low-resolution images.
  3. The imaging method for an image sensor according to claim 1, wherein each filter unit, together with adjacent n*n photosensitive pixels of the photosensitive pixel array covered by the filter unit, forms one merged pixel, and the imaging method comprises the following steps:
    reading the output of the photosensitive pixel array, and extracting and combining, from the read single-frame high-resolution image, pixel values of photosensitive pixels of adjacent merged pixels to obtain at least m frames of low-resolution images;
    synthesizing the at least m frames of low-resolution images, wherein n and m are both natural numbers greater than 1, and m is less than or equal to n*n.
  4. The imaging method for an image sensor according to claim 3, wherein each filter unit, together with adjacent 2*2 photosensitive pixels of the photosensitive pixel array covered by the filter unit, forms one merged pixel, and the imaging method comprises the following steps:
    reading the output of the photosensitive pixel array, and extracting and combining, from the read single-frame high-resolution image, pixel values of photosensitive pixels of different merged pixels to obtain 4 frames of low-resolution images;
    synthesizing the 4 frames of low-resolution images.
  5. The imaging method for an image sensor according to claim 1, wherein four adjacent merged pixels together form one merged pixel unit, and four adjacently arranged filter units in each merged pixel unit comprise one red filter unit, one blue filter unit, and two green filter units.
  6. An imaging device, comprising:
    an image sensor, the image sensor comprising:
    a photosensitive pixel array;
    a filter disposed on the photosensitive pixel array, the filter comprising an array of filter units, wherein each filter unit, together with a plurality of adjacent photosensitive pixels of the photosensitive pixel array covered by the filter unit, forms a merged pixel; and
    an image processing module connected to the image sensor, the image processing module being configured to read an output of the photosensitive pixel array, extract and combine, from a read single-frame high-resolution image, pixel values of photosensitive pixels of different merged pixels to obtain multiple frames of low-resolution images, and synthesize the multiple frames of low-resolution images.
  7. The imaging device according to claim 6, wherein the image processing module is configured to extract and combine, from the read single-frame high-resolution image, pixel values of photosensitive pixels at a same position within different merged pixels to obtain the multiple frames of low-resolution images.
  8. The imaging device according to claim 6, wherein each filter unit, together with adjacent n*n photosensitive pixels of the photosensitive pixel array covered by the filter unit, forms one merged pixel, and the image processing module is configured to:
    read the output of the photosensitive pixel array, extract and combine, from the read single-frame high-resolution image, pixel values of photosensitive pixels of adjacent merged pixels to obtain at least m frames of low-resolution images, and synthesize the at least m frames of low-resolution images, wherein n and m are both natural numbers greater than 1, and m is less than or equal to n*n.
  9. The imaging device according to claim 8, wherein each filter unit, together with adjacent 2*2 photosensitive pixels of the photosensitive pixel array covered by the filter unit, forms one merged pixel, and the image processing module is configured to:
    read the output of the photosensitive pixel array, extract and combine, from the read single-frame high-resolution image, pixel values of photosensitive pixels of different merged pixels to obtain 4 frames of low-resolution images, and synthesize the 4 frames of low-resolution images.
  10. The imaging device according to claim 6, wherein four adjacent merged pixels form one merged pixel unit, and four adjacently arranged filter units in each merged pixel unit comprise one red filter unit, one blue filter unit, and two green filter units.
  11. The imaging device according to claim 6, wherein each merged pixel further comprises a lens array disposed above the filter unit, the lens array being configured to converge light onto the photosensitive pixels below the filter.
  12. The imaging device according to claim 11, wherein the lens array comprises a plurality of microlenses, and each microlens is disposed to correspond to one photosensitive pixel.
  13. A mobile terminal, comprising a housing, a processor, a memory, a circuit board, a power supply circuit, and an image sensor, wherein the circuit board is disposed inside a space enclosed by the housing, and the processor, the memory, and the image sensor are disposed on the circuit board; the power supply circuit is configured to supply power to circuits or components of the mobile terminal; the image sensor comprises a photosensitive pixel array and a filter disposed on the photosensitive pixel array, the filter comprises an array of filter units, and each filter unit, together with a plurality of adjacent photosensitive pixels of the photosensitive pixel array covered by the filter unit, forms a merged pixel; the memory is configured to store executable program code; and the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform the following steps:
    reading an output of the photosensitive pixel array, and extracting and combining, from a read single-frame high-resolution image, pixel values of photosensitive pixels of different merged pixels to obtain multiple frames of low-resolution images;
    synthesizing the multiple frames of low-resolution images.
  14. The mobile terminal according to claim 13, wherein the processor is configured to:
    extract and combine, from the read single-frame high-resolution image, pixel values of photosensitive pixels at a same position within different merged pixels to obtain the multiple frames of low-resolution images.
  15. The mobile terminal according to claim 13, wherein each filter unit, together with adjacent n*n photosensitive pixels of the photosensitive pixel array covered by the filter unit, forms one merged pixel, and the processor is configured to:
    read the output of the photosensitive pixel array, and extract and combine, from the read single-frame high-resolution image, pixel values of photosensitive pixels of adjacent merged pixels to obtain at least m frames of low-resolution images;
    synthesize the at least m frames of low-resolution images, wherein n and m are both natural numbers greater than 1, and m is less than or equal to n*n.
  16. The mobile terminal according to claim 15, wherein each filter unit, together with adjacent 2*2 photosensitive pixels of the photosensitive pixel array covered by the filter unit, forms one merged pixel, and the processor is configured to:
    read the output of the photosensitive pixel array, and extract and combine, from the read single-frame high-resolution image, pixel values of photosensitive pixels of different merged pixels to obtain 4 frames of low-resolution images;
    synthesize the 4 frames of low-resolution images.
  17. The mobile terminal according to claim 13, wherein four adjacent merged pixels together form one merged pixel unit, and four adjacently arranged filter units in each merged pixel unit comprise one red filter unit, one blue filter unit, and two green filter units.
  18. A non-volatile computer storage medium, wherein the computer storage medium stores one or more programs, and the programs are configured to perform the imaging method for an image sensor according to any one of claims 1 to 5.
PCT/CN2016/100649 2015-12-18 2016-09-28 Imaging method for image sensor, imaging device and electronic device WO2017101555A1 (zh)

Priority Applications (10)

Application Number Priority Date Filing Date Title
MYPI2017702287A MY189048A (en) 2015-12-18 2016-09-28 Imaging method for image sensor, imaging apparatus, and electronic device
AU2016370337A AU2016370337B2 (en) 2015-12-18 2016-09-28 Imaging method for image sensor, imaging device and electronic device
EP16874615.4A EP3226538A4 (en) 2015-12-18 2016-09-28 Imaging method for image sensor, imaging device and electronic device
JP2017534320A JP6564462B2 (ja) 2015-12-18 2016-09-28 画像センサの結像方法、結像装置及び電子装置
KR1020177017185A KR20170094242A (ko) 2015-12-18 2016-09-28 이미지 센서의 이미징 방법, 이미징 장치 및 전자 장치
US15/546,490 US10225465B2 (en) 2015-12-18 2016-09-28 Imaging method for image sensor, imaging apparatus, and electronic device
SG11201705233RA SG11201705233RA (en) 2015-12-18 2016-09-28 Imaging method for image sensor, imaging apparatus, and electronic device
KR1020187036258A KR102041068B1 (ko) 2015-12-18 2016-09-28 이미지 센서의 이미징 방법, 이미징 장치 및 전자 장치
ZA2017/05170A ZA201705170B (en) 2015-12-18 2017-07-31 Imaging method for image sensor, imaging device and electronic device
US15/823,358 US9979883B2 (en) 2015-12-18 2017-11-27 Imaging method for image sensor, imaging apparatus, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510963341.3 2015-12-18
CN201510963341.3A CN105578005B (zh) 2015-12-18 2015-12-18 图像传感器的成像方法、成像装置和电子装置

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US15/546,490 A-371-Of-International US10225465B2 (en) 2015-12-18 2016-09-28 Imaging method for image sensor, imaging apparatus, and electronic device
US15/823,358 Continuation US9979883B2 (en) 2015-12-18 2017-11-27 Imaging method for image sensor, imaging apparatus, and electronic device

Publications (1)

Publication Number Publication Date
WO2017101555A1 true WO2017101555A1 (zh) 2017-06-22

Family

ID=55887622

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/100649 WO2017101555A1 (zh) 2015-12-18 2016-09-28 图像传感器的成像方法、成像装置和电子装置

Country Status (11)

Country Link
US (2) US10225465B2 (zh)
EP (1) EP3226538A4 (zh)
JP (2) JP6564462B2 (zh)
KR (2) KR102041068B1 (zh)
CN (1) CN105578005B (zh)
AU (1) AU2016370337B2 (zh)
MY (1) MY189048A (zh)
SG (1) SG11201705233RA (zh)
TW (1) TWI613918B (zh)
WO (1) WO2017101555A1 (zh)
ZA (1) ZA201705170B (zh)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105578005B (zh) * 2015-12-18 2018-01-19 广东欧珀移动通信有限公司 图像传感器的成像方法、成像装置和电子装置
CN106604001B (zh) * 2016-11-29 2018-06-29 广东欧珀移动通信有限公司 图像处理方法、图像处理装置、成像装置及电子装置
CN106507068B (zh) 2016-11-29 2018-05-04 广东欧珀移动通信有限公司 图像处理方法及装置、控制方法及装置、成像及电子装置
CN106412407B (zh) 2016-11-29 2019-06-07 Oppo广东移动通信有限公司 控制方法、控制装置及电子装置
CN107734231B (zh) * 2017-11-07 2019-12-27 西北核技术研究所 一种基于滤光的成像系统动态范围扩展方法
CN107864317B (zh) * 2017-11-07 2020-03-13 西北核技术研究所 一种基于衰减掩膜的瞬态成像动态范围扩展方法
CN107786818B (zh) * 2017-11-07 2020-06-26 西北核技术研究所 一种基于多色滤光的瞬态成像动态范围扩展方法
CN108322669B (zh) * 2018-03-06 2021-03-23 Oppo广东移动通信有限公司 图像获取方法及装置、成像装置和可读存储介质
CN109446937B (zh) * 2018-10-12 2022-03-01 贵州民族大学 屏下指纹识别仪
CN110166678A (zh) * 2019-06-26 2019-08-23 京东方科技集团股份有限公司 图像采集结构及其采集方法、显示装置
CN110505384B (zh) * 2019-08-29 2021-05-14 Oppo广东移动通信有限公司 成像系统、终端和图像获取方法
CN112599577B (zh) * 2019-12-03 2023-03-21 Oppo广东移动通信有限公司 一种显示屏组件以及电子装置
KR20220069485A (ko) 2020-11-20 2022-05-27 유니스주식회사 접착제 및 그 제조방법

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070025718A1 (en) * 2005-07-29 2007-02-01 Keiichi Mori Digital camera, image capture method, and image capture control program
CN102457683A (zh) * 2010-11-03 2012-05-16 索尼公司 透镜和颜色过滤器布置、超分辨率相机系统及方法
CN103067660A (zh) * 2007-05-10 2013-04-24 爱西斯创新有限公司 图像捕捉设备及方法
CN103531603A (zh) * 2013-10-30 2014-01-22 上海集成电路研发中心有限公司 一种cmos图像传感器
US20140267351A1 (en) * 2013-03-14 2014-09-18 Microsoft Corporation Monochromatic edge geometry reconstruction through achromatic guidance
CN104159049A (zh) * 2014-08-15 2014-11-19 北京思比科微电子技术股份有限公司 一种图像传感器及其工作方法
CN105516698A (zh) * 2015-12-18 2016-04-20 广东欧珀移动通信有限公司 图像传感器的成像方法、成像装置和电子装置
CN105516700A (zh) * 2015-12-18 2016-04-20 广东欧珀移动通信有限公司 图像传感器的成像方法、成像装置和电子装置
CN105578005A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 图像传感器的成像方法、成像装置和电子装置
CN105578071A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 图像传感器的成像方法、成像装置和电子装置

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6999119B1 (en) 1998-04-10 2006-02-14 Nikon Corporation Image-capturing element, image-capturing circuit for processing signal from image-capturing element, image-capturing device, driving method of image-capturing element
JP4501350B2 (ja) * 2003-03-18 2010-07-14 ソニー株式会社 固体撮像装置および撮像装置
JP2006311240A (ja) * 2005-04-28 2006-11-09 Olympus Corp 撮像装置
KR100809345B1 (ko) * 2006-06-16 2008-03-05 삼성전자주식회사 영상 생성 장치 및 방법
US7769229B2 (en) * 2006-11-30 2010-08-03 Eastman Kodak Company Processing images having color and panchromatic pixels
US8242426B2 (en) * 2006-12-12 2012-08-14 Dolby Laboratories Licensing Corporation Electronic camera having multiple sensors for capturing high dynamic range images and related methods
US7745779B2 (en) * 2008-02-08 2010-06-29 Aptina Imaging Corporation Color pixel arrays having common color filters for multiple adjacent pixels for use in CMOS imagers
JP2011015219A (ja) * 2009-07-02 2011-01-20 Toshiba Corp 固体撮像装置
US8724928B2 (en) 2009-08-31 2014-05-13 Intellectual Ventures Fund 83 Llc Using captured high and low resolution images
JP2011097568A (ja) * 2009-10-02 2011-05-12 Sanyo Electric Co Ltd 撮像装置
US8878950B2 (en) * 2010-12-14 2014-11-04 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using super-resolution processes
JP2013021660A (ja) * 2011-07-14 2013-01-31 Sony Corp 画像処理装置、撮像装置、および画像処理方法、並びにプログラム
JP2013066140A (ja) 2011-08-31 2013-04-11 Sony Corp 撮像装置、および信号処理方法、並びにプログラム
JP5889049B2 (ja) * 2012-03-09 2016-03-22 オリンパス株式会社 画像処理装置、撮像装置及び画像処理方法
JP2014086863A (ja) * 2012-10-23 2014-05-12 Sony Corp 撮像装置、および画像処理方法、並びにプログラム
WO2014165244A1 (en) * 2013-03-13 2014-10-09 Pelican Imaging Corporation Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies
US9106784B2 (en) * 2013-03-13 2015-08-11 Pelican Imaging Corporation Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing
WO2014144391A1 (en) * 2013-03-15 2014-09-18 Rambus Inc. Threshold-monitoring, conditional-reset image sensor
JP6180882B2 (ja) * 2013-10-31 2017-08-16 ソニーセミコンダクタソリューションズ株式会社 固体撮像装置、信号処理装置、および電子機器
CN103681721B (zh) * 2013-12-30 2018-10-16 上海集成电路研发中心有限公司 具有高动态范围的图像传感器像素阵列
CN103686007B (zh) * 2013-12-31 2018-11-09 上海集成电路研发中心有限公司 单次拍摄生成高动态范围图像的图像传感器
CN103716558B (zh) * 2013-12-31 2018-11-09 上海集成电路研发中心有限公司 高动态像素阵列、像素单元及图像传感器
US9247117B2 (en) * 2014-04-07 2016-01-26 Pelican Imaging Corporation Systems and methods for correcting for warpage of a sensor array in an array camera module by introducing warpage into a focal plane of a lens stack array
US9438866B2 (en) 2014-04-23 2016-09-06 Omnivision Technologies, Inc. Image sensor with scaled filter array and in-pixel binning
US9888198B2 (en) * 2014-06-03 2018-02-06 Semiconductor Components Industries, Llc Imaging systems having image sensor pixel arrays with sub-pixel resolution capabilities

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3226538A4 *

Also Published As

Publication number Publication date
US20180020159A1 (en) 2018-01-18
US9979883B2 (en) 2018-05-22
US10225465B2 (en) 2019-03-05
JP6676704B2 (ja) 2020-04-08
AU2016370337B2 (en) 2018-06-28
TWI613918B (zh) 2018-02-01
CN105578005A (zh) 2016-05-11
TW201724847A (zh) 2017-07-01
EP3226538A1 (en) 2017-10-04
KR20170094242A (ko) 2017-08-17
JP6564462B2 (ja) 2019-08-21
MY189048A (en) 2022-01-23
KR20180135125A (ko) 2018-12-19
EP3226538A4 (en) 2017-12-20
AU2016370337A1 (en) 2017-07-20
KR102041068B1 (ko) 2019-11-05
CN105578005B (zh) 2018-01-19
US20180077349A1 (en) 2018-03-15
SG11201705233RA (en) 2017-07-28
ZA201705170B (en) 2018-12-19
JP2018186512A (ja) 2018-11-22
JP2018502506A (ja) 2018-01-25

Similar Documents

Publication Publication Date Title
WO2017101555A1 (zh) 图像传感器的成像方法、成像装置和电子装置
WO2017101451A1 (zh) 成像方法、成像装置及电子装置
US9661210B2 (en) Image pickup device and image pickup apparatus
TWI617196B (zh) 圖像感測器、成像裝置、行動終端及成像方法
CN105516698A (zh) 图像传感器的成像方法、成像装置和电子装置
US10290079B2 (en) Image processing method and apparatus, and electronic device
US20190122337A1 (en) Image processing method and apparatus, and electronic device
WO2018098983A1 (zh) 图像处理方法及装置、控制方法及装置、成像及电子装置
WO2018098981A1 (zh) 控制方法、控制装置、电子装置和计算机可读存储介质
EP3902242A1 (en) Image sensor, imaging apparatus, electronic device, image processing system, and signal processing method
JP6461429B2 (ja) 画像センサ、制御方法及び電子機器
CN105578071A (zh) 图像传感器的成像方法、成像装置和电子装置
CN105516700A (zh) 图像传感器的成像方法、成像装置和电子装置
CN105516696A (zh) 图像传感器、成像方法、成像装置和电子装置
TW201724848A (zh) 圖像感測器及終端與成像方法
US11171169B2 (en) Image sensor, imaging device and imaging method
WO2017101562A1 (zh) 图像传感器及具有其的终端、成像方法
JP2017054030A (ja) 撮像装置、シェーディング補正方法
TW202143697A (zh) 行動終端、圖像獲取方法及電腦可讀儲存媒介
JP2012231197A (ja) 撮像装置、電子カメラ装置および電子カメラシステム
JP2016039617A (ja) 撮像装置及びその制御方法、プログラム、記憶媒体
JP2009159219A (ja) 固体撮像素子、撮像装置、及び固体撮像素子の駆動方法

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 20177017185

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2017534320

Country of ref document: JP

Kind code of ref document: A

REEP Request for entry into the european phase

Ref document number: 2016874615

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 11201705233R

Country of ref document: SG

ENP Entry into the national phase

Ref document number: 2016370337

Country of ref document: AU

Date of ref document: 20160928

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 15546490

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16874615

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE