WO2018196703A1 - Image sensor, focus control method, imaging device and mobile terminal - Google Patents
Image sensor, focus control method, imaging device and mobile terminal
- Publication number
- WO2018196703A1 (PCT/CN2018/084022)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- focus
- photosensitive unit
- photosensitive
- output value
- microlenses
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 42
- 238000003384 imaging method Methods 0.000 title claims abstract description 25
- 238000004590 computer program Methods 0.000 claims description 8
- 238000010586 diagram Methods 0.000 description 6
- 230000000694 effects Effects 0.000 description 5
- 238000001514 detection method Methods 0.000 description 4
- 239000000463 material Substances 0.000 description 4
- 230000006870 function Effects 0.000 description 3
- 230000000873 masking effect Effects 0.000 description 3
- 230000008569 process Effects 0.000 description 3
- 238000003491 array Methods 0.000 description 2
- 238000013461 design Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 230000008859 change Effects 0.000 description 1
- 239000000835 fiber Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/34—Systems for automatic generation of focusing signals using different areas in a pupil plane
- G02B7/346—Systems for automatic generation of focusing signals using different areas in a pupil plane using horizontal and vertical areas in the pupil plane, i.e. wide area autofocusing
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/02—Mountings, adjusting means, or light-tight connections, for optical elements for lenses
- G02B7/04—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/672—Focus control based on electronic image sensor signals based on the phase difference signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/44—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/46—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by combining or binning pixels
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/703—SSIS architectures incorporating pixels for producing signals other than image signals
- H04N25/704—Pixels specially adapted for focusing, e.g. phase difference pixel sets
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B3/00—Simple or compound lenses
- G02B3/0006—Arrays
- G02B3/0037—Arrays characterized by the distribution or form of lenses
- G02B3/0043—Inhomogeneous or irregular arrays, e.g. varying shape, size, height
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/18—Focusing aids
- G03B13/20—Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
Definitions
- the present disclosure relates to the field of electronic technologies, and in particular, to an image sensor, a focus control method, an imaging device, and a mobile terminal.
- more and more manufacturers use image sensors with a 16M-4M structure.
- under dark conditions, the image sensor outputs images in 4M mode, which can improve the signal-to-noise ratio and the noise performance.
- when the ambient light is good, images are output in 16M mode, and an interpolation restoration algorithm can be used to obtain images with higher definition.
- the present disclosure aims to solve at least one of the technical problems in the related art to some extent.
- the present disclosure provides a focus control method for an image sensor, wherein the image sensor includes: a photosensitive unit array, a filter unit array disposed on the photosensitive unit array, and a microlens array disposed above the filter unit array, wherein the microlens array includes first microlenses and second microlenses, one first microlens covers one focus photosensitive unit, and N*N second microlenses cover one non-focus photosensitive unit, where N is a positive integer.
- the focus control method includes the following steps: controlling the photosensitive unit array to enter a focus mode; reading the output value of one part of the photosensitive pixels in the focus photosensitive unit as a first output value; reading the output value of another part of the photosensitive pixels in the focus photosensitive unit as a second output value; and performing focus control based on the first output value and the second output value.
- the focus control method of the image sensor of the present disclosure is based on a structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit; it uses the output value of one part of the photosensitive pixels in the focus photosensitive unit and the output value of another part of the photosensitive pixels to perform phase focusing, which can effectively improve the focusing speed.
- an image sensor including a photosensitive unit array; a filter unit array disposed on the photosensitive unit array; and a microlens array positioned above the filter unit array; wherein the microlens array includes first microlenses and second microlenses, one first microlens covering one focus photosensitive unit, and N*N second microlenses covering one non-focus photosensitive unit, where N is a positive integer.
- the image sensor of the present disclosure, based on a structure in which one first microlens covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, provides a hardware foundation for improving picture quality and focusing speed.
- a further embodiment of the present disclosure provides an imaging apparatus including the image sensor described above and a control module. The control module controls the photosensitive unit array to enter a focus mode; reads the output value of one part of the photosensitive pixels in the focus photosensitive unit as a first output value; reads the output value of another part of the photosensitive pixels in the focus photosensitive unit as a second output value; and performs focus control according to the first output value and the second output value.
- based on the structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, the output value of one part of the photosensitive pixels in the focus photosensitive unit and the output value of another part of the photosensitive pixels are used for focus control, which can effectively improve the focusing speed.
- Yet another aspect of the present disclosure provides a mobile terminal including a housing, a processor, a memory, a circuit board, and a power supply circuit. The circuit board is disposed inside the space enclosed by the housing, and the processor and the memory are disposed on the circuit board; the power supply circuit supplies power to the circuits and devices of the mobile terminal; the memory stores executable program code; and the processor, by reading the executable program code stored in the memory, runs a program corresponding to the executable program code, so as to perform the focus control method of the image sensor described above.
- based on the structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, the output value of one part of the photosensitive pixels in the focus photosensitive unit and the output value of another part of the photosensitive pixels are used for focus control, which can effectively improve the focusing speed.
- Another aspect of the present disclosure is directed to a computer program product that, when executed by a processor, implements the focus control method of the image sensor described in the above embodiments.
- Another aspect of the present disclosure is directed to a computer readable storage medium having stored thereon a computer program that, when executed by a processor, implements the focus control method of the image sensor described in the above embodiments.
- FIG. 1 is a cross-sectional view of an image sensor in accordance with an embodiment of the present disclosure
- FIG. 2 is a top plan view of an image sensor in which both a focus photosensitive unit and an unfocused photosensitive unit include 2*2 photosensitive pixels, according to an embodiment of the present disclosure
- FIG. 3 is a schematic view showing a distribution of a focus photosensitive unit in an image sensor according to an embodiment of the present disclosure
- FIG. 4 is a flowchart of a focus control method of an image sensor according to an embodiment of the present disclosure
- FIG. 5 is a schematic diagram of the division of the 2*2 photosensitive pixels of a focus photosensitive unit according to an embodiment of the present disclosure
- FIG. 6 is a schematic diagram of the focusing effect of an image sensor according to an embodiment of the present disclosure
- FIG. 7 is a flowchart of an imaging method of an image sensor according to an embodiment of the present disclosure.
- FIG. 8 is a schematic diagram of an effect of imaging according to an embodiment of the present disclosure.
- FIG. 9 is a block diagram of an imaging device in accordance with an embodiment of the present disclosure.
- FIG. 10 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure.
- FIG. 1 is a cross-sectional view of an image sensor according to an embodiment of the present disclosure
- FIG. 2 is a plan view of an image sensor in which the focus photosensitive unit and the non-focus photosensitive unit each include 2*2 photosensitive pixels, according to an embodiment of the present disclosure.
- the image sensor 100 includes a photosensitive cell array 10, a filter unit array 20, and a microlens array 30.
- the filter unit array 20 is disposed on the photosensitive cell array 10, and the microlens array 30 is disposed on the filter unit array 20.
- the photosensitive cell array 10 includes a plurality of focus photosensitive units 11 and a plurality of non-focus photosensitive units 12.
- the focus photosensitive units 11 and the non-focus photosensitive units 12 are both photosensitive units, each including N*N photosensitive pixels 110.
- the microlens array 30 includes a first microlens 31 and a second microlens 32.
- the first microlens 31 covers one filter unit 21 and one focus photosensitive unit 11.
- the N*N second microlenses 32 cover one filter unit 21 and one non-focus photosensitive unit 12.
- the focus photosensitive unit 11 and the non-focus photosensitive unit 12 each include N*N photosensitive pixels 110. In FIG. 2, the focus photosensitive unit 11 and the non-focus photosensitive unit 12 each include 2*2 photosensitive pixels 110.
- the microlens array 30 has a horizontal center line, a vertical center line, and four side lines, and includes a plurality of first microlenses.
- the plurality of first microlenses include a first group of first microlenses disposed along the horizontal center line, a second group of first microlenses disposed along the vertical center line, and a third group of first microlenses disposed along the four side lines of the microlens array.
- the focus photosensitive units 11 covered by the first microlenses, that is, Gp in the figure, are scattered throughout the image sensor and account for 3% to 5% of the total number of pixels. The Gp units are distributed more densely in the central area of the image sensor and more sparsely in the edge area, so the phase information at the center of the picture is obtained preferentially, and the focusing speed is effectively improved without affecting image quality.
- the lens density of the first group and the second group of first microlenses may be greater than that of the third group, so that the focus photosensitive units in the central region receive relatively more incident light, which improves the focusing speed and the shooting effect.
- the filter unit array 20 adopts a Bayer structure, and each filter unit 21 corresponds to N*N photosensitive pixels 110; that is, N*N photosensitive pixels 110 correspond to a filter unit 21 of the same color.
- N*N photosensitive pixels 110 form a group and share one first microlens 31, and the photosensitive pixels 110 in the focus photosensitive unit correspond to filter units 21 of the same color.
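This layout, in which every filter unit of the Bayer mosaic spans an N*N group of photosensitive pixels, can be sketched with a small indexing helper. This is illustrative code, not from the patent; in particular the GRBG color order is an assumption, since the patent does not specify the exact Bayer ordering.

```python
# Sketch: map a photosensitive pixel (row, col) to its Bayer filter color
# when each filter unit covers an N*N group of pixels of the same color.
# Assumption: a GRBG-style Bayer tile; the patent does not fix the order.
BAYER = [["G", "R"],
         ["B", "G"]]

def filter_color(row, col, n=2):
    """Return the filter color seen by photosensitive pixel (row, col).

    All n*n pixels inside one filter unit share the same color, so the
    unit index is (row // n, col // n), reduced modulo the 2x2 Bayer tile.
    """
    unit_row = (row // n) % 2
    unit_col = (col // n) % 2
    return BAYER[unit_row][unit_col]

# The four pixels of the top-left unit all see the same color:
assert {filter_color(r, c) for r in (0, 1) for c in (0, 1)} == {"G"}
```

With N=2 this reproduces the 16M-4M arrangement described earlier: four neighboring pixels share one color filter and can later be binned into one output pixel.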
- the image sensor of the present disclosure, based on a structure in which one first microlens covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, provides a hardware foundation for improving picture quality and focusing speed.
- FIG. 4 is a flowchart of a focus control method of an image sensor according to an embodiment of the present disclosure. As shown in FIG. 4, the method includes the following steps:
- the output value of one part of the photosensitive pixels in the focus photosensitive unit is read as the first output value. Take a focus photosensitive unit that includes 2*2 photosensitive pixels as an example.
- the 2*2 photosensitive pixels in the focus photosensitive unit can be divided into a left part and a right part, and the first part of the photosensitive pixels can be the two photosensitive pixels on the left side of the 2*2 photosensitive pixels; that is, the output values of the two photosensitive pixels on the left side of the focus photosensitive unit are used as the first output value.
- alternatively, the 2*2 photosensitive pixels in the focus photosensitive unit can be divided into an upper part and a lower part, and the first part of the photosensitive pixels can be the two photosensitive pixels on the upper side; that is, the output values of the two photosensitive pixels on the upper side of the focus photosensitive unit are used as the first output value.
- the photosensitive pixels can also be divided along the two diagonals of the focus photosensitive unit; that is, the photosensitive pixel in the upper left corner and the photosensitive pixel in the lower right corner form one part, and the photosensitive pixel in the lower left corner and the photosensitive pixel in the upper right corner form the other part.
- with this division of the 2*2 photosensitive pixels of the focus photosensitive unit, the output values of the photosensitive pixels at "1" in the focus photosensitive unit Gp can be read as the first output value.
- the output values of the photosensitive pixels on the left side and on the right side of the 2*2 photosensitive pixels of the focus photosensitive unit are taken as the first output value and the second output value, respectively.
- the output values Gp30 and Gp32 of the two photosensitive pixels on the left side of the focus photosensitive unit Gp are taken as the first output value, and the output values Gp31 and Gp33 of the other two photosensitive pixels, that is, the two photosensitive pixels on the right side, are taken as the second output value.
- in the related art, phase detection generally uses a photosensitive pixel structure design in which pixels are arranged adjacently and in pairs in the image sensor (also referred to as masked or shielded pixels).
- this structure is more complicated than the ordinary photosensitive pixel structure: it is usually necessary to change the structure of the ordinary photosensitive pixel itself or to add a separate light-shielding portion on the photosensitive pixel structure, so that, of the light arriving at the shielded pixel from multiple directions, light from a specific direction cannot reach the photosensitive portion of the shielded pixel while light from the other directions can.
- the shielded pixels are usually arranged adjacently, symmetrically, and in pairs. The paired shielded pixels separate the imaging beams arriving from multiple directions into two parts, for example left and right; by comparing the phase difference between the left and right portions of the imaged light (that is, by collecting the outputs of the paired shielded pixels), the distance the lens needs to move can be calculated.
- in the present disclosure, one first microlens covers one focus photosensitive unit, and each focus photosensitive unit includes N*N photosensitive pixels; that is, one first microlens corresponds to N*N photosensitive pixels. Therefore, the phase difference information of the imaged image can be obtained by comparing the light signals in different directions, and the distance information of the captured object can then be obtained from the phase difference information, providing a data basis for phase focusing and depth-of-field information detection.
- in this way, phase focus detection can be realized using only the cooperative design of the microlens units, the filter units, and the focus photosensitive units, without changing the structure of the ordinary photosensitive pixel itself or adding a separate light-shielding portion to the photosensitive pixel structure, so the implementation of phase focus detection is simpler.
- a first phase value Gp1 is generated according to the first output value, and a second phase value Gp2 is generated according to the second output value.
- the phase difference information between Gp1 and Gp2 can be acquired and converted into focus distance information, and the position of the lens can be adjusted according to the focus distance information to realize phase focusing.
- if the output values of the left and right photosensitive pixels of the 2*2 photosensitive pixels of the focus photosensitive unit are used as the first output value and the second output value respectively, phase difference information in the left-right direction can be detected;
- if the output values of the photosensitive pixels on the upper and lower sides of the 2*2 photosensitive pixels are used as the first output value and the second output value respectively, phase difference information in the up-down direction can be detected;
- if the output values of the photosensitive pixels on the two diagonals of the focus photosensitive unit are used as the first output value and the second output value respectively, phase difference information in the oblique direction can be detected.
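The three division schemes above can be sketched as follows. This is a hypothetical helper for illustration only; the function name and the unit layout convention are not from the patent.

```python
# Sketch: first/second output values for a 2*2 focus photosensitive unit
# given as [[top-left, top-right], [bottom-left, bottom-right]], under
# the three division schemes described above.
def split_outputs(unit, mode):
    (tl, tr), (bl, br) = unit
    if mode == "left-right":   # detects phase difference in the left-right direction
        return tl + bl, tr + br
    if mode == "up-down":      # detects phase difference in the up-down direction
        return tl + tr, bl + br
    if mode == "diagonal":     # detects phase difference in the oblique direction
        return tl + br, bl + tr
    raise ValueError(mode)

gp = [[30, 31],   # e.g. output values Gp30, Gp31
      [32, 33]]   #                    Gp32, Gp33
assert split_outputs(gp, "left-right") == (62, 64)  # Gp30+Gp32, Gp31+Gp33
```

Each mode yields one pair of values whose phase comparison reveals defocus along the corresponding direction.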
- the focus control method of the embodiments of the present disclosure obtains the phase information of incident light at different angles by reading the output values of different parts of the photosensitive pixels in the focus photosensitive unit, and detects phase information in different directions, thereby improving the focusing speed in dark light and making focusing more accurate.
- the focus control method of the image sensor of the present disclosure is based on a structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit; it uses the output value of one part of the photosensitive pixels in the focus photosensitive unit and the output value of another part of the photosensitive pixels for focus control, so as to increase the focusing speed.
- an embodiment of the present disclosure also proposes an imaging method of the image sensor.
- the imaging method of the image sensor includes:
- when the camera is aimed at an object, the photosensitive cell array enters an imaging mode.
- in this example, the focus photosensitive unit and the non-focus photosensitive unit each include 2*2 photosensitive pixels.
- blue B0, green G1, green G3, and red R4 constitute a Bayer RGB array. The focus photosensitive units and the non-focus photosensitive units are exposed, and the output values Gp30, Gp31, Gp32, and Gp33 of the focus photosensitive unit and the output values B00, B01, B02, B03, Gb10, Gb11, Gb12, Gb13, and so on, of the non-focus photosensitive units are read.
- a merged image is generated based on the pixel values of the focus photosensitive unit and the non-focus photosensitive unit.
- the imaging method of the image sensor uses the sum of the output values of the N*N photosensitive pixels in a photosensitive unit as the pixel value of that photosensitive unit, and generates a merged image according to the pixel values of the focus photosensitive units and the non-focus photosensitive units, which can effectively improve the imaging sensitivity and signal-to-noise ratio of the image.
- FIG. 9 is a block diagram of an imaging device according to an embodiment of the present disclosure. As shown in FIG. 9, the imaging device includes the image sensor 910 and the control module 920 of the above aspects.
- the control module 920 controls the photosensitive cell array to enter the focus mode; reads an output value of a part of the photosensitive pixels in the focus photosensitive unit as a first output value; and reads an output value of another portion of the photosensitive pixels in the focus photosensitive unit as a second output value; Focus control is performed based on the first output value and the second output value.
- the control module 920 is specifically configured to: generate a first phase value according to the first output value; generate a second phase value according to the second output value; perform focus control according to the first phase value and the second phase value.
- the control module 920 is further configured to: control the photosensitive unit array to enter the imaging mode; control the focus photosensitive units and the non-focus photosensitive units to perform exposure and read their output values; and add the output values of the N*N photosensitive pixels of the same focus photosensitive unit or of the same non-focus photosensitive unit to obtain the pixel values of the focus photosensitive units and the non-focus photosensitive units, so as to generate a merged image.
- based on the structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, the output value of one part of the photosensitive pixels in the focus photosensitive unit and the output value of another part of the photosensitive pixels are used for focus control, which can effectively improve the focusing speed.
- a further embodiment of the present disclosure further provides a mobile terminal.
- the mobile terminal includes a housing 101, a processor 102, a memory 103, a circuit board 104, and a power supply circuit 105. The circuit board 104 is disposed inside the space enclosed by the housing 101, and the processor 102 and the memory 103 are disposed on the circuit board 104; the power supply circuit 105 supplies power to the circuits and devices of the mobile terminal; the memory 103 stores executable program code; and the processor 102, by reading the executable program code stored in the memory 103, runs a program corresponding to the executable program code, so as to perform the focus control method of the image sensor of the above aspects.
- based on the structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, the output value of one part of the photosensitive pixels in the focus photosensitive unit and the output value of another part of the photosensitive pixels are used for focus control, which can effectively improve the focusing speed.
- Embodiments of the present disclosure also provide a computer program product; when the instructions in the computer program product are executed by a processor, the focus control method of the image sensor described in the above embodiments is implemented.
- Embodiments of the present disclosure further provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the focus control method of the image sensor described in the above embodiments.
- a "computer-readable medium" can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with an instruction execution system, apparatus, or device.
- examples of computer readable media include: an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CDROM).
- the computer readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
- portions of the present disclosure can be implemented in hardware, software, firmware, or a combination thereof.
- multiple steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system.
- for example, if implemented in hardware, as in another embodiment, they can be implemented by any one or a combination of the following techniques well known in the art: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and so on.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Studio Devices (AREA)
- Solid State Image Pick-Up Elements (AREA)
- Automatic Focus Adjustment (AREA)
- Focusing (AREA)
Abstract
The present disclosure provides an image sensor, a focus control method, an imaging device and a mobile terminal, relating to the field of electronic technology. The image sensor includes: a photosensitive unit array, a filter unit array arranged on the photosensitive unit array, and a microlens array located above the filter unit array, where the microlens array includes first microlenses and second microlenses, one first microlens covers one focus photosensitive unit, and N*N second microlenses cover one non-focus photosensitive unit, N being a positive integer. With the image sensor of the embodiments of the present disclosure, the output values of two parts of the photosensitive pixels in a focus photosensitive unit can be obtained for phase focusing, which effectively improves the focusing speed. The present disclosure also discloses a focus control method of an image sensor, an imaging device and a mobile terminal.
Description
Cross-Reference to Related Applications
The present disclosure claims priority to Chinese Patent Application No. 201710297448.8, entitled "Image sensor, focus control method, imaging device and mobile terminal", filed by Guangdong OPPO Mobile Telecommunications Corp., Ltd. on April 28, 2017.
The present disclosure relates to the field of electronic technology, and in particular to an image sensor, a focus control method, an imaging device and a mobile terminal.
As technology keeps evolving, more and more manufacturers are adopting image sensors with a 16M-4M structure. In low-light conditions, such an image sensor outputs images in 4M mode, which raises the signal-to-noise ratio and improves noise performance. Under good ambient light, it outputs images in 16M mode and, with an interpolation-based restoration algorithm, produces images of higher definition.
Summary
The present disclosure aims to solve at least one of the technical problems in the related art, at least to some extent.
One aspect of the present disclosure provides a focus control method of an image sensor. The image sensor includes: a photosensitive unit array, a filter unit array arranged on the photosensitive unit array, and a microlens array located above the filter unit array, where the microlens array includes first microlenses and second microlenses, one first microlens covers one focus photosensitive unit, and N*N second microlenses cover one non-focus photosensitive unit, N being a positive integer. The focus control method includes the following steps: controlling the photosensitive unit array to enter a focus mode; reading output values of one part of the photosensitive pixels in the focus photosensitive unit as a first output value; reading output values of another part of the photosensitive pixels in the focus photosensitive unit as a second output value; and performing focus control according to the first output value and the second output value.
In the focus control method of the image sensor of the present disclosure, based on the structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, phase focusing is performed using the output values of one part of the photosensitive pixels in the focus photosensitive unit and the output values of another part, which effectively improves the focusing speed.
Another aspect of the present disclosure provides an image sensor. The image sensor includes: a photosensitive unit array; a filter unit array arranged on the photosensitive unit array; and a microlens array located above the filter unit array; where the microlens array includes first microlenses and second microlenses, one first microlens covers one focus photosensitive unit, and N*N second microlenses cover one non-focus photosensitive unit, N being a positive integer.
The image sensor of the present disclosure, based on the structure in which one first microlens covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, provides a hardware basis for improving both image quality and focusing speed.
A further aspect of the present disclosure provides an imaging device. The imaging device includes: the above image sensor; and a control module. The control module controls the photosensitive unit array to enter a focus mode; reads output values of one part of the photosensitive pixels in the focus photosensitive unit as a first output value; reads output values of another part of the photosensitive pixels in the focus photosensitive unit as a second output value; and performs focus control according to the first output value and the second output value.
The imaging device of the present disclosure, based on the structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, performs focus control using the output values of one part of the photosensitive pixels in the focus photosensitive unit and the output values of another part, which effectively improves the focusing speed.
Yet another aspect of the present disclosure provides a mobile terminal. The mobile terminal includes a housing, a processor, a memory, a circuit board and a power supply circuit, where the circuit board is arranged inside the space enclosed by the housing, and the processor and the memory are arranged on the circuit board; the power supply circuit is configured to supply power to each circuit or component of the mobile terminal; the memory is configured to store executable program code; and the processor runs a program corresponding to the executable program code, by reading the executable program code stored in the memory, so as to perform the above focus control method of the image sensor.
The mobile terminal of the embodiments of the present disclosure, based on the structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, performs focus control using the output values of one part of the photosensitive pixels in the focus photosensitive unit and the output values of another part, which effectively improves the focusing speed.
Another aspect of the present disclosure provides a computer program product which, when the instructions in the computer program product are executed by a processor, implements the focus control method of the image sensor as described in the above aspect.
Another aspect of the present disclosure provides a computer readable storage medium having a computer program stored thereon which, when executed by a processor, implements the focus control method of the image sensor as described in the above aspect.
Additional aspects and advantages of the present disclosure will be given in part in the following description, will partly become apparent from the following description, or will be learned through practice of the present disclosure.
In order to describe the technical solutions in the embodiments of the present disclosure more clearly, the accompanying drawings needed for the embodiments are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present disclosure, and those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a cross-sectional view of an image sensor according to an embodiment of the present disclosure;
Fig. 2 is a top view of an image sensor in which both the focus photosensitive units and the non-focus photosensitive units include 2*2 photosensitive pixels, according to an embodiment of the present disclosure;
Fig. 3 is a schematic diagram of the distribution of focus photosensitive units in an image sensor according to an embodiment of the present disclosure;
Fig. 4 is a flowchart of a focus control method of an image sensor according to an embodiment of the present disclosure;
Fig. 5 is a schematic diagram of divisions of the 2*2 photosensitive pixels of a focus photosensitive unit according to an embodiment of the present disclosure;
Fig. 6 is a schematic diagram of the focusing effect of an image sensor according to an embodiment of the present disclosure;
Fig. 7 is a flowchart of an imaging method of an image sensor according to an embodiment of the present disclosure;
Fig. 8 is a schematic diagram of an imaging effect according to an embodiment of the present disclosure;
Fig. 9 is a block diagram of an imaging device according to an embodiment of the present disclosure;
Fig. 10 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure.
Embodiments of the present disclosure are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals denote the same or similar elements, or elements with the same or similar functions, throughout. The embodiments described below with reference to the drawings are exemplary; they are intended to explain the present disclosure and should not be construed as limiting it.
The image sensor, focus control method, imaging device and mobile terminal of the embodiments of the present disclosure are described below with reference to the accompanying drawings.
Fig. 1 is a cross-sectional view of an image sensor according to an embodiment of the present disclosure, and Fig. 2 is a top view of an image sensor in which both the focus photosensitive units and the non-focus photosensitive units include 2*2 photosensitive pixels, according to an embodiment of the present disclosure.
As shown in Figs. 1 and 2, the image sensor 100 includes a photosensitive unit array 10, a filter unit array 20 and a microlens array 30.
The filter unit array 20 is arranged on the photosensitive unit array 10, and the microlens array 30 is located above the filter unit array 20. The photosensitive unit array 10 includes a plurality of focus photosensitive units 11 and a plurality of non-focus photosensitive units 12, both of which are photosensitive units including N*N photosensitive pixels 110. The microlens array 30 includes first microlenses 31 and second microlenses 32. One first microlens 31 covers one filter unit 21 and one focus photosensitive unit 11, while N*N second microlenses 32 cover one filter unit 21 and one non-focus photosensitive unit 12. In Fig. 2, both the focus photosensitive unit 11 and the non-focus photosensitive unit 12 include 2*2 photosensitive pixels 110.
In an embodiment of the present disclosure, as shown in Fig. 3, the microlens array 30 has a horizontal center line, a vertical center line and four edges, and includes a plurality of first microlenses. The plurality of first microlenses include a first group of first microlenses arranged on the horizontal center line, a second group of first microlenses arranged on the vertical center line, and a third group of first microlenses arranged on the four edges of the microlens array.
As can be seen from Fig. 3, the focus photosensitive units 11 covered by the first microlenses, i.e. the units marked Gp in the figure, are scattered over the whole image sensor and account for 3% to 5% of the total number of pixels. The Gp units are distributed more densely in the central region of the image sensor and more sparsely in the edge regions, so the phase information at the center of the picture is acquired preferentially, which effectively improves the focusing speed without affecting image quality.
The greater the lens density, the greater the refractive power of a lens and the stronger its light-gathering capability. Therefore, to make the focus photosensitive units in the central region gather more light, in an embodiment of the present disclosure the lens density of the first and second groups of first microlenses may be made greater than that of the third group, so that the focus photosensitive units in the central region receive more light than those at the edges, which in turn improves the focusing speed and the shooting effect.
In the embodiments of the present disclosure, the filter unit array 20 adopts a Bayer pattern, and each filter unit 21 corresponds to N*N photosensitive pixels 110; that is, N*N photosensitive pixels 110 correspond to a filter unit 21 of the same color.
In short, in the image sensor 100 of the embodiments of the present disclosure, N*N photosensitive pixels 110 form a group and share one first microlens 31, and the photosensitive pixels 110 within a focus photosensitive unit correspond to a filter unit 21 of the same color.
The image sensor of the present disclosure, based on the structure in which one first microlens covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, provides a hardware basis for improving both image quality and focusing speed.
Based on the structure of the image sensor shown in Figs. 1-3, the focus control method of the image sensor of the embodiments of the present disclosure is described below. Fig. 4 is a flowchart of a focus control method of an image sensor according to an embodiment of the present disclosure. As shown in Fig. 4, the method includes the following steps.
S41: controlling the photosensitive unit array to enter a focus mode.
For example, when taking a photo of an object with a mobile phone, the user aims at the object to be shot and taps the screen to focus, at which point the photosensitive unit array enters the focus mode.
S42: reading output values of one part of the photosensitive pixels in the focus photosensitive unit as a first output value.
After entering the focus mode, the output values of one part of the photosensitive pixels in the focus photosensitive unit are read as the first output value. Take a focus photosensitive unit containing 2*2 photosensitive pixels as an example.
In an embodiment of the present disclosure, the 2*2 photosensitive pixels of the focus photosensitive unit may be divided into a left part and a right part. One part of the photosensitive pixels in the focus photosensitive unit may then be the two left pixels of the 2*2 photosensitive pixels, i.e., the output values of the two left photosensitive pixels of the focus photosensitive unit are taken as the first output value.
In another embodiment, the 2*2 photosensitive pixels of the focus photosensitive unit may be divided into an upper part and a lower part. One part of the photosensitive pixels may then be the two upper pixels of the 2*2 photosensitive pixels, i.e., the output values of the two upper photosensitive pixels of the focus photosensitive unit are taken as the first output value.
In yet another embodiment, the 2*2 photosensitive pixels may also be divided into two parts along the two diagonals of the focus photosensitive unit, i.e., the upper-left pixel and the lower-right pixel form one part, and the lower-left pixel and the upper-right pixel form the other part.
The above divisions of the 2*2 photosensitive pixels of the focus photosensitive unit are shown in Fig. 5, where the output values of the photosensitive pixels at the positions marked "1" in the focus photosensitive unit Gp can be read as the first output value.
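To make the three division schemes concrete, the following sketch splits a 2*2 focus unit into its two pixel groups. This is an illustration only, not part of the patent; the function name, the array layout and the mode labels are assumptions made for this example.

```python
import numpy as np

def split_unit(unit, mode):
    """Split a 2x2 focus photosensitive unit into two pixel groups.

    unit: 2x2 array laid out as [[top-left, top-right],
                                 [bottom-left, bottom-right]].
    mode: "lr" (left/right), "ud" (upper/lower) or "diag" (two diagonals),
    matching the three divisions described in the text.
    """
    if mode == "lr":
        return unit[:, 0], unit[:, 1]           # left column, right column
    if mode == "ud":
        return unit[0, :], unit[1, :]           # top row, bottom row
    if mode == "diag":
        return (np.array([unit[0, 0], unit[1, 1]]),   # main diagonal
                np.array([unit[1, 0], unit[0, 1]]))   # anti-diagonal
    raise ValueError("mode must be 'lr', 'ud' or 'diag'")
```

The first group plays the role of the "positions 1" pixels and the second group the "positions 2" pixels of Fig. 5.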
S43: reading output values of another part of the photosensitive pixels in the focus photosensitive unit as a second output value.
As shown in Fig. 5, when the output values of the photosensitive pixels at positions "1" in Fig. 5 are read as the first output value, the output values of the other part of the photosensitive pixels in the focus photosensitive unit, i.e., the pixels at positions "2", are read as the second output value.
Take as an example reading the output values of the left photosensitive pixels and of the right photosensitive pixels of the 2*2 photosensitive pixels of the focus photosensitive unit as the first output value and the second output value, respectively. As shown in Fig. 6, when the output values Gp30 and Gp32 of the two left photosensitive pixels of the focus photosensitive unit Gp are taken as the first output value, the output values Gp31 and Gp33 of the other part, i.e., the two right photosensitive pixels, are taken as the second output value.
S44: performing focus control according to the first output value and the second output value.
In the related art, to realize PDAF (Phase Detection Auto Focus), a structure of adjacent photosensitive pixels arranged in pairs within the image sensor (also known as masked pixels) is generally used. A masked pixel structure is more complex than an ordinary photosensitive pixel structure: the structure of the ordinary photosensitive pixel itself usually has to be changed, or a separate light-shielding part has to be added on top of the photosensitive pixel structure, so that among the light arriving at the masked pixel from multiple directions, light from a specific direction cannot reach the photosensitive part of the masked pixel, while light from the other directions can. In other words, masked pixels are usually arranged in adjacent, symmetric pairs, and the paired masked pixels are used to separate the light coming from multiple directions. The imaging beams arriving at the paired masked pixels from multiple directions are thus separated into, for example, a left part and a right part, and the distance the lens needs to move is calculated by comparing the phase difference between the images formed by the left and right parts of the light (i.e., by collecting the outputs of the paired masked pixels).
In the embodiments of the present disclosure, by contrast, one first microlens covers one focus photosensitive unit, and each focus photosensitive unit includes N*N photosensitive pixels, i.e., one first microlens corresponds to N*N photosensitive pixels. Therefore, phase difference information of the captured image can be obtained by comparing the light signals from different directions, and the distance information of the photographed object can further be obtained from the phase difference information, providing a data basis for phase focusing and depth-of-field information testing. Clearly, in the embodiments of the present disclosure, phase focus detection can be realized simply through the coordinated design of the microlens units, filter units and focus photosensitive units, without changing the structure of the ordinary photosensitive pixels themselves or adding a separate light-shielding part on top of the photosensitive pixel structure, so phase focus detection is implemented in a much simpler way.
As shown in Fig. 6, after the first output value and the second output value are obtained, the sum of the output values Gp30 and Gp32 of the two left photosensitive pixels can be computed, i.e., Gp1 = Gp30 + Gp32, to generate a first phase value Gp1. Likewise, the sum of the output values Gp31 and Gp33 of the two right photosensitive pixels can be computed, i.e., Gp2 = Gp31 + Gp33, to generate a second phase value Gp2. The phase difference information between Gp1 and Gp2 can then be obtained and converted into focus distance information, and the position of the lens is adjusted according to the focus distance information to realize phase focusing.
In the embodiments of the present disclosure, taking the output values of the left and right photosensitive pixels of the 2*2 photosensitive pixels of the focus photosensitive unit as the first and second output values allows phase difference information in the left-right direction to be detected; taking the output values of the upper and lower photosensitive pixels as the first and second output values allows phase difference information in the up-down direction to be detected; and taking the output values of the photosensitive pixels on the two diagonals of the focus photosensitive unit as the first and second output values allows phase difference information in the diagonal directions to be detected.
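As an illustration of the phase computation above, the sketch below forms Gp1 and Gp2 for one focus unit and then estimates the displacement between left and right phase profiles collected from a row of focus units. It is not part of the patent: the function names and the sum-of-absolute-differences search are assumptions chosen for this example, and the mapping from the recovered shift to an actual lens displacement (which requires calibration) is omitted.

```python
import numpy as np

def phase_values(unit):
    """unit: 2x2 array [[Gp30, Gp31], [Gp32, Gp33]] of pixel outputs.
    Returns (Gp1, Gp2) = (Gp30 + Gp32, Gp31 + Gp33), i.e. the left-side
    and right-side phase values of one focus photosensitive unit."""
    gp1 = unit[0, 0] + unit[1, 0]  # sum of the two left pixels
    gp2 = unit[0, 1] + unit[1, 1]  # sum of the two right pixels
    return gp1, gp2

def phase_shift(left, right, max_shift=8):
    """Estimate the displacement (in focus-unit pitches) between the left
    and right phase profiles by minimising the mean absolute difference
    over a range of trial shifts."""
    best_s, best_err = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        a = left[max(0, s):n + min(0, s)]      # left profile shifted by s
        b = right[max(0, -s):n + min(0, -s)]   # aligned slice of right profile
        err = float(np.mean(np.abs(a - b)))
        if err < best_err:
            best_s, best_err = s, err
    return best_s
```

In use, `phase_values` would be applied to every Gp unit in a region of interest to build the two profiles, and the sign and magnitude of the recovered shift indicate the direction and amount of defocus.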
In the focus control method proposed by the embodiments of the present disclosure, by reading the output values of different parts of the photosensitive pixels in the focus photosensitive unit, the phase information of light incident from different angles is obtained and phase information in different directions is detected, which improves the focusing speed in low light and makes focusing more accurate.
In the focus control method of the image sensor of the present disclosure, based on the structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, focus control is performed using the output values of one part of the photosensitive pixels in the focus photosensitive unit and the output values of another part, which improves the focusing speed.
In addition, based on the structure of the image sensor shown in Figs. 1-3, an embodiment of the present disclosure further provides an imaging method of an image sensor.
As shown in Fig. 7, the imaging method of the image sensor includes the following steps.
S71: controlling the photosensitive unit array to enter an imaging mode.
For example, when an object is photographed with a mobile phone camera, the photosensitive unit array enters the imaging mode once the camera is aimed at the object.
S72: controlling the focus photosensitive units and the non-focus photosensitive units to be exposed, and reading the output values of the focus photosensitive units and the non-focus photosensitive units.
Take the case where both the focus photosensitive units and the non-focus photosensitive units include 2*2 photosensitive pixels as an example. As shown in Fig. 8, blue B0, green G1, green G3 and red R4 form a Bayer RGB array. The focus photosensitive units and the non-focus photosensitive units are exposed, and the output values Gp30, Gp31, Gp32 and Gp33 of the focus photosensitive unit, as well as the output values B00, B01, B02, B03, Gb10, Gb11, Gb12, Gb13 and so on of the non-focus photosensitive units, are read.
S73: adding the output values of the N*N photosensitive pixels of the same focus photosensitive unit, or of the N*N photosensitive pixels of the same non-focus photosensitive unit, to obtain the pixel values of the focus photosensitive units and the non-focus photosensitive units, thereby generating a merged image.
As shown in Fig. 8, the output values Gp30, Gp31, Gp32 and Gp33 of the 2*2 photosensitive pixels of the same focus photosensitive unit are added, i.e., Gp30 + Gp31 + Gp32 + Gp33 = G3, to obtain the pixel value G3 of the focus photosensitive unit. The output values B00, B01, B02 and B03 of the 2*2 photosensitive pixels of the same non-focus photosensitive unit are added, i.e., B00 + B01 + B02 + B03 = B0, to obtain the pixel value B0 of that non-focus photosensitive unit. In the same way, the pixel values of the other non-focus photosensitive units are obtained: green G1 = Gb10 + Gb11 + Gb12 + Gb13, red R4 = R40 + R41 + R42 + R43, and so on. A merged image is generated from the pixel values of the focus photosensitive units and the non-focus photosensitive units.
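The 2*2 summation in steps S72-S73 amounts to pixel binning, which can be sketched in a few lines of NumPy. This is an illustration only (the function name is an assumption, and real sensors perform the summation in hardware or in the ISP rather than in Python):

```python
import numpy as np

def bin_2x2(raw):
    """Sum each 2x2 photosensitive unit of a raw mosaic into one pixel
    value, e.g. B0 = B00 + B01 + B02 + B03, producing the merged image."""
    h, w = raw.shape
    if h % 2 or w % 2:
        raise ValueError("raw mosaic dimensions must be even")
    # Split rows and columns into (block, within-block) pairs, then sum
    # over the within-block axes.
    return raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
```

Because each output pixel accumulates four exposures, the merged image trades resolution for the improved sensitivity and signal-to-noise ratio described below.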
In the imaging method of the image sensor proposed by the embodiments of the present disclosure, the sum of the output values of the N*N photosensitive pixels within a photosensitive unit is taken as the pixel value of that unit, and a merged image is generated from the pixel values of the focus photosensitive units and the non-focus photosensitive units, which effectively improves the imaging sensitivity and the signal-to-noise ratio of the image.
The imaging device of a further embodiment of the present disclosure is described below.
Fig. 9 is a block diagram of an imaging device according to an embodiment of the present disclosure. As shown in Fig. 9, the imaging device 900 includes the image sensor 910 of the above aspect and a control module 920.
The control module 920 controls the photosensitive unit array to enter a focus mode; reads output values of one part of the photosensitive pixels in the focus photosensitive unit as a first output value; reads output values of another part of the photosensitive pixels in the focus photosensitive unit as a second output value; and performs focus control according to the first output value and the second output value.
The control module 920 is specifically configured to: generate a first phase value according to the first output value; generate a second phase value according to the second output value; and perform focus control according to the first phase value and the second phase value.
The control module 920 is further configured to: control the photosensitive unit array to enter an imaging mode; control the focus photosensitive units and the non-focus photosensitive units to be exposed, and read the output values of the focus photosensitive units and the non-focus photosensitive units; and add the output values of the N*N photosensitive pixels of the same focus photosensitive unit, or of the N*N photosensitive pixels of the same non-focus photosensitive unit, to obtain the pixel values of the focus photosensitive units and the non-focus photosensitive units, thereby generating a merged image.
The imaging device of the present disclosure, based on the structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, performs focus control using the output values of one part of the photosensitive pixels in the focus photosensitive unit and the output values of another part, which effectively improves the focusing speed.
A further embodiment of the present disclosure also provides a mobile terminal.
As shown in Fig. 10, the mobile terminal includes a housing 101, a processor 102, a memory 103, a circuit board 104 and a power supply circuit 105. The circuit board 104 is arranged inside the space enclosed by the housing 101, and the processor 102 and the memory 103 are arranged on the circuit board 104; the power supply circuit 105 is configured to supply power to each circuit or component of the mobile terminal; the memory 103 is configured to store executable program code; and the processor 102 runs a program corresponding to the executable program code, by reading the executable program code stored in the memory 103, so as to perform the focus control method of the image sensor of the above aspect.
The mobile terminal of the embodiments of the present disclosure, based on the structure in which one first microlens of the image sensor covers one focus photosensitive unit and N*N second microlenses cover one non-focus photosensitive unit, performs focus control using the output values of one part of the photosensitive pixels in the focus photosensitive unit and the output values of another part, which effectively improves the focusing speed.
An embodiment of the present disclosure also provides a computer program product which, when the instructions in the computer program product are executed by a processor, implements the focus control method of the image sensor as described in the above embodiments.
An embodiment of the present disclosure also provides a computer readable storage medium having a computer program stored thereon which, when executed by a processor, implements the focus control method of the image sensor as described in the above embodiments.
It should be noted that, in this document, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between those entities or operations. Moreover, the terms "comprise", "include" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or device that includes that element.
The logic and/or steps represented in the flowcharts or otherwise described herein may, for example, be considered an ordered list of executable instructions for implementing logic functions, and may be embodied in any computer readable medium for use by, or in conjunction with, an instruction execution system, apparatus or device (such as a computer-based system, a system including a processor, or another system that can fetch instructions from an instruction execution system, apparatus or device and execute them). For the purposes of this specification, a "computer readable medium" can be any apparatus that can contain, store, communicate, propagate or transport a program for use by, or in conjunction with, an instruction execution system, apparatus or device. More specific examples (a non-exhaustive list) of computer readable media include the following: an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer readable medium may even be paper or another suitable medium on which the program is printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner when necessary, and can then be stored in a computer memory.
It should be understood that each part of the present disclosure may be implemented in hardware, software, firmware or a combination thereof. In the above embodiments, multiple steps or methods may be implemented with software or firmware that is stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented by any one of, or a combination of, the following techniques well known in the art: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and so on.
It should be noted that, in the description of this specification, reference to the terms "an embodiment", "some embodiments", "an example", "a specific example", "some examples" and the like means that a specific feature, structure, material or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the present disclosure. In this specification, schematic references to the above terms do not necessarily refer to the same embodiment or example. Furthermore, the specific features, structures, materials or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine the different embodiments or examples described in this specification, and the features of the different embodiments or examples, provided they do not contradict one another.
Although the embodiments of the present disclosure have been shown and described above, it can be understood that the above embodiments are exemplary and should not be construed as limiting the present disclosure, and those of ordinary skill in the art may make changes, modifications, substitutions and variations to the above embodiments within the scope of the present disclosure.
Claims (17)
- A focus control method of an image sensor, wherein the image sensor includes: a photosensitive unit array, a filter unit array arranged on the photosensitive unit array, and a microlens array located above the filter unit array, the microlens array including first microlenses and second microlenses, one first microlens covering one focus photosensitive unit, and N*N second microlenses covering one non-focus photosensitive unit, where N is a positive integer; the method includes the following steps: controlling the photosensitive unit array to enter a focus mode; reading output values of one part of the photosensitive pixels in the focus photosensitive unit as a first output value; reading output values of another part of the photosensitive pixels in the focus photosensitive unit as a second output value; and performing focus control according to the first output value and the second output value.
- The method of claim 1, wherein performing focus control according to the first output value and the second output value includes: generating a first phase value according to the first output value; generating a second phase value according to the second output value; and performing focus control according to the first phase value and the second phase value.
- The method of any one of claims 1-2, wherein the microlens array includes a horizontal center line and a vertical center line, there are a plurality of the first microlenses, and the plurality of first microlenses include: a first group of first microlenses arranged on the horizontal center line; and a second group of first microlenses arranged on the vertical center line.
- The method of claim 3, wherein the microlens array includes four edges, and the plurality of first microlenses further include: a third group of first microlenses arranged on the four edges.
- The method of claim 4, wherein the lens density of the first group of first microlenses and of the second group of first microlenses is greater than the lens density of the third group of first microlenses.
- The method of any one of claims 1-5, wherein the focus photosensitive unit and the non-focus photosensitive unit include N*N photosensitive pixels, and the method further includes: controlling the photosensitive unit array to enter an imaging mode; controlling the focus photosensitive unit and the non-focus photosensitive unit to be exposed, and reading the output values of the focus photosensitive unit and the non-focus photosensitive unit; and adding the output values of the N*N photosensitive pixels of the same focus photosensitive unit, or of the N*N photosensitive pixels of the same non-focus photosensitive unit, to obtain the pixel values of the focus photosensitive unit and the non-focus photosensitive unit, thereby generating a merged image.
- An image sensor, including: a photosensitive unit array; a filter unit array arranged on the photosensitive unit array; and a microlens array located above the filter unit array; wherein the microlens array includes first microlenses and second microlenses, one first microlens covers one focus photosensitive unit, and N*N second microlenses cover one non-focus photosensitive unit, where N is a positive integer.
- The image sensor of claim 7, wherein the microlens array includes a horizontal center line and a vertical center line, there are a plurality of the first microlenses, and the plurality of first microlenses include: a first group of first microlenses arranged on the horizontal center line; and a second group of first microlenses arranged on the vertical center line.
- The image sensor of claim 8, wherein the microlens array includes four edges, and the plurality of first microlenses further include: a third group of first microlenses arranged on the four edges.
- The image sensor of claim 9, wherein the lens density of the first group of first microlenses and of the second group of first microlenses is greater than the lens density of the third group of first microlenses.
- The image sensor of any one of claims 7-10, wherein the focus photosensitive unit and the non-focus photosensitive unit include N*N photosensitive pixels.
- An imaging device, including: the image sensor of any one of claims 7-11; and a control module, wherein the control module controls the photosensitive unit array to enter a focus mode; reads output values of one part of the photosensitive pixels in the focus photosensitive unit as a first output value; reads output values of another part of the photosensitive pixels in the focus photosensitive unit as a second output value; and performs focus control according to the first output value and the second output value.
- The imaging device of claim 12, wherein the control module is specifically configured to: generate a first phase value according to the first output value; generate a second phase value according to the second output value; and perform focus control according to the first phase value and the second phase value.
- The imaging device of any one of claims 12-13, wherein the control module is further configured to: control the photosensitive unit array to enter an imaging mode; control the focus photosensitive unit and the non-focus photosensitive unit to be exposed, and read the output values of the focus photosensitive unit and the non-focus photosensitive unit; and add the output values of the N*N photosensitive pixels of the same focus photosensitive unit, or of the N*N photosensitive pixels of the same non-focus photosensitive unit, to obtain the pixel values of the focus photosensitive unit and the non-focus photosensitive unit, thereby generating a merged image.
- A mobile terminal, including a housing, a processor, a memory, a circuit board and a power supply circuit, wherein the circuit board is arranged inside the space enclosed by the housing, and the processor and the memory are arranged on the circuit board; the power supply circuit is configured to supply power to each circuit or component of the mobile terminal; the memory is configured to store executable program code; and the processor runs a program corresponding to the executable program code, by reading the executable program code stored in the memory, so as to perform the focus control method of the image sensor of any one of claims 1 to 6.
- A computer program product, wherein, when the instructions in the computer program product are executed by a processor, the focus control method of the image sensor of any one of claims 1-6 is implemented.
- A computer readable storage medium having a computer program stored thereon, wherein, when the program is executed by a processor, the focus control method of the image sensor of any one of claims 1-6 is implemented.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18790001.4A EP3606026A4 (en) | 2017-04-28 | 2018-04-23 | IMAGE SENSOR, FOCUS CONTROL METHOD, IMAGING DEVICE, AND MOBILE TERMINAL |
US16/664,327 US11108943B2 (en) | 2017-04-28 | 2019-10-25 | Image sensor, focusing control method, and electronic device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710297448.8 | 2017-04-28 | ||
CN201710297448.8A CN107135340A (zh) | 2017-04-28 | 2017-09-05 | Image sensor, focus control method, imaging device and mobile terminal |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/664,327 Continuation US11108943B2 (en) | 2017-04-28 | 2019-10-25 | Image sensor, focusing control method, and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018196703A1 true WO2018196703A1 (zh) | 2018-11-01 |
Family
ID=59715130
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/084022 WO2018196703A1 (zh) | 2017-04-28 | 2018-04-23 | Image sensor, focus control method, imaging device and mobile terminal |
Country Status (4)
Country | Link |
---|---|
US (1) | US11108943B2 (zh) |
EP (1) | EP3606026A4 (zh) |
CN (1) | CN107135340A (zh) |
WO (1) | WO2018196703A1 (zh) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107105141B (zh) | 2017-04-28 | 2019-06-28 | Oppo广东移动通信有限公司 | Image sensor, image processing method, imaging device and mobile terminal |
CN107135340A (zh) | 2017-04-28 | 2017-09-05 | 广东欧珀移动通信有限公司 | Image sensor, focus control method, imaging device and mobile terminal |
CN109977706A (zh) * | 2017-12-27 | 2019-07-05 | 格科微电子(上海)有限公司 | Code-scanning device with phase focusing function |
CN109905600A (zh) * | 2019-03-21 | 2019-06-18 | 上海创功通讯技术有限公司 | Imaging method, imaging device and computer readable storage medium |
CN111818267A (zh) * | 2020-08-14 | 2020-10-23 | 深圳市汇顶科技股份有限公司 | Image sensor, optical module, focusing method and electronic device |
CN113691699B (zh) * | 2021-08-02 | 2023-06-20 | 维沃移动通信有限公司 | Imaging chip assembly, camera module, focusing method thereof and electronic device |
CN113992856A (zh) * | 2021-11-30 | 2022-01-28 | 维沃移动通信有限公司 | Image sensor, camera module and electronic device |
CN114157788A (zh) * | 2021-11-30 | 2022-03-08 | 信利光电股份有限公司 | Method for solving focusing failure when a camera shoots special texture shapes, camera module and shooting device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7745779B2 (en) * | 2008-02-08 | 2010-06-29 | Aptina Imaging Corporation | Color pixel arrays having common color filters for multiple adjacent pixels for use in CMOS imagers |
CN104064577A (zh) * | 2014-07-16 | 2014-09-24 | 上海集成电路研发中心有限公司 | Automatic-focusing image sensor |
CN105611124A (zh) * | 2015-12-18 | 2016-05-25 | 广东欧珀移动通信有限公司 | Image sensor, imaging method, imaging device and electronic device |
CN106549025A (zh) * | 2015-09-16 | 2017-03-29 | 台湾积体电路制造股份有限公司 | Microlenses for phase detection autofocus (PDAF) pixels of a composite grid structure |
CN107135340A (zh) * | 2017-04-28 | 2017-09-05 | 广东欧珀移动通信有限公司 | Image sensor, focus control method, imaging device and mobile terminal |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6638786B2 (en) | 2002-10-25 | 2003-10-28 | Hua Wei Semiconductor (Shanghai ) Co., Ltd. | Image sensor having large micro-lenses at the peripheral regions |
US8049801B2 (en) | 2006-09-14 | 2011-11-01 | Nikon Corporation | Image sensor and imaging apparatus |
JP4961993B2 (ja) * | 2006-12-18 | 2012-06-27 | 株式会社ニコン | Image sensor, focus detection device and imaging device |
JP2010252277A (ja) * | 2009-04-20 | 2010-11-04 | Panasonic Corp | Solid-state imaging device and electronic camera |
WO2012039180A1 (ja) * | 2010-09-24 | 2012-03-29 | 富士フイルム株式会社 | Imaging device and imaging apparatus |
JP5672989B2 (ja) | 2010-11-05 | 2015-02-18 | ソニー株式会社 | Imaging apparatus |
JP2013066140A (ja) * | 2011-08-31 | 2013-04-11 | Sony Corp | Imaging apparatus, signal processing method, and program |
US9973678B2 (en) * | 2015-01-14 | 2018-05-15 | Invisage Technologies, Inc. | Phase-detect autofocus |
US10044959B2 (en) * | 2015-09-24 | 2018-08-07 | Qualcomm Incorporated | Mask-less phase detection autofocus |
CN105609516B (zh) * | 2015-12-18 | 2019-04-12 | Oppo广东移动通信有限公司 | Image sensor and output method, phase focusing method, imaging device and terminal |
CN105611122B (zh) * | 2015-12-18 | 2019-03-01 | Oppo广东移动通信有限公司 | Image sensor and output method, phase focusing method, imaging device and terminal |
-
2017
- 2017-04-28 CN CN201710297448.8A patent/CN107135340A/zh active Pending
-
2018
- 2018-04-23 EP EP18790001.4A patent/EP3606026A4/en not_active Withdrawn
- 2018-04-23 WO PCT/CN2018/084022 patent/WO2018196703A1/zh unknown
-
2019
- 2019-10-25 US US16/664,327 patent/US11108943B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3606026A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20200059593A1 (en) | 2020-02-20 |
CN107135340A (zh) | 2017-09-05 |
US11108943B2 (en) | 2021-08-31 |
EP3606026A1 (en) | 2020-02-05 |
EP3606026A4 (en) | 2020-02-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018196703A1 (zh) | Image sensor, focus control method, imaging device and mobile terminal | |
TWI636314B (zh) | Dual-core focusing image sensor, focus control method thereof, imaging device and mobile terminal | |
US10764522B2 (en) | Image sensor, output method, phase focusing method, imaging device, and terminal | |
TWI651582B (zh) | Image sensor, image processing method, imaging device and mobile terminal | |
WO2018196704A1 (zh) | Dual-core focusing image sensor, focus control method thereof and imaging device | |
US10594962B2 (en) | Image sensor, imaging device, mobile terminal and imaging method for producing high resolution image | |
CN106982329B (zh) | Image sensor, focus control method, imaging device and mobile terminal | |
CN107040702B (zh) | Image sensor, focus control method, imaging device and mobile terminal | |
WO2018099011A1 (zh) | Image processing method, image processing device, imaging device and electronic device | |
CN107105140B (zh) | Dual-core focusing image sensor, focus control method thereof and imaging device | |
WO2018098982A1 (zh) | Image processing method, image processing device, imaging device and electronic device | |
CN107124536B (zh) | Dual-core focusing image sensor, focus control method thereof and imaging device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18790001 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2018790001 Country of ref document: EP Effective date: 20191024 |