CN107040702B - Image sensor, focusing control method, imaging device and mobile terminal - Google Patents

Publication number: CN107040702B (application CN201710296853.8A)
Authority: CN (China)
Prior art keywords: photosensitive unit, focusing, microlens
Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number: CN201710296853.8A
Other languages: Chinese (zh)
Other versions: CN107040702A
Inventor: 曾元清
Current assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis)
Original assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority: CN201710296853.8A
Published as CN107040702A; application granted; granted publication CN107040702B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils (under H04N 23/50, Constructional details)
    • H04N 23/67 Focus control based on electronic image sensor signals (under H04N 23/60, Control of cameras or camera modules)

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

The invention discloses an image sensor, a focusing control method, an imaging device, and a mobile terminal. The microlens array of the image sensor comprises first microlenses, second microlenses, and third microlenses; the first and second microlenses form a double-layer structure, one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit, where N is a positive integer. The image sensor of the embodiments of the invention can focus more light onto the photosensitive pixels of the focusing photosensitive unit, improving the focusing speed in a dim-light environment. The invention also discloses a focusing control method for the image sensor, an imaging device, and a mobile terminal.

Description

Image sensor, focusing control method, imaging device and mobile terminal
Technical Field
The invention relates to the technical field of electronics, in particular to an image sensor, a focusing control method, an imaging device and a mobile terminal.
Background
As technology continues to advance, more and more manufacturers have begun to adopt image sensors with a 16M-4M architecture. Under dim-light conditions the sensor outputs images in 4M mode, which improves the signal-to-noise ratio and noise performance. When the ambient illumination is good, it outputs images in 16M mode, and an interpolation-based restoration algorithm can produce images with higher definition. However, under dim-light conditions, focusing in 4M mode is slow.
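The signal-to-noise benefit of the 4M mode can be sanity-checked with a short simulation. This sketch is illustrative only and not part of the patent: all names and numbers are hypothetical, and it simply models photon shot noise to show that summing 2 × 2 pixel outputs roughly doubles the SNR.

```python
import math
import random

def snr_gain_from_binning(n_pixels=4, signal=100.0, trials=20000, seed=1):
    """Illustrative only: summing n pixel outputs multiplies the signal by n
    but the shot noise only by sqrt(n), so 4-to-1 binning (the 4M mode)
    improves the signal-to-noise ratio by roughly a factor of two."""
    rng = random.Random(seed)

    def measure(k):
        # Each reading: sum of k noisy pixel outputs (Gaussian shot-noise model).
        return [sum(rng.gauss(signal, math.sqrt(signal)) for _ in range(k))
                for _ in range(trials)]

    def snr(samples):
        mean = sum(samples) / len(samples)
        var = sum((s - mean) ** 2 for s in samples) / len(samples)
        return mean / math.sqrt(var)

    return snr(measure(n_pixels)) / snr(measure(1))

gain = snr_gain_from_binning()  # close to 2.0 for 2 x 2 binning
```

Under this shot-noise assumption the gain is sqrt(4) = 2, which is why the 4M mode performs better in dim light.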
Disclosure of Invention
The object of the present invention is to solve, at least to some extent, one of the above-mentioned technical problems.
To solve the above problem, one aspect of the present invention provides a focusing control method for an image sensor, wherein the image sensor comprises a photosensitive unit array, a filter unit array arranged on the photosensitive unit array, and a microlens array above the filter unit array, in which the first and second microlenses form a double-layer structure, one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit. The focusing control method comprises the following steps: controlling the photosensitive unit array to enter a focusing mode; reading the output values of one part of the photosensitive pixels in the focusing photosensitive unit as a first output value; reading the output values of another part of the photosensitive pixels in the focusing photosensitive unit as a second output value; and performing focusing control according to the first output value and the second output value.
In the focusing control method of this embodiment, because the first and second microlenses of the image sensor form a double-layer structure, one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit, more light can be focused onto the photosensitive pixels of the focusing photosensitive unit, improving the focusing speed in a dim-light environment.
To solve the above problem, another aspect of the present invention provides an image sensor comprising: a photosensitive unit array; a filter unit array arranged on the photosensitive unit array; and a microlens array positioned above the filter unit array. The microlens array comprises first, second, and third microlenses; the first and second microlenses form a double-layer structure, one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit, where N is a positive integer.
Because the first and second microlenses form a double-layer structure, one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit, the image sensor can focus more light onto the photosensitive pixels of the focusing photosensitive unit, providing a hardware basis for improving picture quality and focusing speed.
An embodiment of another aspect of the present invention provides an imaging apparatus comprising the image sensor described above and a control module. The control module controls the photosensitive unit array to enter a focusing mode; reads the output values of one part of the photosensitive pixels in the focusing photosensitive unit as a first output value; reads the output values of another part of the photosensitive pixels in the focusing photosensitive unit as a second output value; and performs focusing control according to the first output value and the second output value.
In this imaging device, the first and second microlenses of the image sensor form a double-layer structure: one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit, so that more light can be focused onto the photosensitive pixels of the focusing photosensitive unit, improving the focusing speed in a dim-light environment.
The invention also provides a mobile terminal comprising a housing, a processor, a memory, a circuit board, and a power supply circuit. The circuit board is disposed inside the space enclosed by the housing, and the processor and the memory are disposed on the circuit board; the power supply circuit supplies power to each circuit or device of the mobile terminal; the memory stores executable program code; and the processor, by reading the executable program code stored in the memory, runs a program corresponding to that code to execute the focusing control method of the image sensor described above.
In the mobile terminal provided by the embodiment of the invention, the first and second microlenses of the image sensor form a double-layer structure: one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit, so that more light can be focused onto the photosensitive pixels of the focusing photosensitive unit, improving the focusing speed in a dim-light environment.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a cross-sectional view of an image sensor according to one embodiment of the invention;
FIG. 2 is a top view of an image sensor in which both the focusing photosensitive cells and the non-focusing photosensitive cells include 2 x 2 photosensitive pixels according to one embodiment of the present invention;
FIG. 3 is a schematic diagram of the distribution of focus sensitive cells in an image sensor according to one embodiment of the present invention;
FIG. 4 is a flowchart of a focus control method of an image sensor according to one embodiment of the present invention;
FIG. 5 is a schematic diagram illustrating the division of the 2 × 2 photosensitive pixels of the focusing photosensitive unit according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating the effect of data processing when an image sensor is in focus according to an embodiment of the invention;
FIG. 7 is a flow chart of a method of imaging an image sensor according to one embodiment of the invention;
FIG. 8 is a diagram illustrating the effect of data processing during imaging of an image sensor according to an embodiment of the present invention;
FIG. 9 is a block diagram of an imaging device according to one embodiment of the invention;
FIG. 10 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
An image sensor, a focus control method, an imaging device, and a mobile terminal according to embodiments of the present invention are described below with reference to the accompanying drawings.
Fig. 1 is a cross-sectional view of an image sensor according to an embodiment of the present invention, and fig. 2 is a top view of an image sensor in which a focusing photosensitive unit and a non-focusing photosensitive unit each include 2 × 2 photosensitive pixels according to an embodiment of the present invention.
As shown in fig. 1 and 2, the image sensor 100 includes a photosensitive cell array 10, a filter cell array 20, and a microlens array 30.
The filter unit array 20 is disposed on the photosensitive unit array 10, and the microlens array 30 is disposed on the filter unit array 20. The photosensitive unit array 10 includes a plurality of focusing photosensitive units 11 and a plurality of non-focusing photosensitive units 12, each of which includes N × N photosensitive pixels 110. The microlens array 30 includes first microlenses 31, second microlenses 32, and third microlenses 33. The first microlenses 31 and the second microlenses 32 form a double-layer structure: one first microlens 31 covers N × N second microlenses 32, the N × N second microlenses 32 cover one focusing photosensitive unit 11, and N × N third microlenses 33 cover one non-focusing photosensitive unit 12, where N is a positive integer. The first microlenses 31 and the third microlenses 33 are circular, and the second microlenses 32 are rectangular.
For example, in fig. 2, one first microlens 31 covers 2 × 2 second microlenses 32, and each of the focusing photosensitive unit 11 and the non-focusing photosensitive unit 12 includes 2 × 2 photosensitive pixels 110.
In the image sensor of this embodiment, the focusing photosensitive unit is covered by two layers of microlenses, so that more light can be focused onto the focusing photosensitive unit; and since the first microlens covers N × N second microlenses, more light can be focused onto each photosensitive pixel of the focusing photosensitive unit, improving the focusing speed in a dark environment.
In one embodiment of the present invention, as shown in Fig. 3, the microlens array 30 has a horizontal center line, a vertical center line, and four edges, and includes a plurality of first microlenses 31. The plurality of first microlenses 31 comprises a first group of first microlenses 31 disposed on the horizontal center line, a second group of first microlenses 31 disposed on the vertical center line, and a third group of first microlenses 31 disposed along the four edges of the microlens array 30.
Since one first microlens 31 covers N × N second microlenses 32, the distribution of the second microlenses 32 corresponds to the distribution of the first microlenses 31 in fig. 3.
As can be seen from Fig. 3, the focusing photosensitive units 11 covered by first microlenses 31 (labeled Gp in the figure) are scattered across the entire image sensor and account for 3% to 5% of the total number of pixels. The Gp units are distributed more densely in the central area of the sensor and more sparsely toward the edges, so phase information at the center of the image is acquired preferentially, which effectively increases the focusing speed without affecting image quality.
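As a rough illustration of this center-dense, edge-sparse layout (not taken from the patent; the function name and density values are hypothetical), one can generate a mask that scatters focusing units over roughly 3-5% of the sensor, with a probability that falls off toward the edges:

```python
import random

def focus_point_mask(rows, cols, center_density=0.05, edge_density=0.03, seed=0):
    """Scatter focusing units (Gp) over a rows x cols grid of units:
    probability center_density at the sensor center, falling linearly
    with distance to edge_density at the corners (cf. Fig. 3)."""
    rng = random.Random(seed)
    cy, cx = (rows - 1) / 2, (cols - 1) / 2
    max_d = (cy ** 2 + cx ** 2) ** 0.5
    mask = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Normalized distance from center: 0 at center, 1 at a corner.
            d = ((r - cy) ** 2 + (c - cx) ** 2) ** 0.5 / max_d
            p = center_density + (edge_density - center_density) * d
            row.append(rng.random() < p)
        mask.append(row)
    return mask

mask = focus_point_mask(100, 100)
frac = sum(map(sum, mask)) / (100 * 100)  # fraction of units that are Gp
```

The resulting fraction of focusing units lands between the 3% and 5% figures given above, with the density highest near the center.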
The higher the lens density, the higher the refractive index of the lens and the stronger its light-gathering ability, so the focusing photosensitive units in the central area can gather more light, improving the focusing speed and shooting effect. In an embodiment of the invention, the lens density of the first and second groups of first microlenses can be made greater than that of the third group, so that the amount of incoming light at the focusing photosensitive units in the central area is larger than at the edges, thereby improving the focusing speed and the shooting effect.
Because the first and second microlenses form a double-layer structure, one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit, the image sensor can focus more light onto the photosensitive pixels of the focusing photosensitive unit, providing a hardware basis for improving picture quality and focusing speed.
Based on the image sensor structure of Figs. 1-3, a focusing control method of the image sensor according to an embodiment of the present invention is described below. Fig. 4 is a flowchart of the focusing control method; as shown in Fig. 4, the method includes the following steps:
S41, controlling the photosensitive unit array to enter a focusing mode.
For example, when an object is photographed with a mobile phone, the user aims at the object and taps the screen to focus, upon which the photosensitive unit array enters the focusing mode.
S42, the output values of a part of the photosensitive pixels in the focusing photosensitive unit are read as the first output values.
After entering the focusing mode, the output values of one part of the photosensitive pixels in the focusing photosensitive unit are read as the first output value. Take the case where the focusing photosensitive unit includes 2 × 2 photosensitive pixels as an example.
In an embodiment of the present invention, the 2 × 2 photosensitive pixels in the focusing photosensitive unit may be divided into left and right halves, with one part being the two left photosensitive pixels; that is, the output values of the two left photosensitive pixels of the focusing photosensitive unit are used as the first output value.
In another embodiment, the 2 × 2 photosensitive pixels may be divided into upper and lower halves, with one part being the two upper photosensitive pixels; that is, the output values of the two upper photosensitive pixels of the focusing photosensitive unit are used as the first output value.
In another embodiment, the two diagonals of the focusing photosensitive unit may also divide its 2 × 2 photosensitive pixels into two parts: the upper-left and lower-right photosensitive pixels form one part, and the lower-left and upper-right photosensitive pixels form the other.
With any of the above divisions of the 2 × 2 photosensitive pixels of the focusing photosensitive unit, as shown in Fig. 5, the output values of the photosensitive pixels at "1" in the focusing photosensitive unit Gp can be read as the first output value.
S43, reading the output value of the other part of photosensitive pixels in the focusing photosensitive unit as a second output value.
As shown in Fig. 5, when the output values of the photosensitive pixels at "1" are read as the first output value, the output values of the other part of the photosensitive pixels in the focusing photosensitive unit, i.e., the photosensitive pixels at "2", are read as the second output value.
Take reading the output values of the left and right photosensitive pixels of the 2 × 2 focusing photosensitive unit as the first and second output values, respectively. As shown in Fig. 6, for the focusing photosensitive unit Gp, the output values Gp30 and Gp32 of the two left photosensitive pixels serve as the first output value, and the output values Gp31 and Gp33 of the other part, the two right photosensitive pixels, serve as the second output value.
S44, performing focusing control according to the first output value and the second output value.
In the related art, phase detection auto focus (PDAF) is usually implemented with adjacent, paired photosensitive pixels in the image sensor, also called shielded pixels. A shielded pixel is structurally more complex than an ordinary photosensitive pixel: the structure of the ordinary pixel must be changed, or a separate light-shielding portion must be added on top of it, so that light arriving from one specific direction cannot reach the photosensitive portion of the shielded pixel, while light from the other directions can. Shielded pixels are therefore generally arranged in adjacent, symmetric pairs, and each pair separates the incoming imaging light into two parts, for example a left part and a right part. By comparing the phase difference of the left and right light after imaging, i.e., by collecting the outputs of the paired shielded pixels, the distance the lens needs to move is calculated.
In the embodiment of the present invention, one first microlens of the double-layer structure and N × N second microlenses cover one focusing photosensitive unit, and each focusing photosensitive unit includes N × N photosensitive pixels; that is, one first microlens and N × N second microlenses correspond to N × N photosensitive pixels. Phase difference information of the image can therefore be obtained by comparing light signals from different directions, and distance information of the photographed object can be derived from that phase difference information, providing a data basis for phase focusing and depth-of-field measurement. Clearly, in this embodiment, phase focus detection is achieved purely through the cooperative design of the microlens units, the filter units, and the focusing photosensitive units, without changing the structure of ordinary photosensitive pixels or adding a separate light-shielding portion, so phase focus detection is simpler to implement.
As shown in Fig. 6, after the first output value and the second output value are acquired, the output values Gp30 and Gp32 of the two left photosensitive pixels are summed, i.e., Gp1 = Gp30 + Gp32, generating the first phase value Gp1. Similarly, the output values Gp31 and Gp33 of the two right photosensitive pixels are summed, i.e., Gp2 = Gp31 + Gp33, generating the second phase value Gp2. The phase difference information between Gp1 and Gp2 can then be acquired and converted into focusing distance information, and the lens position is adjusted according to that information to achieve phase focusing; this makes phase focus detection simpler to implement.
In the embodiment of the invention, using the output values of the left and right photosensitive pixels of the 2 × 2 focusing photosensitive unit as the first and second output values detects phase difference information in the left-right direction; using the output values of the upper and lower photosensitive pixels detects phase difference information in the up-down direction; and using the output values of the photosensitive pixels on the two diagonals detects oblique phase difference information.
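A minimal sketch of the three division schemes, assuming the 2 × 2 outputs are laid out as [[a, b], [c, d]] (the function name and layout are hypothetical, not from the patent):

```python
def split_unit(unit, direction):
    """Split the 2 x 2 outputs [[a, b], [c, d]] of a focusing photosensitive
    unit into the two pixel groups whose sums are compared during phase
    detection: left/right, up/down, or along the two diagonals."""
    (a, b), (c, d) = unit
    if direction == "horizontal":      # detects left-right phase difference
        return (a, c), (b, d)
    if direction == "vertical":        # detects up-down phase difference
        return (a, b), (c, d)
    if direction == "diagonal":        # detects oblique phase difference
        return (a, d), (b, c)
    raise ValueError("unknown direction: " + direction)

groups = split_unit([[1, 2], [3, 4]], "diagonal")  # ((1, 4), (2, 3))
```

Summing each returned group and differencing the sums then yields the phase difference for that direction.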
According to the focusing control method provided by the embodiment of the invention, phase information of incident light at different angles is obtained by reading the output values of different parts of the photosensitive pixels in the focusing photosensitive unit, and phase detection is performed in different directions, so focusing under dim light is faster and more accurate.
In the focusing control method of this image sensor, because the first and second microlenses form a double-layer structure, one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit, more light can be focused onto the photosensitive pixels of the focusing photosensitive unit, improving the focusing speed in a dim-light environment.
In addition, based on the structures of the image sensors in fig. 1 to 3, the embodiment of the invention also provides an imaging method of the image sensor.
As shown in fig. 7, the imaging method of the image sensor includes:
S71, controlling the photosensitive unit array to enter an imaging mode.
For example, a photograph is taken with the camera of a mobile phone; when the camera is aimed at the object, the photosensitive unit array enters the imaging mode.
S72, controlling the focusing photosensitive unit and the non-focusing photosensitive unit to expose, and reading the output values of the focusing photosensitive unit and the non-focusing photosensitive unit.
Take the case where the focusing photosensitive unit and the non-focusing photosensitive unit each include 2 × 2 photosensitive pixels. In one embodiment of the present invention, as shown in Fig. 8, blue B0, green G1, green G3, and red R4 form a Bayer RGB array. The focusing and non-focusing photosensitive units are exposed, and the output values Gp30, Gp31, Gp32, and Gp33 of the focusing photosensitive unit and the output values B00, B01, B02, B03, Gb10, Gb11, Gb12, Gb13, and so on of the non-focusing photosensitive units are read.
S73, adding the output values of the N × N photosensitive pixels of the same focusing photosensitive unit or the same non-focusing photosensitive unit to obtain the pixel values of the focusing and non-focusing photosensitive units, so as to generate a combined image.
As shown in Fig. 8, the output values Gp30, Gp31, Gp32, and Gp33 of the 2 × 2 photosensitive pixels of the same focusing photosensitive unit are added, i.e., Gp30 + Gp31 + Gp32 + Gp33 = G3, giving the pixel value G3 of the focusing photosensitive unit. The output values B00, B01, B02, and B03 of the 2 × 2 photosensitive pixels of the same non-focusing photosensitive unit are added, i.e., B00 + B01 + B02 + B03 = B0, giving the pixel value B0 of that non-focusing photosensitive unit. Similarly, the other pixel values of the non-focusing photosensitive units are obtained: green G1 = Gb10 + Gb11 + Gb12 + Gb13, red R4 = R40 + R41 + R42 + R43, and so on. A combined image is then generated from the pixel values of the focusing and non-focusing photosensitive units.
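The binning step of S73 can be sketched as follows; the function names and values are illustrative only, not from the patent:

```python
def bin_unit(outputs):
    """Sum the N x N photosensitive-pixel outputs of one photosensitive unit
    to obtain that unit's pixel value in the combined image,
    e.g. G3 = Gp30 + Gp31 + Gp32 + Gp33 (cf. Fig. 8)."""
    return sum(sum(row) for row in outputs)

def combine_image(units):
    """Bin every unit of a 2-D grid of photosensitive units, producing
    one pixel of the combined image per unit."""
    return [[bin_unit(u) for u in row] for row in units]

# Illustrative 2 x 2 grid of units, each holding 2 x 2 pixel outputs.
sensor = [
    [[[1, 2], [3, 4]], [[5, 6], [7, 8]]],
    [[[1, 1], [1, 1]], [[2, 2], [2, 2]]],
]
combined = combine_image(sensor)  # [[10, 26], [4, 8]]
```

Each combined pixel aggregates the charge of four physical pixels, which is the source of the sensitivity and signal-to-noise improvements noted below.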
According to the imaging method of the image sensor provided by the embodiment of the invention, the sum of the output values of the N × N photosensitive pixels in a photosensitive unit is used as the pixel value of that unit, and the combined image is generated from the pixel values of the focusing and non-focusing photosensitive units, which effectively improves imaging sensitivity and the signal-to-noise ratio of the image.
An imaging apparatus according to an embodiment of still another aspect of the present invention is described below.
Fig. 9 is a block diagram of an imaging apparatus according to an embodiment of the present invention. As shown in Fig. 9, the imaging apparatus 900 includes the image sensor 910 of the above embodiments and a control module 920.
The control module 920 controls the photosensitive unit array to enter a focusing mode; reads the output values of one part of the photosensitive pixels in the focusing photosensitive unit as a first output value; reads the output values of another part of the photosensitive pixels in the focusing photosensitive unit as a second output value; and performs focusing control according to the first output value and the second output value.
The control module 920 is specifically configured to: generating a first phase value according to the first output value; generating a second phase value according to the second output value; and carrying out focusing control according to the first phase value and the second phase value.
The control module 920 is further configured to: controlling the photosensitive unit array to enter an imaging mode; controlling a focusing photosensitive unit and a non-focusing photosensitive unit to be exposed, and reading output values of the focusing photosensitive unit and the non-focusing photosensitive unit; and adding the output values of the N × N photosensitive pixels of the same focusing photosensitive unit or the N × N photosensitive pixels of the same non-focusing photosensitive unit to obtain the pixel values of the focusing photosensitive unit and the non-focusing photosensitive unit so as to generate a combined image.
In this imaging device, the first and second microlenses of the image sensor form a double-layer structure: one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit, so that more light can be focused onto the photosensitive pixels of the focusing photosensitive unit, improving the focusing speed in a dim-light environment.
In another aspect, an embodiment of the present invention further provides a mobile terminal.
As shown in fig. 10, the mobile terminal includes a housing 101, a processor 102, a memory 103, a circuit board 104, and a power supply circuit 105, wherein the circuit board 104 is disposed inside a space enclosed by the housing 101, and the processor 102 and the memory 103 are disposed on the circuit board 104; a power supply circuit 105 for supplying power to each circuit or device of the mobile terminal; the memory 103 is used for storing executable program codes; the processor 102 runs a program corresponding to the executable program code by reading the executable program code stored in the memory 103 for executing the focus control method of the image sensor of the above-described aspect.
In the mobile terminal provided by the embodiment of the invention, the first and second microlenses of the image sensor form a double-layer structure: one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit, and N × N third microlenses cover one non-focusing photosensitive unit, so that more light can be focused onto the photosensitive pixels of the focusing photosensitive unit, improving the focusing speed in a dim-light environment.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It should be noted that in the description of the present specification, reference to the description of the term "one embodiment", "some embodiments", "an example", "a specific example", or "some examples", etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it should be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; those of ordinary skill in the art can make variations, modifications, substitutions, and alterations to the above embodiments within the scope of the present invention.

Claims (14)

1. A focus control method of an image sensor, the image sensor comprising a photosensitive unit array, a filter unit array disposed on the photosensitive unit array, and a microlens array disposed above the filter unit array, wherein the microlens array comprises first microlenses, second microlenses, and third microlenses, the first microlenses and the second microlenses form a double-layer structure, one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit so that more light is converged onto each photosensitive pixel of the focusing photosensitive unit, N × N third microlenses cover one non-focusing photosensitive unit, and the focusing photosensitive unit and the non-focusing photosensitive unit each comprise N × N photosensitive pixels, where N is a positive integer, the method comprising:
controlling the photosensitive unit array to enter a focusing mode;
reading output values of one part of the photosensitive pixels in the focusing photosensitive unit as a first output value;
reading output values of the other part of the photosensitive pixels in the focusing photosensitive unit as a second output value; and
performing focus control according to the first output value and the second output value, wherein performing focus control according to the first output value and the second output value specifically comprises: generating a first phase value according to the first output value; generating a second phase value according to the second output value; and performing focus control according to the first phase value and the second phase value.
2. The method of claim 1, wherein the first microlenses and the third microlenses are circular and the second microlenses are rectangular.
3. The method of claim 1, wherein the microlens array has a horizontal centerline and a vertical centerline, and there are a plurality of the first microlenses, the plurality of first microlenses comprising:
a first set of first microlenses disposed at the horizontal centerline; and
a second set of first microlenses disposed at the vertical centerline.
4. The method of claim 3, wherein the microlens array has four edges, the plurality of first microlenses further comprising:
and a third group of first microlenses arranged on the four edge lines.
5. The method of claim 4, wherein the first set of first microlenses and the second set of first microlenses have a lens density greater than a lens density of the third set of first microlenses.
6. The method of claim 1, wherein the method further comprises:
controlling the photosensitive unit array to enter an imaging mode;
controlling the focusing photosensitive unit and the non-focusing photosensitive unit to be exposed, and reading output values of the focusing photosensitive unit and the non-focusing photosensitive unit;
and summing the output values of the N × N photosensitive pixels of the same focusing photosensitive unit or of the same non-focusing photosensitive unit to obtain the pixel values of the focusing photosensitive unit and the non-focusing photosensitive unit, so as to generate a combined image.
7. An image sensor, comprising:
an array of photosensitive cells;
the light filtering unit array is arranged on the photosensitive unit array;
a micro lens array positioned above the filter unit array;
wherein the microlens array comprises first microlenses, second microlenses, and third microlenses, the first microlenses and the second microlenses form a double-layer structure, one first microlens covers N × N second microlenses, the N × N second microlenses cover one focusing photosensitive unit so that more light is converged onto each photosensitive pixel of the focusing photosensitive unit, N × N third microlenses cover one non-focusing photosensitive unit, and the focusing photosensitive unit and the non-focusing photosensitive unit each comprise N × N photosensitive pixels, where N is a positive integer.
8. The image sensor of claim 7, wherein the first microlenses and the third microlenses are circular and the second microlenses are rectangular.
9. The image sensor of claim 7, wherein the microlens array has a horizontal centerline and a vertical centerline, and there are a plurality of the first microlenses, the plurality of first microlenses comprising:
a first set of first microlenses disposed at the horizontal centerline; and
a second set of first microlenses disposed at the vertical centerline.
10. The image sensor of claim 9, wherein the microlens array has four edges, the plurality of first microlenses further comprising:
and a third group of first microlenses arranged on the four edge lines.
11. The image sensor of claim 10, wherein the first set of first microlenses and the second set of first microlenses have a lens density greater than a lens density of the third set of first microlenses.
12. An imaging apparatus, comprising:
the image sensor of any one of claims 7-11; and
a control module configured to:
control the photosensitive unit array to enter a focusing mode;
read output values of one part of the photosensitive pixels in the focusing photosensitive unit as a first output value;
read output values of the other part of the photosensitive pixels in the focusing photosensitive unit as a second output value; and
perform focus control according to the first output value and the second output value;
wherein the control module is specifically configured to: generate a first phase value according to the first output value; generate a second phase value according to the second output value; and perform focus control according to the first phase value and the second phase value.
13. The imaging apparatus of claim 12, wherein the control module is further configured to:
control the photosensitive unit array to enter an imaging mode;
control the focusing photosensitive unit and the non-focusing photosensitive unit to be exposed, and read output values of the focusing photosensitive unit and the non-focusing photosensitive unit; and
sum the output values of the N × N photosensitive pixels of the same focusing photosensitive unit or of the same non-focusing photosensitive unit to obtain the pixel values of the focusing photosensitive unit and the non-focusing photosensitive unit, so as to generate a combined image.
14. A mobile terminal, comprising a housing, a processor, a memory, a circuit board, and a power supply circuit, wherein the circuit board is disposed inside a space enclosed by the housing, and the processor and the memory are disposed on the circuit board; the power supply circuit is configured to supply power to each circuit or device of the mobile terminal; the memory is configured to store executable program code; and the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform the focus control method of the image sensor according to any one of claims 1 to 6.
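The phase-detection focusing of claim 1 — splitting a focusing unit's pixel outputs into two groups, deriving a phase value from each, and controlling focus from their difference — can be sketched roughly as follows. This is an illustrative reconstruction, not the patented implementation: the left/right column split, the function names `phase_values` and `phase_difference`, and the cross-correlation shift estimate are all assumptions.

```python
import numpy as np

def phase_values(unit: np.ndarray):
    """Split one focusing unit's N x N pixel outputs into two halves and
    reduce each half to a phase profile (here: the column-wise mean)."""
    n = unit.shape[1]
    first = unit[:, : n // 2].mean(axis=0)   # "first output value" -> first phase value
    second = unit[:, n // 2 :].mean(axis=0)  # "second output value" -> second phase value
    return first, second

def phase_difference(first: np.ndarray, second: np.ndarray) -> int:
    """Estimate the shift between the two phase profiles by cross-correlation;
    a zero shift means the scene is in focus, a nonzero shift drives the lens."""
    corr = np.correlate(first - first.mean(), second - second.mean(), mode="full")
    return int(np.argmax(corr)) - (len(first) - 1)

unit = np.arange(16, dtype=float).reshape(4, 4)  # one hypothetical 4x4 focusing unit
f, s = phase_values(unit)
shift = phase_difference(f, s)  # in a real system this would drive the focus motor
```

In an actual sensor the two pixel groups see the scene through opposite sides of the first microlens, so the sign and magnitude of the shift indicate the direction and distance the lens must move.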
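The imaging mode of claims 6 and 13 — summing the N × N pixel outputs of each photosensitive unit into one pixel of the combined image — amounts to block-wise binning. A minimal sketch; the function name `combine` and the reshape-based blocking are assumptions, not taken from the patent text:

```python
import numpy as np

def combine(sensor: np.ndarray, n: int) -> np.ndarray:
    """sensor: raw output values, shape (H, W) with H and W multiples of n.
    Returns the combined image, one summed value per n x n photosensitive unit."""
    h, w = sensor.shape
    units = sensor.reshape(h // n, n, w // n, n)
    return units.sum(axis=(1, 3))  # sum the n x n pixel outputs of each unit

raw = np.ones((4, 4))        # a 2x2 grid of 2x2 units, i.e. N = 2
combined = combine(raw, 2)   # each combined pixel is 1 + 1 + 1 + 1 = 4
```

Binning this way trades spatial resolution for signal level, which is why the combined image collects "more light" per output pixel.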
CN201710296853.8A 2017-04-28 2017-04-28 Image sensor, focusing control method, imaging device and mobile terminal Active CN107040702B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710296853.8A CN107040702B (en) 2017-04-28 2017-04-28 Image sensor, focusing control method, imaging device and mobile terminal


Publications (2)

Publication Number Publication Date
CN107040702A CN107040702A (en) 2017-08-11
CN107040702B true CN107040702B (en) 2020-06-05

Family

ID=59537876

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710296853.8A Active CN107040702B (en) 2017-04-28 2017-04-28 Image sensor, focusing control method, imaging device and mobile terminal

Country Status (1)

Country Link
CN (1) CN107040702B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107105141B (en) * 2017-04-28 2019-06-28 Oppo广东移动通信有限公司 Imaging sensor, image processing method, imaging device and mobile terminal
CN109449174A (en) * 2018-11-08 2019-03-08 德淮半导体有限公司 Phase focus image sensor and forming method thereof
CN109346494A (en) * 2018-11-20 2019-02-15 德淮半导体有限公司 Phase focus image sensor and forming method thereof
WO2020243936A1 (en) * 2019-06-05 2020-12-10 深圳市汇顶科技股份有限公司 Under-screen biometric feature identification apparatus and electronic device
CN112866548B (en) * 2019-11-12 2022-06-14 Oppo广东移动通信有限公司 Phase difference acquisition method and device and electronic equipment

Citations (3)

Publication number Priority date Publication date Assignee Title
CN104241310A (en) * 2014-09-23 2014-12-24 上海集成电路研发中心有限公司 CMOS image pixel array with two-lenticule layer
CN105611124A (en) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 Image sensor, imaging method, imaging device and electronic device
CN105611122A (en) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 Image sensor, output method, phase focusing method, imaging apparatus and terminal

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP5537905B2 (en) * 2009-11-10 2014-07-02 富士フイルム株式会社 Imaging device and imaging apparatus
CN105578066B (en) * 2015-12-18 2018-01-19 广东欧珀移动通信有限公司 Imaging method, imaging device and electronic installation



Similar Documents

Publication Publication Date Title
JP6878604B2 (en) Imaging method and electronic device
CN107040724B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
CN107040702B (en) Image sensor, focusing control method, imaging device and mobile terminal
CN106982329B (en) Image sensor, focusing control method, imaging device and mobile terminal
US10764522B2 (en) Image sensor, output method, phase focusing method, imaging device, and terminal
CN106982328B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
CN103238097B (en) Imaging device and method for detecting state of focusing
CN107105140B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
WO2018196703A1 (en) Image sensor, focusing control method, imaging device and mobile terminal
CN107146797B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
JP4322921B2 (en) Camera module and electronic device including the same
CN107124536B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
CN116506745A (en) Image forming apparatus and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong Province

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong Province

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., Ltd.

GR01 Patent grant