CN106982328B - Dual-core focusing image sensor, focusing control method thereof and imaging device - Google Patents

Dual-core focusing image sensor, focusing control method thereof and imaging device

Info

Publication number
CN106982328B
CN106982328B (application CN201710295959.6A)
Authority
CN
China
Prior art keywords
focusing
photosensitive
dual
core
output value
Prior art date
Legal status
Active
Application number
CN201710295959.6A
Other languages
Chinese (zh)
Other versions
CN106982328A (en)
Inventor
曾元清
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710295959.6A
Publication of CN106982328A
Application granted
Publication of CN106982328B
Legal status: Active (granted)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/67: Focus control based on electronic image sensor signals
    • H04N 23/10: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N 23/50: Constructional details
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof

Abstract

The invention discloses a dual-core focusing image sensor, a focusing control method thereof, and an imaging device. The dual-core focusing image sensor comprises a photosensitive unit array, a filter unit array disposed on the photosensitive unit array, and a micro-lens array located above the filter unit array. The micro-lens array comprises first micro-lenses and second micro-lenses; one first micro-lens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, the focusing photosensitive unit comprises N photosensitive pixels, each photosensitive pixel corresponds to one photodiode, the photodiodes are fan-shaped, and one second micro-lens covers one dual-core focusing photosensitive pixel. The dual-core focusing image sensor of the embodiments of the invention can increase the light flux of the focusing pixels and provides a hardware basis for improving the focusing speed in a low-light environment.

Description

Dual-core focusing image sensor, focusing control method thereof and imaging device
Technical Field
The invention relates to the technical field of imaging devices, and in particular to a dual-core focusing image sensor, a focusing control method thereof, and an imaging device.
Background
Among related focusing technologies, dual-core full-pixel focusing has become the most advanced focusing technology on the market. Compared with contrast focusing, laser focusing and phase focusing, it offers a faster focusing speed and a wider focusing range. In addition, in the dual-core full-pixel focusing technology, the two photodiodes of each pixel are combined into one pixel output during imaging, so focusing performance is ensured without affecting image quality.
However, when the dual-core full-pixel focusing technology is used for focusing, the photodiode of each pixel is divided into two parts, which reduces the amount of light each part receives and makes dual-core focusing difficult in a low-light environment.
Disclosure of Invention
The present invention aims to solve, at least to some extent, one of the above technical problems.
Therefore, a first objective of the present invention is to provide a focusing control method for a dual-core focusing image sensor, which can increase the light flux of the focusing pixels and effectively improve the focusing speed in a low-light environment.
A second objective of the present invention is to provide a dual-core focusing image sensor.
A third objective of the present invention is to provide an imaging device.
A fourth object of the present invention is to provide a mobile terminal.
In order to achieve the above object, an embodiment of the first aspect of the present invention provides a focusing control method for a dual-core focusing image sensor. The dual-core focusing image sensor comprises a photosensitive unit array, a filter unit array disposed on the photosensitive unit array, and a micro-lens array located above the filter unit array, wherein the micro-lens array comprises first micro-lenses and second micro-lenses, one first micro-lens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, the focusing photosensitive unit comprises N photosensitive pixels, each photosensitive pixel corresponds to one photodiode, the photodiodes are fan-shaped, and one second micro-lens covers one dual-core focusing photosensitive pixel. The method comprises the following steps:
controlling the photosensitive unit array to enter a focusing mode;
reading first phase difference information of the focusing photosensitive unit and second phase difference information of the dual-core focusing photosensitive pixels;
and carrying out focusing control according to the first phase difference information and the second phase difference information.
In the focusing control method of the dual-core focusing image sensor according to the embodiment of the invention, one first micro-lens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, each photosensitive pixel in the focusing photosensitive unit corresponds to one fan-shaped photodiode, and one second micro-lens covers one dual-core focusing photosensitive pixel. On this basis, by reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixels and performing focusing control according to the first phase difference information and the second phase difference information, the method can increase the light flux of the focusing pixels and effectively improve the focusing speed in a low-light environment.
In order to achieve the above object, an embodiment of the second aspect of the present invention provides a dual-core focusing image sensor, comprising a photosensitive unit array, a filter unit array disposed on the photosensitive unit array, and a micro-lens array located above the filter unit array, wherein the micro-lens array comprises first micro-lenses and second micro-lenses, one first micro-lens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, the focusing photosensitive unit comprises N photosensitive pixels, each photosensitive pixel corresponds to one photodiode, the photodiodes are fan-shaped, and one second micro-lens covers one dual-core focusing photosensitive pixel.
In the dual-core focusing image sensor according to the embodiment of the invention, a micro-lens array comprising first micro-lenses and second micro-lenses is provided, one first micro-lens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, each photosensitive pixel in the focusing photosensitive unit corresponds to one fan-shaped photodiode, and one second micro-lens covers one dual-core focusing photosensitive pixel, so that the light flux of the focusing pixels can be increased and a hardware basis is provided for improving the focusing speed in a low-light environment.
In order to achieve the above object, an embodiment of the third aspect of the present invention provides an imaging device, comprising the dual-core focusing image sensor provided by the embodiment of the second aspect and a control module. The control module controls the photosensitive unit array to enter a focusing mode, reads the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixels, and performs focusing control according to the first phase difference information and the second phase difference information.
In the imaging device according to the embodiment of the invention, one first micro-lens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, each photosensitive pixel in the focusing photosensitive unit corresponds to one fan-shaped photodiode, and one second micro-lens covers one dual-core focusing photosensitive pixel. By reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixels and performing focusing control according to the first phase difference information and the second phase difference information, the imaging device can increase the light flux of the focusing pixels and effectively improve the focusing speed in a low-light environment.
In order to achieve the above object, a fourth aspect of the present invention further provides a mobile terminal, which includes a housing, a processor, a memory, a circuit board, and a power circuit, wherein the circuit board is disposed inside a space enclosed by the housing, and the processor and the memory are disposed on the circuit board; the power supply circuit is used for supplying power to each circuit or device of the mobile terminal; the memory is used for storing executable program codes; the processor executes a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to execute the focusing control method of the dual-core focusing image sensor proposed in the embodiment of the first aspect.
In the mobile terminal according to the embodiment of the invention, one first micro-lens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, each photosensitive pixel in the focusing photosensitive unit corresponds to one fan-shaped photodiode, and one second micro-lens covers one dual-core focusing photosensitive pixel. By reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixels and performing focusing control according to the first phase difference information and the second phase difference information, the mobile terminal can increase the light flux of the focusing pixels and effectively improve the focusing speed in a low-light environment.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1(a) is a schematic structural diagram of a conventional dual-core focusing image sensor;
FIG. 1(b) is a schematic diagram of the photodiode opening in a conventional dual-core focusing image sensor;
FIG. 2 is a cross-sectional view of a dual-core focusing image sensor according to an embodiment of the present invention;
FIG. 3 is a top view of a dual-core focusing image sensor according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a photodiode opening in a dual-core focusing image sensor according to an embodiment of the present invention;
FIG. 5 is a first microlens array density distribution diagram;
FIG. 6 is a flowchart of a focus control method of a dual-core focus image sensor according to an embodiment of the present invention;
fig. 7 is a schematic diagram illustrating a division effect of 2 × 2 photosensitive pixels in a focusing photosensitive unit according to an embodiment of the invention;
FIG. 8 is a flowchart of a focus control method of a dual-core focus image sensor according to another embodiment of the present invention;
FIG. 9 is a schematic diagram of an interpolation algorithm for obtaining pixel values of a focusing photosensitive unit;
FIG. 10 is a schematic structural diagram of an imaging apparatus according to an embodiment of the invention;
fig. 11 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
A dual-core focusing image sensor, a focusing control method thereof, and an imaging device according to embodiments of the present invention are described below with reference to the accompanying drawings.
The dual-core full-pixel focusing technology is the most advanced focusing technology on the current market. The dual-core focusing sensor structure adopted by this technology is shown in fig. 1(a): each microlens (each circle in fig. 1(a) represents a microlens) corresponds to two photodiodes. During imaging, the values of "1" and "2" are added together to obtain a single pixel value. During focusing, the values of "1" and "2" are read out separately, and the driving amount and driving direction of the lens can be calculated from the phase difference between the two values.
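For illustration only (this sketch is not part of the original disclosure), the following Python code mimics the two readout modes just described: summing the two photodiode outputs of a pixel for imaging, and estimating the shift between the "1" and "2" signals of one pixel row for focusing. The function names, the SAD-based shift search and the `gain` calibration factor are assumptions of this sketch, not details taken from the patent.

```python
import numpy as np

def imaging_value(left, right):
    """Imaging mode: the outputs of the two photodiodes of a pixel are summed."""
    return left + right

def phase_shift(left_row, right_row, max_shift=8):
    """Focusing mode: estimate the relative shift (in pixels) between the '1'
    and '2' signals of one pixel row by minimizing the mean absolute difference."""
    left_row = np.asarray(left_row, dtype=float)
    right_row = np.asarray(right_row, dtype=float)
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        a = left_row[max(0, s):len(left_row) + min(0, s)]
        b = right_row[max(0, -s):len(right_row) + min(0, -s)]
        cost = float(np.abs(a - b).mean())
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

def lens_drive(shift, gain=10.0):
    """Convert the estimated phase shift into a signed drive amount; the sign
    gives the driving direction. `gain` is a hypothetical calibration factor
    relating shift to lens travel."""
    return gain * shift
```

The sign of the estimated shift indicates the driving direction, and its magnitude scales the driving amount.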
It can be understood that, as the total number of pixels increases, the photosensitive areas corresponding to "1" and "2" become smaller, which reduces the amount of light they receive; as a result, the phase information is easily drowned out by noise in a low-light environment, making focusing difficult.
In addition, in the related art, the photodiodes used in a dual-core focusing image sensor are generally square, as shown in fig. 1(b). When square photodiodes are used, the photosensitive area of each pixel cannot be fully utilized, so the amount of light received in a low-light environment is insufficient, which affects the focusing speed in a low-light environment.
Therefore, in order to solve the problem that the conventional dual-core full-pixel focusing technology is difficult to focus in a low-light environment, the invention provides a focusing control method of a dual-core focusing image sensor, which can increase the light flux of focusing pixels and effectively improve the focusing speed in the low-light environment.
The dual-core focusing image sensor required for implementing the focusing control method of the dual-core focusing image sensor provided by the invention is introduced below.
Fig. 2 is a cross-sectional view of a dual core in-focus image sensor according to an embodiment of the present invention, and fig. 3 is a top view of the dual core in-focus image sensor according to an embodiment of the present invention.
As shown in fig. 2 and 3, the dual core in-focus image sensor 100 includes a photosensitive cell array 10, a filter cell array 20, and a microlens array 30.
The filter unit array 20 is disposed on the photosensitive unit array 10, and the microlens array 30 is disposed above the filter unit array 20. The microlens array 30 includes first microlenses 31 and second microlenses 32. One first microlens 31 covers one white filter unit 21, one white filter unit 21 covers one focusing photosensitive unit 11, one second microlens 32 covers one filter unit 22, and one filter unit 22 covers one dual-core focusing photosensitive pixel 12.
In the embodiment of the present invention, the dual-core focusing photosensitive pixels 12 are arranged in a Bayer pattern. With the Bayer structure, the image signal can be processed by conventional algorithms designed for the Bayer structure, and no major change to the hardware structure is required. Each dual-core focusing photosensitive pixel 12 has two photodiodes, a first photodiode 121 and a second photodiode 122, corresponding respectively to the "1" and "2" of each dual-core focusing photosensitive pixel 12 in fig. 3.
In the embodiment of the present invention, the focusing photosensitive unit 11 includes N × N photosensitive pixels 110 (in the dual-core focusing image sensor structure shown in fig. 3, the focusing photosensitive unit 11 includes 2 × 2 photosensitive pixels 110), and one white filter unit 21 covers one focusing photosensitive unit 11, that is, N × N photosensitive pixels 110 correspond to one white filter unit 21.
In summary, in the dual-core focusing image sensor 100 according to the embodiment of the invention, N × N photosensitive pixels 110 form a group and share one first microlens 31, and the group, i.e., the photosensitive pixels 110 in the focusing photosensitive unit 11, corresponds to one white filter unit 21.
In the embodiment of the present invention, each photosensitive pixel in the focusing photosensitive unit 11 corresponds to one photodiode, and the photodiode is fan-shaped, as shown by the gray fan-shaped portions in the white areas in fig. 3 and fig. 4. Compared with the square photodiode shown in fig. 1(b), the fan-shaped photodiode shown in fig. 4 has a larger photosensitive area, so the fan-shaped photodiodes used in the dual-core focusing image sensor of the embodiment of the invention can obtain a larger amount of light, which is more beneficial to focusing in a low-light environment.
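As a rough, back-of-the-envelope illustration of why the fan shape helps (the numbers are my own assumptions, not figures from the patent: the pixel is taken as a unit square, the fan as a quarter circle filling it, and the conventional square photodiode is assumed to leave a 10% margin on each side):

```python
import math

a = 1.0            # photosensitive pixel pitch (normalized)
margin = 0.1 * a   # assumed margin left around a conventional square photodiode

sector_area = math.pi * a ** 2 / 4     # quarter-circle (fan-shaped) photodiode
square_area = (a - 2 * margin) ** 2    # conventional square photodiode

print(f"fan: {sector_area:.3f} a^2, square: {square_area:.3f} a^2, "
      f"ratio: {sector_area / square_area:.2f}x")
# With these assumed numbers the fan-shaped photodiode collects roughly 1.2x more light.
```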
In one embodiment of the present invention, the microlens array 30 includes a horizontal center line and a vertical center line, and the first microlens 31 is plural. The plurality of first microlenses 31 includes a first group of first microlenses 31 disposed at a horizontal center line and a second group of first microlenses 31 disposed at a vertical center line.
In an embodiment of the present invention, the microlens array 30 may further include four edges, and in this case, the plurality of first microlenses 31 further includes a third group of first microlenses 31 disposed on the four edges.
When the microlens array 30 includes a horizontal center line, a vertical center line, and four side lines, the lens density of the first group of first microlenses 31 and the second group of first microlenses 31 is greater than the lens density of the third group of first microlenses 31.
For ease of understanding, the arrangement of the first microlenses 31 in the microlens array 30 is described below with reference to the drawings. Fig. 5 is a density distribution diagram of the first microlenses. As shown in fig. 5, the white filter units 21 covered by the first microlenses 31, i.e., the W units in the figure, are scattered over the entire dual-core focusing image sensor and account for 3% to 5% of the total number of pixels. They are distributed more densely on the horizontal center line and the vertical center line of the microlens array 30 and more sparsely on the four side lines, so that focusing accuracy and speed in the middle area of the picture are given priority, and the focusing speed is effectively increased without affecting image quality.
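The placement strategy can be pictured with a small sketch (an illustration of mine, not the exact layout of fig. 5: the sensor is treated as a grid of filter units, and the scatter and center-line pitches are arbitrary parameters to be tuned toward the 3% to 5% share mentioned above):

```python
import numpy as np

def first_microlens_mask(h, w, base_step=6, dense_step=3):
    """Boolean mask of the filter-unit positions covered by a first microlens
    (i.e. the W units): scattered over the whole array, with a higher density
    along the horizontal and vertical center lines."""
    mask = np.zeros((h, w), dtype=bool)
    mask[::base_step, ::base_step] = True   # sparse scatter over the whole sensor
    cy, cx = h // 2, w // 2
    mask[cy, ::dense_step] = True           # denser along the horizontal center line
    mask[::dense_step, cx] = True           # denser along the vertical center line
    return mask

m = first_microlens_mask(60, 80)
print(f"W units: {m.sum()} of {m.size} filter units ({100 * m.mean():.1f}%)")
# base_step and dense_step are chosen so that the W share stays in the
# 3%-5% range mentioned in the description.
```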
In fig. 3 and 5, W indicates that the filter unit covered by the first microlens 31 in the dual-core in-focus image sensor is the white filter unit 21, and a larger amount of light can be obtained when the white filter unit 21 is used. The filter unit covered by the first microlens 31 may also be a green filter unit, i.e., W in fig. 3 and 5 may be replaced by G, and when the green filter unit is used, more information is available in the imaging process. It should be understood that the embodiment of the present invention only uses the white filter unit as an example for illustration, and is not to be taken as a limitation of the present invention.
Based on the structure of the dual-core focusing image sensor in fig. 2-5, the following describes a focusing control method of the dual-core focusing image sensor according to an embodiment of the present invention. Fig. 6 is a flowchart of a focus control method of a dual-core focus image sensor according to an embodiment of the present invention, as shown in fig. 6, the method includes the steps of:
and S61, controlling the photosensitive unit array to enter a focusing mode.
When the camera is used for shooting, if the displayed picture is insufficient in definition, the photosensitive unit array can be controlled to enter a focusing mode so as to improve the definition of the picture through focusing.
S62, the first phase difference information of the focus photosensitive unit and the second phase difference information of the dual-core focus photosensitive pixels are read.
In an embodiment of the present invention, after entering the focusing mode, the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixel may be further read.
Optionally, in an embodiment of the present invention, reading the first phase difference information of the focusing photosensitive unit may include: reading output values of a part of photosensitive pixels in the focusing photosensitive unit as first output values; reading the output value of the other part of photosensitive pixels in the focusing photosensitive unit as a second output value; and acquiring first phase difference information according to the first output value and the second output value.
Taking a focusing photosensitive unit including 2 × 2 photosensitive pixels as an example, the focusing photosensitive unit can be divided in different ways. As shown in fig. 7, the three diagrams in fig. 7 show the unit divided into left and right parts, upper and lower parts, and two diagonal parts, respectively, which are described below:
in example one, the focus photosensitive unit is divided from the left and right sides.
In this example, 2 × 2 photosensitive pixels in the focusing photosensitive unit are divided into two left and right portions, and a part of the photosensitive pixels in the focusing photosensitive unit may be two photosensitive pixels on the left side of the 2 × 2 photosensitive pixels, that is, the output values of two "1" on the left side in the focusing photosensitive unit are used as the first output values, and the output values of two "2" on the right side in the focusing photosensitive unit are used as the second output values.
Example two, the focus photosensitive units are divided from the upper and lower sides.
In this example, 2 × 2 photosensitive pixels in the focusing photosensitive unit are divided into two upper and lower portions, a portion of the photosensitive pixels in the focusing photosensitive unit may be two photosensitive pixels on the upper side of the 2 × 2 photosensitive pixels, that is, the output values of two "1" on the upper side in the focusing photosensitive unit are taken as the first output values, and the output values of two "2" on the lower side in the focusing photosensitive unit are taken as the second output values.
Example three, the focus photosensitive unit is divided from the diagonal side.
In this example, 2 × 2 photosensitive pixels in the focusing photosensitive unit are divided into two parts according to two diagonal lines, that is, the photosensitive pixel at the upper left corner and the photosensitive pixel at the lower right corner are taken as one part of the two parts, that is, the output values of two "1" at the upper left corner and the lower right corner in the focusing photosensitive unit are taken as the first output values, and the output values of two "2" at the lower left corner and the upper right corner are taken as the second output values.
In an embodiment of the present invention, after the first output value and the second output value are read, the first phase difference information may be acquired according to the first output value and the second output value.
For example, taking the output values of the photosensitive pixels on the left and right sides of 2 × 2 photosensitive pixels of the focusing photosensitive unit as the first output value and the second output value, respectively, the sum of the output values of two "1" on the left side in the focusing photosensitive unit may be calculated as the first phase information, the sum of the output values of two "2" on the right side in the focusing photosensitive unit may be calculated as the second phase information, and finally, the difference between the first phase information and the second phase information may be calculated as the first phase difference information.
In the embodiment of the invention, if the output values of the photosensitive pixels on the left and right sides of the 2 × 2 photosensitive pixels in the focusing photosensitive unit are used as the first output value and the second output value respectively, first phase difference information in the left-right direction can be detected; if the output values of the photosensitive pixels on the upper and lower sides are used as the first output value and the second output value respectively, first phase difference information in the up-down direction can be detected; and if the output values of the photosensitive pixels on the two diagonals are used as the first output value and the second output value respectively, first phase difference information in the diagonal direction can be detected.
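The three splitting schemes can be written compactly as follows (a sketch under the assumption that the 2 × 2 focusing photosensitive unit is given as a 2 × 2 array `u` of output values; the pairing follows the "1"/"2" grouping described above):

```python
import numpy as np

def first_phase_differences(u):
    """u: 2x2 array of output values of the focusing photosensitive unit.
    Returns the first phase difference detected in the left/right,
    up/down and diagonal directions."""
    u = np.asarray(u, dtype=float)
    horizontal = (u[0, 0] + u[1, 0]) - (u[0, 1] + u[1, 1])   # left "1"s minus right "2"s
    vertical   = (u[0, 0] + u[0, 1]) - (u[1, 0] + u[1, 1])   # upper "1"s minus lower "2"s
    diagonal   = (u[0, 0] + u[1, 1]) - (u[0, 1] + u[1, 0])   # one diagonal minus the other
    return float(horizontal), float(vertical), float(diagonal)

print(first_phase_differences([[10, 8], [9, 7]]))  # -> (4.0, 2.0, 0.0)
```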
Optionally, in an embodiment of the present invention, reading the second phase difference information of the dual-core focusing photosensitive pixels may include: reading an output value of the first photodiode as a third output value; reading an output value of the second photodiode as a fourth output value; and acquiring second phase difference information according to the third output value and the fourth output value.
Still taking fig. 3 as an example, in fig. 3, the second phase difference information of all the dual-core focus-sensitive pixels is calculated in the same manner, and only the second phase difference information at Gr in fig. 3 is taken as an example for description. First, the output value of "1" at Gr is read as the third output value, and then the output value of "2" at Gr is read as the fourth output value, and the second phase difference information is obtained from the third output value and the fourth output value, for example, the difference between the third output value and the fourth output value may be calculated as the second phase difference information.
And S63, performing focusing control according to the first phase difference information and the second phase difference information.
In the embodiment of the invention, after the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixels are read, focusing control can be performed according to the first phase difference information and the second phase difference information.
In the related dual-core focusing technology, a phase difference is usually calculated according to output values of two photodiodes in a dual-core focusing photosensitive pixel, so as to calculate a driving amount and a driving direction of a lens, thereby realizing focusing. In a low light environment, the focusing speed is slow.
In the embodiment of the invention, since one first micro-lens covers one white filter unit and one white filter unit covers one focusing photosensitive unit, the use of the white filter unit makes it possible to obtain first phase difference information based on a larger amount of light in a low-light environment for the focusing processing, thereby further improving the focusing speed in a low-light environment.
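The patent does not spell out how the two kinds of phase difference information are weighted against each other; the sketch below is one plausible, purely illustrative combination (the brightness threshold, the weights and the calibration slope `k` are assumptions of mine) in which the white-filter focusing units are trusted more in low light:

```python
def focus_drive(first_pd, second_pd, scene_brightness, k=12.0, low_light_level=50):
    """first_pd:  first phase difference from a white-filter focusing unit
    second_pd: second phase difference from the ordinary dual-core focusing pixels
    scene_brightness: e.g. the mean luma of the preview frame (0-255)
    Returns a signed lens drive amount; the sign gives the driving direction."""
    # In low light the white-filter focusing units gather more light, so their
    # phase difference is weighted more heavily (weights are illustrative).
    w = 0.8 if scene_brightness < low_light_level else 0.3
    combined = w * first_pd + (1.0 - w) * second_pd
    return k * combined  # k is a hypothetical calibration slope
```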
In the focusing control method of the dual-core focusing image sensor according to the embodiment of the invention, one first micro-lens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, each photosensitive pixel in the focusing photosensitive unit corresponds to one fan-shaped photodiode, and one second micro-lens covers one dual-core focusing photosensitive pixel. On this basis, by reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixels and performing focusing control according to the first phase difference information and the second phase difference information, the method can increase the light flux of the focusing pixels and effectively improve the focusing speed in a low-light environment.
It should be understood that the purpose of focusing is to obtain a higher definition picture. In practical applications, after the focusing process is completed, a further imaging process is usually included, so that, as shown in fig. 8, on the basis of fig. 6, step S63 is followed by:
and S81, controlling the photosensitive unit array to enter an imaging mode.
In an embodiment of the present invention, after the focus control is completed, the array of photosensitive cells is further controlled to enter an imaging mode.
S82, the array of photosensitive cells is controlled to perform exposure, and the output values of the array of photosensitive cells are read to obtain the pixel values of the array of photosensitive cells to generate an image.
Here, the pixel values of the focusing photosensitive units are obtained through an interpolation restoration algorithm.
In the embodiment of the invention, after the photosensitive cell array enters the imaging mode, the photosensitive cell array is controlled to be exposed, and the output value of the photosensitive cell array is read, so that the pixel value of the photosensitive cell array is obtained to generate an image.
In an embodiment of the present invention, reading the output values of the photosensitive unit array to obtain the pixel values of the photosensitive unit array may include: after the output values of the two photodiodes in a dual-core focusing photosensitive pixel are read, adding the output values of the two photodiodes to obtain the pixel value of the dual-core focusing photosensitive pixel; and obtaining the pixel values of the focusing photosensitive unit by an interpolation restoration algorithm, where the interpolation restoration algorithm may be any one of a nearest-neighbor interpolation algorithm, a bilinear interpolation algorithm and a cubic convolution interpolation algorithm.
For simplicity, a nearest neighbor interpolation algorithm may be used to obtain the pixel value of the focusing photosensitive unit, i.e. the gray value of the input pixel closest to the position to which the focusing photosensitive unit is mapped is selected as the interpolation result, i.e. the pixel value of the focusing photosensitive unit.
FIG. 9 is a schematic diagram of obtaining a pixel value of a focus photosensitive unit by using an interpolation algorithm.
As shown in fig. 9, the white filter unit (the white area in the figure) includes four photosensitive pixels. In order to output an image with good image quality, the output value of each of these photosensitive pixels needs to be interpolated and restored, that is, the RGB values of each photosensitive pixel are obtained by calculation; the average of the neighboring pixels may be taken as the pixel value of the photosensitive pixel. Taking the calculation of the RGB values at the upper-left "1" in the white filter unit as an example, its R, G and B pixel values are denoted R10, G10 and B10, respectively, and each is calculated from the neighboring pixel values of the corresponding color. [The three calculation formulas are given as equation images in the original publication and are not reproduced here.]
It should be noted that the interpolation restoration of the RGB values at the lower-left "1", the upper-right "2" and the lower-right "2" in the white filter unit is similar to that at the upper-left "1": adjacent pixel points are selected for interpolation restoration, and the examples are not repeated here to avoid redundancy.
It should be noted that the above description of the algorithm for obtaining the pixel values of the focusing photosensitive unit is only intended to explain the present invention and should not be taken as limiting the present invention. In actual processing, in order to obtain more accurate pixel values, interpolation restoration may use the pixel values of several nearby pixels rather than only the immediately adjacent ones, where a pixel closer to the restored pixel is given a higher weight and a pixel farther away is given a lower weight; that is, the weight of a pixel value in the interpolation restoration algorithm is inversely proportional to its distance from the restored pixel.
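A minimal sketch of such a distance-weighted restoration is given below (again an illustration rather than the patent's exact formulas: the Bayer layout assumed by `bayer_color`, the search window and the treatment of neighbours are my assumptions, and in practice the other white pixels of the focusing unit would be excluded from the colour averages):

```python
import numpy as np

def bayer_color(r, c):
    """Assumed Bayer layout: even rows alternate G/R, odd rows alternate B/G."""
    if r % 2 == 0:
        return 'G' if c % 2 == 0 else 'R'
    return 'B' if c % 2 == 0 else 'G'

def restore_rgb(raw, r, c, window=2):
    """Restore the R, G and B values at a photosensitive pixel (r, c) of the
    focusing unit by an inverse-distance-weighted average of nearby pixels of
    each color."""
    h, w = raw.shape
    rgb = {}
    for color in ('R', 'G', 'B'):
        num = den = 0.0
        for dr in range(-window, window + 1):
            for dc in range(-window, window + 1):
                rr, cc = r + dr, c + dc
                if (dr or dc) and 0 <= rr < h and 0 <= cc < w and bayer_color(rr, cc) == color:
                    weight = 1.0 / (abs(dr) + abs(dc))   # closer pixels weigh more
                    num += weight * raw[rr, cc]
                    den += weight
        rgb[color] = num / den if den else 0.0
    return rgb

# Example usage with random raw data:
raw = np.random.randint(0, 1024, (8, 8))
print(restore_rgb(raw, 3, 4))
```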
In the embodiment of the invention, after the pixel values of the focusing photosensitive units are restored, the image can be generated according to the pixel values of all the pixel points in the photosensitive unit array.
According to the focusing control method of the dual-core focusing image sensor, after focusing control is finished, the photosensitive unit array is controlled to enter an imaging mode, exposure is carried out on the photosensitive unit array, the output value of the photosensitive unit array is read to obtain the pixel value of the photosensitive unit array, so that an image is generated, and the image quality can be improved.
In order to implement the above embodiments, the present invention further provides a dual-core focused image sensor, fig. 2 is a cross-sectional view of the dual-core focused image sensor according to an embodiment of the present invention, and fig. 3 is a top view of the dual-core focused image sensor according to an embodiment of the present invention.
It should be noted that, the explanation about the dual-core focusing image sensor in the foregoing focusing control method embodiment of the dual-core focusing image sensor is also applicable to the dual-core focusing image sensor in the embodiment of the present invention, and the implementation principle is similar, and details are not described here.
The dual-core focusing image sensor provided by the embodiment of the invention has the advantages that the micro lens array comprising the first micro lens and the second micro lens is arranged, one first micro lens covers one white light filtering unit, one white light filtering unit covers one focusing photosensitive unit, each photosensitive pixel in the focusing photosensitive unit corresponds to one sector photodiode, and one second micro lens covers one dual-core focusing photosensitive pixel, so that the light transmission quantity of the focusing pixels can be increased, and a hardware basis is provided for improving the focusing speed in a low-light environment.
In order to realize the above embodiments, the present invention further provides an imaging apparatus, and fig. 10 is a schematic structural diagram of the imaging apparatus according to an embodiment of the present invention.
As shown in fig. 10, the imaging apparatus 1000 includes the dual-core focusing image sensor 100 of the above embodiment and a control module 1010.
the control module 1010 controls the photosensitive cell array to enter a focusing mode, reads first phase difference information of the focusing photosensitive cells and second phase difference information of the dual-core focusing photosensitive pixels, and performs focusing control according to the first phase difference information and the second phase difference information.
Optionally, in an embodiment of the present invention, the control module 1010 is configured to read output values of a part of photosensitive pixels in the focusing photosensitive unit as a first output value, read output values of another part of photosensitive pixels in the focusing photosensitive unit as a second output value, and obtain the first phase difference information according to the first output value and the second output value.
In an embodiment of the present invention, a dual-core focusing photosensitive pixel in the dual-core focusing image sensor 100 has two photodiodes, a first photodiode and a second photodiode. Therefore, the control module 1010 is further configured to read the output value of the first photodiode as a third output value, read the output value of the second photodiode as a fourth output value, and obtain the second phase difference information according to the third output value and the fourth output value.
It should be understood that the purpose of focusing is to obtain a picture with higher definition. In practical applications, the focusing process is usually followed by an imaging process. Therefore, in an embodiment of the present invention, the control module 1010 is further configured to control the photosensitive unit array to enter an imaging mode, control the photosensitive unit array to perform exposure, and read the output values of the photosensitive unit array to obtain the pixel values of the photosensitive unit array so as to generate an image, wherein the pixel values of the focusing photosensitive units are obtained through an interpolation restoration algorithm.
In the imaging device according to the embodiment of the invention, one first micro-lens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, each photosensitive pixel in the focusing photosensitive unit corresponds to one fan-shaped photodiode, and one second micro-lens covers one dual-core focusing photosensitive pixel. By reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixels and performing focusing control according to the first phase difference information and the second phase difference information, the imaging device can increase the light flux of the focusing pixels and effectively improve the focusing speed in a low-light environment.
In order to implement the above embodiments, the present invention further provides a mobile terminal, and fig. 11 is a schematic structural diagram of the mobile terminal according to an embodiment of the present invention.
As shown in fig. 11, the mobile terminal 1100 includes a housing 1101, a processor 1102, a memory 1103, a circuit board 1104 and a power circuit 1105, wherein the circuit board 1104 is disposed inside a space surrounded by the housing 1101, and the processor 1102 and the memory 1103 are disposed on the circuit board 1104; a power supply circuit 1105 for supplying power to each circuit or device of the mobile terminal; the memory 1103 is used to store executable program code; the processor 1102 runs a program corresponding to the executable program code by reading the executable program code stored in the memory 1103 for executing the focus control method of the dual-core focus image sensor in the above-described embodiment.
In the mobile terminal according to the embodiment of the invention, one first micro-lens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, each photosensitive pixel in the focusing photosensitive unit corresponds to one fan-shaped photodiode, and one second micro-lens covers one dual-core focusing photosensitive pixel. By reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixels and performing focusing control according to the first phase difference information and the second phase difference information, the mobile terminal can increase the light flux of the focusing pixels and effectively improve the focusing speed in a low-light environment.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the terms "one embodiment," "some embodiments," "an example," "a specific example," "some examples," and the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine different embodiments or examples and features of different embodiments or examples described in this specification without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (18)

1. A focusing control method for a dual-core focusing image sensor, wherein the dual-core focusing image sensor comprises a focusing photosensitive unit, the focusing photosensitive unit comprises N photosensitive pixels, the N photosensitive pixels share one first microlens, each photosensitive pixel corresponds to one photodiode, the photodiodes are fan-shaped, one second microlens covers one dual-core focusing photosensitive pixel, the microlens array comprises a horizontal center line, a vertical center line and four side lines, the first microlenses are plural, and the lens density of the first microlenses on the horizontal center line and the vertical center line is greater than the lens density of the first microlenses on the four side lines; the method comprises the following steps:
controlling the photosensitive unit array to enter a focusing mode;
reading first phase difference information of the focusing photosensitive unit and second phase difference information of the dual-core focusing photosensitive pixels;
and carrying out focusing control according to the first phase difference information and the second phase difference information.
2. The method of claim 1, wherein reading the first phase difference information of the focus photosensitive unit comprises:
reading output values of a part of photosensitive pixels in the focusing photosensitive unit and taking the output values as first output values;
reading the output value of the other part of photosensitive pixels in the focusing photosensitive unit as a second output value;
and acquiring the first phase difference information according to the first output value and the second output value.
3. The method of claim 1, wherein the dual-core focusing photosensitive pixel has two photodiodes, a first photodiode and a second photodiode, and reading the second phase difference information of the dual-core focusing photosensitive pixel comprises:
reading an output value of the first photodiode as a third output value;
reading an output value of the second photodiode as a fourth output value;
and acquiring the second phase difference information according to the third output value and the fourth output value.
4. The method of claim 1, wherein the dual-core focusing photosensitive pixels are arranged in a Bayer array.
5. The method of claim 1, wherein the plurality of first microlenses comprises:
a first set of first microlenses disposed at the horizontal centerline; and
a second set of first microlenses disposed at the vertical centerline.
6. The method of claim 5, wherein the plurality of first microlenses further comprises:
and a third group of first microlenses arranged on the four edge lines.
7. The method of claim 6, wherein the first set of first microlenses and the second set of first microlenses have a lens density greater than a lens density of the third set of first microlenses.
8. The method of claim 1, wherein the method further comprises:
controlling the photosensitive unit array to enter an imaging mode;
and controlling the photosensitive unit array to perform exposure, and reading an output value of the photosensitive unit array to obtain a pixel value of the photosensitive unit array so as to generate an image, wherein the pixel value of the focusing photosensitive unit is obtained by an interpolation restoration algorithm.
9. A dual-core focusing image sensor, comprising:
an array of photosensitive cells;
the light filtering unit array is arranged on the photosensitive unit array;
a micro lens array positioned above the filter unit array;
the micro-lens array comprises a first micro-lens and a second micro-lens, wherein one first micro-lens covers one white light filtering unit, one white light filtering unit covers one focusing photosensitive unit, the focusing photosensitive unit comprises N photosensitive pixels, the N photosensitive pixels share one first micro-lens, each photosensitive pixel corresponds to one photodiode, the photodiodes are fan-shaped, one second micro-lens covers one dual-core focusing photosensitive pixel, the micro-lens array comprises a horizontal center line, a vertical center line and four side lines, the number of the first micro-lenses is multiple, and the lens density of the first micro-lens on the horizontal center line and the vertical center line is greater than that of the first micro-lens on the four side lines.
10. The dual-core focusing image sensor of claim 9, wherein the dual-core focusing photosensitive pixels are arranged in a Bayer array.
11. The dual-core focusing image sensor of claim 9, wherein the plurality of first microlenses comprises:
a first set of first microlenses disposed at the horizontal centerline; and
a second set of first microlenses disposed at the vertical centerline.
12. The dual-core focusing image sensor of claim 11, wherein the plurality of first microlenses further comprises:
and a third group of first microlenses arranged on the four edge lines.
13. The dual-core focusing image sensor of claim 12, wherein the lens density of the first group of first microlenses and the second group of first microlenses is greater than the lens density of the third group of first microlenses.
14. An image forming apparatus, comprising:
the dual-core focusing image sensor of any one of claims 9-13; and
a control module, wherein the control module controls the photosensitive unit array to enter a focusing mode, reads first phase difference information of the focusing photosensitive unit and second phase difference information of the dual-core focusing photosensitive pixels, and performs focusing control according to the first phase difference information and the second phase difference information.
15. The imaging apparatus of claim 14, wherein the control module is specifically configured to:
reading output values of a part of photosensitive pixels in the focusing photosensitive unit and taking the output values as first output values;
reading the output value of the other part of photosensitive pixels in the focusing photosensitive unit as a second output value;
and acquiring the first phase difference information according to the first output value and the second output value.
16. The imaging apparatus of claim 14, wherein the dual-core focus-sensitive pixel has two photodiodes, a first photodiode and a second photodiode, the control module is specifically configured to:
reading an output value of the first photodiode as a third output value;
reading an output value of the second photodiode as a fourth output value;
and acquiring the second phase difference information according to the third output value and the fourth output value.
17. The imaging apparatus of claim 14, wherein the control module is further to:
controlling the photosensitive unit array to enter an imaging mode;
and controlling the photosensitive unit array to perform exposure, and reading an output value of the photosensitive unit array to obtain a pixel value of the photosensitive unit array so as to generate an image, wherein the pixel value of the focusing photosensitive unit is obtained by an interpolation restoration algorithm.
18. A mobile terminal comprises a shell, a processor, a memory, a circuit board and a power circuit, wherein the circuit board is arranged in a space enclosed by the shell, and the processor and the memory are arranged on the circuit board; the power supply circuit is used for supplying power to each circuit or device of the mobile terminal; the memory is used for storing executable program codes; the processor runs a program corresponding to an executable program code stored in the memory by reading the executable program code for performing the focus control method of the dual core focus image sensor according to any one of claims 1 to 8.
CN201710295959.6A 2017-04-28 2017-04-28 Dual-core focusing image sensor, focusing control method thereof and imaging device Active CN106982328B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710295959.6A CN106982328B (en) 2017-04-28 2017-04-28 Dual-core focusing image sensor, focusing control method thereof and imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710295959.6A CN106982328B (en) 2017-04-28 2017-04-28 Dual-core focusing image sensor, focusing control method thereof and imaging device

Publications (2)

Publication Number Publication Date
CN106982328A CN106982328A (en) 2017-07-25
CN106982328B true CN106982328B (en) 2020-01-10

Family

ID=59342331

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710295959.6A Active CN106982328B (en) 2017-04-28 2017-04-28 Dual-core focusing image sensor, focusing control method thereof and imaging device

Country Status (1)

Country Link
CN (1) CN106982328B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10567636B2 (en) * 2017-08-07 2020-02-18 Qualcomm Incorporated Resolution enhancement using sensor with plural photodiodes per microlens
CN108965665B (en) 2018-07-19 2020-01-31 维沃移动通信有限公司 image sensor and mobile terminal
CN108900751A (en) * 2018-07-19 2018-11-27 维沃移动通信有限公司 A kind of imaging sensor, mobile terminal and image pickup method
CN108600712B (en) * 2018-07-19 2020-03-31 维沃移动通信有限公司 Image sensor, mobile terminal and image shooting method
CN108900772A (en) * 2018-07-19 2018-11-27 维沃移动通信有限公司 A kind of mobile terminal and image capturing method
CN110290328B (en) * 2019-07-04 2021-11-09 Oppo广东移动通信有限公司 Focusing method, device, terminal and computer storage medium
EP4060986A4 (en) * 2019-11-20 2022-11-30 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image sensor, control method, camera assembly, and mobile terminal
WO2022000485A1 (en) * 2020-07-03 2022-01-06 深圳市汇顶科技股份有限公司 Photoelectric conversion unit, image sensor, and focusing method
CN113992856A (en) * 2021-11-30 2022-01-28 维沃移动通信有限公司 Image sensor, camera module and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105611124A (en) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 Image sensor, imaging method, imaging device and electronic device
CN105609516A (en) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 Image sensor and output method, phase focusing method, imaging apparatus and terminal
CN106358026A (en) * 2015-07-15 2017-01-25 三星电子株式会社 Image sensor including auto-focusing pixel and image processing system including the same

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5674350B2 (en) * 2009-07-10 2015-02-25 株式会社島津製作所 Solid-state image sensor
US10044959B2 (en) * 2015-09-24 2018-08-07 Qualcomm Incorporated Mask-less phase detection autofocus

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106358026A (en) * 2015-07-15 2017-01-25 三星电子株式会社 Image sensor including auto-focusing pixel and image processing system including the same
CN105611124A (en) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 Image sensor, imaging method, imaging device and electronic device
CN105609516A (en) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 Image sensor and output method, phase focusing method, imaging apparatus and terminal

Also Published As

Publication number Publication date
CN106982328A (en) 2017-07-25

Similar Documents

Publication Publication Date Title
CN107040724B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
CN106982328B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
CN107105140B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
CN107146797B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
EP3396942B1 (en) Image sensor, imaging method and electronic device
US11082605B2 (en) Method of photography processing for camera module, terminal, using same and storage medium implementing same
CN107040702B (en) Image sensor, focusing control method, imaging device and mobile terminal
KR101304342B1 (en) Image processing method and system
WO2010144124A1 (en) Interpolation for four-channel color filter array
CN107124536B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
WO2018196703A1 (en) Image sensor, focusing control method, imaging device and mobile terminal
WO2018137773A1 (en) Method and device for blind correction of lateral chromatic aberration in color images
CN116506745A (en) Image forming apparatus and method
CN117751576A (en) Demosaicing-free pixel array, image sensor, electronic device and operation method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant after: OPPO Guangdong Mobile Communications Co., Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant before: Guangdong OPPO Mobile Telecommunications Corp., Ltd.

GR01 Patent grant