CN110784633A - Image sensor, camera module, terminal and imaging method - Google Patents


Info

Publication number
CN110784633A
CN110784633A (application CN201911101741.8A)
Authority
CN
China
Prior art keywords
pixel
polarization
layer
image sensor
polarizing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911101741.8A
Other languages
Chinese (zh)
Other versions
CN110784633B (en)
Inventor
杨鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911101741.8A priority Critical patent/CN110784633B/en
Publication of CN110784633A publication Critical patent/CN110784633A/en
Application granted granted Critical
Publication of CN110784633B publication Critical patent/CN110784633B/en
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50: Constructional details
    • H04N 23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70: SSIS architectures; Circuits associated therewith
    • H04N 25/79: Arrangements of circuitry being divided between different or multiple substrates, chips or circuit boards, e.g. stacked image sensors

Abstract

The application discloses an image sensor, a camera module, a terminal, and an imaging method. The image sensor includes a pixel layer, a polarizing layer, and a filter layer. The pixel layer includes a plurality of pixel groups, each of which includes nine pixel units. The polarizing layer includes a plurality of polarizing element groups, each of which includes nine polarizing elements whose polarization-axis angles can differ from one another; the polarizing elements correspond one-to-one with the pixel units. The filter layer includes a plurality of optical filters, and the optical filters, the polarizing element groups, and the pixel groups correspond one-to-one; each pixel unit receives polarized light that has passed through the corresponding optical filter and polarizing element. The image sensor, camera module, terminal, and imaging method can acquire polarized light of nine different polarization angles, together with its color information, from a photographed scene, so that different color polarization images can be generated from the color and polarization-angle information of the polarized light, giving a wide range of usable scenes.

Description

Image sensor, camera module, terminal and imaging method
Technical Field
The present application relates to the field of consumer electronics, and in particular, to an image sensor, a camera module, a terminal, and an imaging method.
Background
Current image sensors cannot distinguish polarized light of different polarization angles in a photographed scene, and therefore cannot generate different color polarization images from polarized light of different polarization angles, for example a color polarization image generated after removing polarized light of a certain polarization angle, or a color polarization image generated only from polarized light of a certain polarization angle. As a result, the range of usable scenes is small.
Disclosure of Invention
The embodiment of the application provides an image sensor, a camera module, a terminal and an imaging method.
The image sensor of the embodiments of the present application comprises a pixel layer, a polarizing layer, and a filter layer. The pixel layer includes a plurality of pixel groups, and each pixel group includes nine pixel units. The polarizing layer includes a plurality of polarizing element groups, each of which includes nine polarizing elements whose polarization-axis angles can differ from one another; the polarizing elements correspond one-to-one with the pixel units. The filter layer includes a plurality of optical filters; the optical filters, the polarizing element groups, and the pixel groups correspond one-to-one, and each pixel unit receives polarized light that has passed through the corresponding optical filter and polarizing element.
The camera module of the embodiments of the present application includes an image sensor and a lens module. The image sensor is disposed on the image side of the lens module. The image sensor includes a pixel layer, a polarizing layer, and a filter layer; the pixel layer includes a plurality of pixel groups, each of which includes nine pixel units. The polarizing layer includes a plurality of polarizing element groups, each of which includes nine polarizing elements whose polarization-axis angles can differ from one another; the polarizing elements correspond one-to-one with the pixel units. The filter layer includes a plurality of optical filters; the optical filters, the polarizing element groups, and the pixel groups correspond one-to-one, and each pixel unit receives polarized light that has passed through the corresponding optical filter and polarizing element.
The terminal of the embodiments of the present application comprises a housing and a camera module. The camera module is mounted on the housing and includes an image sensor and a lens module. The image sensor is disposed on the image side of the lens module. The image sensor includes a pixel layer, a polarizing layer, and a filter layer; the pixel layer includes a plurality of pixel groups, each of which includes nine pixel units. The polarizing layer includes a plurality of polarizing element groups, each of which includes nine polarizing elements whose polarization-axis angles can differ from one another; the polarizing elements correspond one-to-one with the pixel units. The filter layer includes a plurality of optical filters; the optical filters, the polarizing element groups, and the pixel groups correspond one-to-one, and each pixel unit receives polarized light that has passed through the corresponding optical filter and polarizing element.
The imaging method of the embodiments of the present application comprises: obtaining the pixel values of the nine pixel units in each pixel group of an image sensor, where each pixel unit receives light passing through the corresponding polarizing element and optical filter, and the polarization-axis angles of the nine polarizing elements corresponding to the nine pixel units of a pixel group can differ from one another; and generating a color polarization image from the pixel values of the pixel units that receive polarized light of a target polarization angle.
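The selection step above can be sketched in code. This is a minimal illustration, not the patent's implementation: the mosaic layout, the angle order within each 3×3 group, and all names are assumptions; the patent only fixes that each pixel group holds nine pixels, one per polarization-axis angle.

```python
# Hypothetical sketch: each 3x3 block of the raw mosaic holds nine pixel
# values, one per polarization-axis angle (0..160 deg in 20 deg steps, the
# example angles given in the description). Layout order is an assumption.
ANGLES = [0, 20, 40, 60, 80, 100, 120, 140, 160]

def extract_polarization_plane(raw, target_angle):
    """From every 3x3 pixel group of `raw` (a list of rows), pick the single
    pixel whose polarizing element matches target_angle, yielding one value
    per pixel group (a sub-sampled polarization plane)."""
    idx = ANGLES.index(target_angle)
    r, c = divmod(idx, 3)          # position of that angle inside each group
    return [row[c::3] for row in raw[r::3]]

# A 6x6 toy mosaic: two pixel groups per side, four groups in total.
raw = [list(range(k * 6, k * 6 + 6)) for k in range(6)]
plane = extract_polarization_plane(raw, 20)
```

Combining the planes from groups under R, G, and B filters would then give the color polarization image for the target angle.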
In the image sensor, camera module, terminal, and imaging method of the embodiments of the present application, the polarization-axis angles of the nine polarizing elements corresponding to the nine pixel units of a pixel group can differ from one another. The nine pixel units of each pixel group can therefore receive polarized light of different polarization angles that has passed through the corresponding optical filter, acquiring polarized light of nine different polarization angles, together with its color information, from the photographed scene. Different color polarization images can then be generated from the color and polarization-angle information of the polarized light, giving a wide range of usable scenes.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of embodiments of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic plan view of a terminal according to some embodiments of the present application.
Fig. 2 is a schematic plan view of another perspective of a terminal according to some embodiments of the present application.
Fig. 3 is a schematic cross-sectional view of a camera module according to some embodiments of the present disclosure.
FIG. 4 is an exploded schematic view of an image sensor according to some embodiments of the present application.
Fig. 5 is a schematic plan view of a filter set according to some embodiments of the present application.
FIG. 6 is a schematic plan view of a group of polarizing elements according to some embodiments of the present application.
FIG. 7 is a schematic plan view of a set of polarizing elements according to some embodiments of the present application.
FIG. 8 is an exploded schematic view of an image sensor according to some embodiments of the present application.
Fig. 9 is a schematic plan view of a pixel layer according to some embodiments of the present application.
FIG. 10 is a schematic cross-sectional view of an image sensor of certain embodiments of the present application.
FIG. 11 is a schematic diagram of the connection of a pixel read circuit and a pixel cell according to some embodiments of the present application.
FIG. 12 is a schematic flow chart of an imaging method according to some embodiments of the present application.
FIG. 13 is a schematic flow chart of an imaging method according to certain embodiments of the present application.
Fig. 14 is a schematic plan view of the group of polarizing elements in fig. 7 in another state.
FIG. 15 is a schematic flow chart of an imaging method according to certain embodiments of the present application.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout.
In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the present application.
In this application, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the two features are in direct contact, or that they are in indirect contact through an intervening medium. Moreover, a first feature being "on," "over," or "above" a second feature may mean that the first feature is directly or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature. A first feature being "under," "below," or "beneath" a second feature may mean that the first feature is directly or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
Referring to fig. 1 and 2, a terminal 1000 according to an embodiment of the present disclosure includes a housing 200 and a camera module 100. The camera module 100 is mounted on the housing 200.
Referring to fig. 3, the camera module 100 includes an image sensor 10 and a lens module 20. The image sensor 10 is disposed on the image side of the lens module 20.
Referring to fig. 4 and 5, an image sensor 10 according to an embodiment of the present disclosure includes a pixel layer 11, a polarizing layer 12, and a filter layer 15. The pixel layer 11 includes a plurality of pixel groups 111, and each pixel group 111 includes nine pixel units 112. The polarizing layer 12 includes a plurality of polarizing element groups 121, each polarizing element group 121 includes nine polarizing elements 122, angles of polarization axes of the nine polarizing elements 122 can be different from each other, the polarizing elements 122 and the pixel units 112 are in one-to-one correspondence, and each pixel unit 112 is configured to receive polarized light passing through the corresponding polarizing element 122. The filter layer 15 includes a plurality of filters 151, and the filters 151, the polarizing element group 121, and the pixel group 111 correspond one to one.
In the image sensor 10 of the present application, the polarization-axis angles of the nine polarizing elements 122 corresponding to the nine pixel units 112 of a pixel group 111 may differ from one another, so the nine pixel units 112 of each pixel group 111 can receive polarized light of different polarization angles that has passed through the corresponding optical filter 151, acquiring polarized light of nine different polarization angles, together with its color information, from the scene to be photographed. Different color polarization images can thereby be generated from the color and polarization-angle information of the polarized light, and the range of usable scenes is wide. In addition, polarization-angle information reflecting the attributes of an object itself can be obtained from the polarization images acquired by the image sensor 10, which receives polarized light of a larger number of polarization angles, and can be used to identify attributes such as the material type and surface shape of the object.
Referring to fig. 1 and fig. 2 again, more specifically, the terminal 1000 may be a mobile phone, a tablet computer, a monitor, a notebook computer, a teller machine, a gate, a smart watch, a head-up display device, a game machine, and the like. In the embodiment of the present application, the terminal 1000 is a mobile phone as an example, and it is understood that the specific form of the terminal 1000 is not limited to the mobile phone.
The housing 200 may be used to mount the camera module 100; in other words, the housing 200 serves as a mounting carrier for the camera module 100. The terminal 1000 includes a front side 901 and a back side 902. The camera module 100 can be disposed on the front side 901 as a front camera or on the back side 902 as a rear camera; in this embodiment, the camera module 100 is disposed on the back side 902 as a rear camera. The housing 200 can also be used to mount other functional modules of the terminal 1000, such as the power supply device and the communication device, so that the housing 200 protects the functional modules against dust, drops, and water.
Referring to fig. 4, more specifically, the image sensor 10 includes a pixel layer 11, a filter layer 15, a polarization layer 12, and a microlens layer 13. The polarizing layer 12 is located between the pixel layer 11 and the microlens layer 13. The filter layer 15 is located between the pixel layer 11 and the polarizing layer 12. In other embodiments, the filter layer 15 is located between the polarizer layer 12 and the microlens layer 13, thereby providing more flexibility in the location of the filter layer 15.
The pixel layer 11 includes a plurality of pixel groups 111, and each pixel group 111 includes nine pixel units 112. Each pixel unit 112 receives incident light and performs photoelectric conversion, converting the optical signal into an electrical signal. The nine pixel units 112 may be arranged in a matrix of three rows and three columns, making the pixel arrangement compact and the entire pixel layer 11 easy to tile. Of course, the nine pixel units 112 may also be arranged in other shapes, such as a trapezoid, and are not limited to the three-row, three-column matrix arrangement above.
The filter layer 15 includes a plurality of optical filters 151, and the optical filters 151, the polarization element groups 121, and the pixel groups 111 are in one-to-one correspondence, that is, the pixel groups 111 are configured to receive polarized light passing through the corresponding polarization element groups 121 and the optical filters 151.
Referring to fig. 5, the filter layer 15 further includes a plurality of filter sets 152, each filter set 152 is formed by four filters 151 arranged in a matrix of two rows and two columns, and the four filters 151 in each filter set 152 are respectively used for transmitting red light R, green light G, blue light B, and green light G. Each pixel group 111 receives light filtered by the corresponding filter 151, for example, light received by the pixel group 111 corresponding to the filter 151 transmitting red light is red light, light received by the pixel group 111 corresponding to the filter 151 transmitting green light is green light, and light received by the pixel group 111 corresponding to the filter 151 transmitting blue light is blue light. In this way, the light received by the pixel unit 112 in the pixel group 111 is associated with color information, and can be used for generating a color image. In other embodiments, four filters 151 in each filter set 152 are used to transmit red light, green light, blue light, and white light, respectively, so as to improve the photographing effect in a dark light environment.
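The group-level color mosaic described above can be sketched as follows. The 2×2 ordering is an assumption (the text fixes only that each filter set transmits red, green, blue, and green), and all names are illustrative.

```python
# Hedged sketch: filter sets tile the sensor as 2x2 arrangements of filters,
# and each filter covers one whole 3x3 pixel group. The R/G/G/B placement
# below is an assumed layout, not taken from the patent.
BAYER = [["R", "G"],
         ["G", "B"]]

def group_color(group_row, group_col):
    """Color of the filter over pixel group (group_row, group_col)."""
    return BAYER[group_row % 2][group_col % 2]
```

Under this layout, every pixel group inherits one color, and its nine pixel units all sample that color at nine polarization angles.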
The filter 151 includes nine sub-filters 1511; each sub-filter 1511 corresponds to one pixel unit 112, and each pixel unit 112 receives the polarized light passing through the corresponding sub-filter 1511 and polarizing element 122. The nine sub-filters 1511 all transmit the same light, for example all red, green, blue, or white light. The nine sub-filters 1511 may be integrally formed into one filter 151, or formed separately and combined, for example by gluing, into one filter 151. In the present disclosure, the nine sub-filters 1511 may be integrally formed into one filter 151, so that they fit together tightly.
Referring to fig. 4 and 6, the polarizing layer 12 includes a plurality of polarizing element groups 121, and each polarizing element group 121 is disposed corresponding to one pixel group 111. Each polarizing element group 121 includes nine polarizing elements 122, and the polarization-axis angles of the nine polarizing elements 122 can differ from one another; for example, the angles may be 0°, 20°, 40°, 60°, 80°, 100°, 120°, 140°, and 160°, or 5°, 25°, 45°, 65°, 85°, 105°, 125°, 145°, and 165°, and so on. The polarizing elements 122 correspond one-to-one with the pixel units 112, and each pixel unit 112 receives polarized light passing through the corresponding polarizing element 122. The nine polarizing elements 122 are likewise arranged in a matrix of three rows and three columns, making it easy to tile the entire polarizing layer 12 and strengthening the correspondence between the nine pixel units 112 and the polarized light passing through the corresponding polarizing elements 122. Likewise, the nine polarizing elements 122 may be arranged in other shapes, such as a trapezoid, and are not limited to the three-row, three-column matrix arrangement above.
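The two example angle sets quoted above are both nine angles evenly spanning [0°, 180°) at a 20° step, differing only by a starting offset. A small helper makes this explicit (the function name and the generalization to arbitrary offsets are assumptions; the specific angle values are from the text):

```python
def polarization_axes(offset_deg=0.0, n=9):
    """Nine evenly spaced polarization-axis angles covering [0, 180) degrees:
    offset 0 gives 0..160 and offset 5 gives 5..165, the two example sets
    named in the description."""
    step = 180.0 / n               # 20 degrees for n = 9
    return [offset_deg + k * step for k in range(n)]
```

Evenly spaced axes mean every 20° sector of possible linear polarization orientations is sampled once per pixel group.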
The polarizing element 122 includes a plurality of polarization wire grids 1221 arranged in parallel at intervals; each polarization wire grid 1221 is a strip-shaped body, for example a cuboid. By setting the angle at which the polarization wire grids 1221 are arranged, the angle of the polarization axis of the corresponding polarizing element 122 is determined: light parallel to the arrangement angle of the polarization wire grids 1221 passes through, while light perpendicular to that angle is completely reflected, thereby polarizing the light. In the example shown in fig. 6, the polarization wire grids 1221 of the nine polarizing elements 122 are arranged at angles of 0°, 20°, 40°, 60°, 80°, 100°, 120°, 140°, and 160°, respectively; that is, the polarization-axis angles of the nine polarizing elements 122 are 0°, 20°, 40°, 60°, 80°, 100°, 120°, 140°, and 160°, respectively. The spacing of the polarization wire grids 1221 can be determined from the wavelength of the light to be received; it is only necessary that the spacing between adjacent polarization wire grids 1221 be smaller than the wavelength of the light to be received. For example, when red light is polarized, the spacing between adjacent polarization wire grids 1221 must be smaller than the minimum wavelength of red light (e.g., 640 nanometers (nm)); when green light is polarized, smaller than the minimum wavelength of green light (e.g., 500 nm); and when blue light is polarized, smaller than the minimum wavelength of blue light (e.g., 435 nm). This ensures that the polarizing element 122 can effectively polarize the corresponding received light.
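The pitch rule in the paragraph above reduces to a simple comparison. The following sketch encodes the three example wavelength bounds quoted in the text; the function name and the dictionary are illustrative, not from the patent:

```python
# Minimum wavelengths per color band, taken from the example values in the
# description (nm). The pitch of adjacent wire grids must stay below the
# shortest wavelength the element is meant to polarize.
MIN_WAVELENGTH_NM = {"red": 640, "green": 500, "blue": 435}

def pitch_is_valid(pitch_nm, color):
    """True if a wire-grid pitch of pitch_nm can polarize the given band."""
    return pitch_nm < MIN_WAVELENGTH_NM[color]
```

A pitch that satisfies the blue bound (below 435 nm) satisfies all three bands, which is why a single sub-wavelength pitch can serve the whole sensor.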
The polarization wire grids 1221 may be made of metal, for example at least one of gold, silver, copper, iron, and aluminum. For instance, the material of the polarization wire grid 1221 may be gold, or silver, or copper, or iron, or an alloy of gold and silver, an alloy of gold and copper, an alloy of gold and iron, and so on, which are not all listed here.
Referring to fig. 7, at least one of the polarizing elements 122 includes a liquid crystal cell 1222: for example, one polarizing element 122 may include a liquid crystal cell 1222, or two, or three, …, or N polarizing elements 122 may include liquid crystal cells 1222, where N is a positive integer. The present application takes as an example the case where all polarizing elements 122 of the polarizing layer 12 include liquid crystal cells 1222. The number of liquid crystal cells 1222 in each polarizing element 122 is one or more, and may be set according to the sizes of the polarizing element 122 and of the liquid crystal cells 1222; for example, each polarizing element 122 may contain one, two, or three liquid crystal cells 1222.
The liquid crystal cell 1222 can be deflected by an electric field to change the angle of the polarization axis of the polarizing element 122, so that this angle can be varied within a predetermined range, for example [0°, 180°]. In the initial state, the polarization-axis angles of the nine polarizing elements 122 of each polarizing element group 121 are 0°, 20°, 40°, 60°, 80°, 100°, 120°, 140°, and 160°. If polarized light of more polarization angles is desired, exposure may be performed multiple times in succession (for example, two or three times): after an exposure with the polarization axes in the initial state, polarized light of nine polarization angles is obtained; the polarization-axis angles of the nine polarizing elements 122 of each polarizing element group 121 are then changed (for example, to 5°, 25°, 45°, 65°, 85°, 105°, 125°, 145°, and 165°) and exposure is performed again to obtain polarized light of those nine angles. If still more polarization angles are desired, the polarization-axis angles may be changed again and a further exposure performed. In this way, the image sensor 10 can acquire polarized light of any polarization angle within the predetermined angle range as required.
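The multi-exposure scheme above can be sketched as a capture loop. Everything here is hypothetical scaffolding: `expose` and `set_axes` stand in for unspecified sensor and liquid-crystal control interfaces, and the two offsets are the example values from the text.

```python
# Hypothetical capture loop: expose once per polarizer configuration,
# rotating the liquid-crystal polarization axes between exposures so the
# sensor samples more polarization angles than one exposure allows.
def capture_multi_angle(expose, set_axes, offsets=(0.0, 5.0)):
    """expose() returns one raw frame; set_axes(angles) retargets the nine
    liquid-crystal polarizing elements of every group. Both callables are
    assumed interfaces. Returns a dict mapping the nine axis angles of each
    exposure to its frame."""
    frames = {}
    for off in offsets:
        angles = [off + 20.0 * k for k in range(9)]  # nine axes per exposure
        set_axes(angles)
        frames[tuple(angles)] = expose()
    return frames
```

With the default offsets this yields eighteen distinct polarization angles across two exposures, matching the 0°..160° and 5°..165° example sets.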
Referring to fig. 4 and 8, the microlens layer 13 is disposed on the side of the polarizing layer 12 opposite the pixel layer 11. The microlens layer 13 includes a plurality of microlenses 131. Each microlens 131 may be a convex lens that converges light emitted from the lens module 20 toward it, so that more light reaches the polarizing layer 12. Each microlens 131 corresponds either to one pixel unit 112, with the pixel units 112 corresponding one-to-one with the polarizing elements 122, so that the microlenses 131, polarizing elements 122, and pixel units 112 correspond one-to-one; or to one pixel group 111, with the pixel groups 111 corresponding one-to-one with the polarizing element groups 121, so that the microlenses 131, polarizing element groups 121, and pixel groups 111 correspond one-to-one.
Referring to fig. 4 and 9, when each microlens 131 corresponds to one pixel unit 112, the microlenses 131 corresponding to the pixel units 112 near the center of the pixel layer 11 (for example, the 2 pixel units 112 nearest the center) are aligned with those pixel units 112, while pixel units 112 away from the center and their corresponding microlenses 131 are offset from each other. Specifically, taking each pixel unit 112 as a square with side length L, the center of the pixel layer 11 is the intersection of the diagonals of the rectangular pixel layer 11. Pixel units 112 lying on circles centered at the center of the pixel layer 11 with a radius larger than R1 (the radius of the smallest circle, centered at the center of the pixel layer 11, that covers the two pixel units 112 nearest the center) and smaller than R2 (half the diagonal length of the pixel layer 11) are all at non-central positions. The offset between the pixel units 112 and the corresponding microlenses 131 distributed on the same circle is the same, and the offset is positively correlated with the radius of the circle. Here, the offset refers to the distance between the center of the orthographic projection of the microlens 131 on the pixel layer 11 and the center of the corresponding pixel unit 112.
Referring to fig. 8 and 9, when each microlens 131 corresponds to one pixel group 111, the microlenses 131 corresponding to the pixel groups 111 near the center of the pixel layer 11 (for example, the 2 pixel groups 111 nearest the center) are aligned with those pixel groups 111, while pixel groups 111 away from the center and their corresponding microlenses 131 are offset from each other; the microlens 131 can be made large enough to converge all the light directed at the pixel group 111. Specifically, taking each pixel group 111 as a square with side length M, the center of the pixel layer 11 is the intersection of the diagonals of the rectangular pixel layer 11. Pixel groups 111 lying on circles centered at the center of the pixel layer 11 with a radius larger than R1 (the radius of the smallest circle, centered at the center of the pixel layer 11, that covers the two pixel groups 111 nearest the center) and smaller than R2 (half the diagonal length of the pixel layer 11) are all at non-central positions. The offset between the pixel groups 111 and the corresponding microlenses 131 distributed on the same circle is the same, and the offset is positively correlated with the radius of the circle. Here, the offset refers to the distance between the center of the orthographic projection of the microlens 131 on the pixel layer 11 and the center of the corresponding pixel group 111. That the offset of the microlens 131 from the corresponding pixel unit 112 (or pixel group 111) is positively correlated with the radius of the circle means that, as the radius of the circle on which the microlens 131 lies increases, the offset between the microlens 131 and the corresponding pixel unit 112 (or pixel group 111) also gradually increases.
If the microlenses 131 were all exactly aligned with the pixel units 112 (or pixel groups 111) without any offset, part of the light converged by the microlenses 131 at edge positions of the pixel layer 11 could not be received by the corresponding pixel units 112 (or pixel groups 111), wasting light. By setting a reasonable offset between each non-central microlens 131 and its corresponding pixel unit 112 (or pixel group 111), the image sensor 10 of the embodiments of the present disclosure improves the converging effect of the microlenses 131, so that light collected by a microlens 131 is received by the corresponding pixel unit 112 (or pixel group 111) after being converged.
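The offset rule can be illustrated with a toy model. The patent states only a positive correlation between offset and radius, so the linear profile, the parameters, and the function name below are all assumptions:

```python
import math

# Hedged sketch: offset grows with distance from the sensor center. The
# linear ramp is an assumed model; the patent fixes only that the offset is
# zero near the center and positively correlated with the circle radius.
def microlens_offset(px, py, cx, cy, r2, max_offset):
    """Shift magnitude for the microlens over the pixel (or pixel group)
    centered at (px, py); (cx, cy) is the pixel-layer center, r2 half the
    sensor diagonal, and max_offset the assumed shift at the very edge."""
    r = math.hypot(px - cx, py - cy)
    return max_offset * min(r / r2, 1.0)
```

Pixels on the same circle get the same offset, and the offset grows monotonically toward the edge, matching the correlation described in the text.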
Referring to fig. 4 and 10, the image sensor 10 further includes a metal wiring layer 14, wherein the metal wiring layer 14 is connected to the pixel layer 11 and is located on a side of the pixel layer 11 opposite to the polarization layer 12. The metal wiring layer 14 is used to read the pixel value generated when each pixel unit 112 is exposed.
The metal wiring layer 14 includes a plurality of pixel reading circuits 141, and each pixel reading circuit 141 is connected to one pixel unit 112 for reading a pixel value of the pixel unit 112.
Referring to fig. 11, the pixel reading circuit 141 includes a floating diffusion region 1411 and a transfer transistor 1412. The floating diffusion region 1411 is used for storing charge, and the transfer transistor 1412 connects the photodiode of the pixel unit 112 to the floating diffusion region 1411 so as to transfer the charge generated by the photodiode to the floating diffusion region 1411. The pixel reading circuit 141 determines the pixel value of the corresponding pixel unit 112 according to the charge in the floating diffusion region 1411.
Specifically, after the pixel unit 112 receives light passing through the corresponding polarizing element 122 and sub-filter 1511, the photodiode produces a photoelectric effect: the electron-hole pairs generated by the incident light are separated by the electric field of the photodiode, with electrons moving to the n region and holes moving to the p region. At the end of the exposure, the RST signal is activated, and the pixel reading circuit 141 resets the readout region to a high level; after the reset is completed, the reset level is read. The charge in the n region is then transferred to the floating diffusion region 1411 by the transfer transistor 1412, and the level of the floating diffusion region 1411 is read out as the signal level. Finally, the pixel value of the pixel unit 112 is calculated from the signal level and the reset level (for example, the difference between the signal level and the reset level is taken as the level corresponding to the pixel value, and the pixel value is then calculated from that level). The polarization angle information of the corresponding polarizing element 122 and the color information of the sub-filter 1511 are associated with the pixel value of each pixel unit 112, so that a color polarization image can be output from the pixel values of the pixel units 112, the polarization angle information of their corresponding polarizing elements 122, and the color information of the sub-filters 1511.
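The readout sequence described above (reset, read the reset level, transfer the charge, read the signal level, then difference the two) is ordinary correlated double sampling. A minimal sketch, with a hypothetical linear `gain` converting the level difference into a digital number (the patent does not specify how the level maps to a pixel value):

```python
def pixel_value(signal_level, reset_level, gain=1.0):
    # Correlated double sampling: differencing the signal level against
    # the reset level cancels the per-pixel reset offset.
    return (signal_level - reset_level) * gain

def pixel_record(signal_level, reset_level, angle_deg, color):
    # Associate the pixel value with the polarization angle of the
    # corresponding polarizing element 122 and the color of the
    # sub-filter 1511, so a color polarization image can be assembled.
    return {"value": pixel_value(signal_level, reset_level),
            "angle": angle_deg, "color": color}
```

For instance, a signal level of 0.8 against a reset level of 0.2 yields a level difference of 0.6 before gain.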
Referring to fig. 3 again, the lens module 20 includes a substrate 21, a lens barrel 22 and a lens group 23. The lens barrel 22 is disposed on the substrate 21.
The substrate 21 may be a flexible circuit board, a rigid circuit board, or a rigid-flex circuit board. In the embodiments of the present application, the substrate 21 is a flexible circuit board, which is convenient to install. The substrate 21 includes a bearing surface 211.
The lens barrel 22 may be mounted on the bearing surface 211 by screwing, snap-fitting, gluing, or the like. The image sensor 10 is disposed on the bearing surface 211 and within the lens barrel 22 so as to correspond to the lens group 23.
The lens group 23 may be disposed in the lens barrel 22 by snap-fitting, gluing, or the like. The lens group 23 may include one or more lenses 231. For example, the lens group 23 may include a single lens 231, which may be a convex lens or a concave lens. As another example, the lens group 23 may include a plurality of lenses 231 (two or more), all of which may be convex lenses or all concave lenses, or some of which may be convex lenses and some concave lenses.
In other embodiments, at least one surface of at least one lens 231 in the lens group 23 is a free-form surface. It will be appreciated that, because an aspheric lens is rotationally symmetric about a single axis of symmetry, its corresponding imaging region is generally circular. A lens group 23 that includes a free-form surface, by contrast, is non-rotationally symmetric and can have multiple axes of symmetry, so its imaging region is not limited to a circle and can be designed as a rectangle, a rhombus, or even an irregular shape (such as a "D" shape). In this embodiment, the imaging region corresponding to the lens group 23 may be rectangular and may just cover the entire pixel layer 11.
Referring to fig. 4 and 12, an imaging method according to an embodiment of the present application includes:
011: acquiring pixel values of nine pixel units 112 in each pixel group 111 of the image sensor 10, wherein the pixel units 112 are used for receiving light rays passing through corresponding polarizing elements 122 and optical filters 151, and angles of polarization axes of the nine polarizing elements 122 respectively corresponding to the nine pixel units 112 in the pixel group 111 can be different from each other; and
012: a color polarized image is generated from the pixel values of the pixel cells 112 that receive polarized light at the target polarization angle.
Specifically, after the exposure is completed, the pixel reading circuits 141 may acquire the pixel values of the nine pixel units 112 in each pixel group 111 of the image sensor 10. The angles of the polarization axes of the nine polarizing elements 122 respectively corresponding to the nine pixel units 112 in each pixel group 111 can be different from each other; that is, the nine pixel units 112 in each pixel group 111 respectively receive polarized light at different polarization angles, and their pixel values are associated with different polarization angle information. The pixel value of each pixel unit 112 is also associated with the color information of the corresponding sub-filter 1511.
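As a sketch of step 011, suppose the nine polarization angles are laid out in a repeating 3×3 pattern across the pixel layer. The specific layout below is an assumption chosen for illustration (the patent does not fix one); gathering the one pixel per pixel group that matches the target polarization angle then reduces to a strided slice:

```python
import numpy as np

# Hypothetical 3x3 polarization-angle mosaic repeated over the sensor.
ANGLES = np.array([[0,   20,  40],
                   [60,  80,  100],
                   [120, 140, 160]])

def angle_plane(raw, target_angle):
    """From a raw frame tiled by this 3x3 angle pattern, collect the one
    pixel per pixel group whose polarizing element matches target_angle.
    The result has 1/9 of the sensor's pixels, one per pixel group."""
    pos = np.argwhere(ANGLES == target_angle)
    if pos.size == 0:
        raise ValueError(f"no polarizing element with angle {target_angle}")
    r, c = pos[0]
    return raw[r::3, c::3]
```

On a 6×6 raw frame (2×2 pixel groups), selecting 80° picks the center pixel of each group.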
Referring to fig. 1, the terminal 1000 includes a processor 300. The processor 300 may generate a color polarization image from the pixel values of the pixel units 112 that receive polarized light at the target polarization angle. Specifically, the processor 300 may generate the color polarization image from the pixel value of each such pixel unit 112, the polarization angle information of the polarizing element 122 corresponding to that pixel unit 112, and the color information of the sub-filter 1511.
For example, suppose the angles of the polarization axes of the nine polarizing elements 122 are 0°, 20°, 40°, 60°, 80°, 100°, 120°, 140°, and 160°, and the target polarization angle is 0°. The processor 300 then obtains, from each pixel group 111, the pixel value of the pixel unit 112 that receives polarized light with a polarization angle of 0°, together with the color information associated with that pixel unit 112. Since about half of the light is lost in passing through the polarizing element, the received light amount is about half of the actual light amount; therefore, when obtaining the pixel value of the pixel unit 112, the pixel value can be determined from twice the obtained light amount, making the pixel value more accurate. In addition, when calculating the pixel values of the pixel units 112 in each pixel group 111 by an interpolation algorithm, the missing values can be determined from the pixel values of the pixel units 112 in the adjacent pixel groups 111 that receive polarized light of the same polarization angle. For example, for a pixel unit 112 in one pixel group 111 that receives 0° polarized light through a red sub-filter, the red value of that pixel unit 112 can be read directly, and its green and blue values can then be obtained by interpolation from the pixel values of the pixel units 112 in the eight adjacent pixel groups 111 that also receive 0° polarized light.
Specifically, the green value of that pixel unit 112 can be obtained by averaging the pixel values of the pixel units 112 in the eight adjacent pixel groups 111 that also receive polarized light at 0° and whose sub-filters are green, and the blue value can be obtained in the same way from the adjacent same-angle pixel units 112 whose sub-filters are blue. In this way, each pixel unit 112 that receives 0° polarized light obtains a red value, a green value, and a blue value even though the associated color information of same-angle pixel units 112 differs between pixel groups 111, so the pixel value of each pixel unit 112 can be calculated more accurately.
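The averaging scheme described above can be sketched as follows. The `groups` data structure, a 2-D grid where each pixel group maps a polarization angle to that pixel's (sub-filter color, value) pair, is a hypothetical representation chosen for illustration:

```python
def interpolate_color(groups, gi, gj, angle, missing_color):
    """Estimate a missing color channel for the pixel in group (gi, gj)
    that receives polarized light at `angle`: average the values of the
    same-angle pixels, in the eight adjacent pixel groups, whose
    sub-filter has `missing_color`. Returns None if no neighbor helps."""
    vals = []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue  # skip the group itself
            ni, nj = gi + di, gj + dj
            if 0 <= ni < len(groups) and 0 <= nj < len(groups[0]):
                color, value = groups[ni][nj][angle]
                if color == missing_color:
                    vals.append(value)
    return sum(vals) / len(vals) if vals else None
```

For a 3×3 grid of groups where only two diagonal neighbors carry green at the same angle, the center pixel's green value is their mean.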
The processor 300 then generates a color polarization image from the acquired pixel values of these pixel units 112, the polarization angle information of their corresponding polarizing elements 122 (i.e., a polarization angle of 0°), and the color information of the sub-filters 1511. The target polarization angle may be set according to the user's preference. For example, one exposure may be performed in advance in the current scene, and the processor 300 may generate a color polarization image for each polarization angle (nine color polarization images with different polarization angles, at 0°, 20°, 40°, 60°, 80°, 100°, 120°, 140°, and 160° respectively) and present them to the user on the display screen of the terminal 1000; the user selects a preferred color polarization image, and the polarization angle corresponding to that image is taken as the target polarization angle. Alternatively, the user may select the least preferred color polarization image. For example, when photographing a lake, a user who does not want to see the sky's reflection on the water may select the color polarization image that contains the reflection; the processor 300 then, according to the user's selection, takes the polarization angles of the color polarization images that do not contain the reflection as the target polarization angles. If only one color polarization image (for example, the one corresponding to 100°) contains the reflection, the target polarization angles are 0°, 20°, 40°, 60°, 80°, 120°, 140°, and 160°.
The processor 300 may then re-determine the pixel value of each pixel unit 112 that receives the 100° polarized light from the pixel values of its adjacent pixel units 112, for example, by taking the average of the pixel values of the adjacent pixel units 112 as the new pixel value.
Finally, the processor 300 generates a color polarization image from the newly determined pixel values of the pixel units 112 that receive the 100° polarized light and the pixel values of all the other pixel units 112. This color polarization image not only removes the information corresponding to the 100° polarized light (i.e., the reflection unwanted by the user in the above example), but also has a resolution consistent with that of the entire image sensor 10.
Referring to fig. 4, 7 and 13, in some embodiments, the imaging method further includes:
013: generating one or more color polarization images according to the pixel values of the pixel units 112 receiving the polarized light of the same polarization angle;
014: obtaining a target polarization angle according to the definition of one or more color polarization images; and
015: the liquid crystal cell 1222 is controlled to change the angle of the polarization axis of the polarizing element 122 to a target polarization angle.
Specifically, the target polarization angle may also be set automatically by the processor 300 (shown in fig. 1) according to captured color polarization images. For example, one exposure may be performed in advance in the current scene, and the processor 300 may generate a color polarization image for each polarization angle (for example, nine color polarization images with different polarization angles in total), then compare the definition (sharpness) of the nine color polarization images and select the polarization angle corresponding to the color polarization image with the highest definition as the target polarization angle.
The processor 300 then controls the liquid crystal cells 1222 of the polarizing elements 122 to change the angle of the polarization axis of each polarizing element 122 to the target polarization angle, so that all nine pixel units 112 in each pixel group 111 receive light at the target polarization angle. For example, as shown in fig. 14, if the target polarization angle is 0°, the processor 300 controls the liquid crystal cells 1222 of the polarizing elements 122 so that the polarization axes rotate to 0°. By contrast, when the polarization angles of the nine pixel units 112 in each pixel group 111 differ from one another, a color polarization image generated only from the one pixel unit 112 per pixel group 111 whose polarization angle equals the target polarization angle has 1/9 the resolution of the entire image sensor 10; with all polarization axes set to the target angle, the resolution of the color polarization image equals that of the entire image sensor 10, so a high-resolution color polarization image carrying the polarization angle information of the target polarization angle can be acquired.
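Steps 013 and 014 need a concrete measure of definition; the patent does not specify one, so this sketch uses the common variance-of-Laplacian score as an assumed stand-in, then picks the angle whose image scores highest:

```python
import numpy as np

def sharpness(img):
    # Variance-of-Laplacian focus measure: high for images with strong
    # local contrast, near zero for flat or blurred images. This metric
    # is an assumption; the patent only says "definition".
    lap = (-4.0 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

def pick_target_angle(images_by_angle):
    """Return the polarization angle whose color polarization image has
    the highest definition, as in steps 013-015."""
    return max(images_by_angle, key=lambda a: sharpness(images_by_angle[a]))
```

A flat image scores zero while a checkerboard scores high, so the checkerboard's angle is selected.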
Referring to fig. 4 and 15, in some embodiments, the imaging method further includes:
016: determining pixel cells 112 with pixel values greater than a predetermined pixel value as overexposed pixel cells 112; and
017: the pixel values of the overexposed pixel cells 112 are determined according to the pixel values of the pixel cells 112 adjacent to the overexposed pixel cells 112.
Specifically, when a pixel unit 112 is overexposed, its pixel value no longer accurately represents the light it received; such a pixel is defined as an overexposed pixel unit 112, i.e., a pixel unit 112 whose pixel value is greater than the predetermined pixel value. The predetermined pixel value can be set as required: it can be set smaller, such as 180, or larger, such as 255. For example, a pixel unit 112 whose pixel value reaches 255 is an overexposed pixel unit 112.
The processor 300 (shown in fig. 1) may determine the pixel value of an overexposed pixel unit 112 from the pixel values of the pixel units 112 adjacent to it. Since the polarization angles of the polarizing elements 122 corresponding to adjacent pixel units 112 in this application can differ from one another, when excessive polarized light at one polarization angle overexposes the pixel unit 112 receiving it, the polarized light at the other polarization angles is generally weaker; that is, the adjacent pixel units 112 are generally not overexposed. The pixel value of the overexposed pixel unit 112 can then be calculated from the pixel values of its adjacent pixel units 112, for example, by taking the average of their pixel values as the pixel value of the overexposed pixel unit 112. In this way, the processor 300 can recalculate the pixel values of the overexposed pixel units 112 and prevent them from degrading the image quality.
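A minimal sketch of steps 016 and 017, assuming an 8-neighborhood and a grid of scalar pixel values (the threshold and neighborhood choice follow the example in the text):

```python
def fix_overexposed(pixels, threshold=255):
    """Replace each overexposed pixel (value >= threshold) with the mean
    of its non-overexposed 8-neighbors. Neighbors are usable because
    adjacent pixel units sit behind polarizing elements with different
    polarization axes and are generally not overexposed at the same time."""
    h, w = len(pixels), len(pixels[0])
    out = [row[:] for row in pixels]
    for i in range(h):
        for j in range(w):
            if pixels[i][j] >= threshold:
                neigh = [pixels[ni][nj]
                         for ni in range(max(0, i - 1), min(h, i + 2))
                         for nj in range(max(0, j - 1), min(w, j + 2))
                         if (ni, nj) != (i, j) and pixels[ni][nj] < threshold]
                if neigh:  # leave the pixel alone if every neighbor clipped
                    out[i][j] = sum(neigh) / len(neigh)
    return out
```

For a 3×3 patch whose center clipped at 255, the center is replaced by the mean of its eight neighbors while unclipped pixels are untouched.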
In the description herein, reference to the terms "certain embodiments," "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiments or examples is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means at least two, e.g., two or three, unless specifically limited otherwise.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations of the above embodiments may be made by those of ordinary skill in the art within the scope of the present application, which is defined by the claims and their equivalents.

Claims (12)

1. An image sensor, comprising:
a pixel layer including a plurality of pixel groups, each of the pixel groups including nine pixel units;
the polarization layer comprises a plurality of polarization element groups, each polarization element group comprises nine polarization elements, angles of polarization axes of the nine polarization elements can be different from each other, and the polarization elements correspond to the pixel units one by one; and
a filter layer comprising a plurality of optical filters, the polarizing element groups corresponding to the pixel groups one by one, wherein each pixel unit is configured to receive polarized light passing through the corresponding optical filter and the corresponding polarizing element.
2. The image sensor of claim 1, wherein the filter layer is located between the pixel layer and the polarizing layer; or, the polarizing layer is located between the pixel layer and the filter layer.
3. The image sensor of claim 1, wherein the filter layer comprises a plurality of filter sets, each filter set comprising four optical filters arranged in a matrix of two rows and two columns, the four optical filters of the filter set being respectively configured to transmit red light, green light, blue light, and green light; or, the four optical filters of the filter set being respectively configured to transmit red light, green light, blue light, and white light.
4. The image sensor of claim 1, wherein the angle of the polarization axis of the polarizing element is variable within a predetermined angular range.
5. The image sensor of claim 4, wherein at least one of the polarizing elements comprises a liquid crystal cell for changing an angle of a polarization axis of the polarizing element under an electric field.
6. The image sensor of claim 1, further comprising a microlens layer, the polarizing layer being between the microlens layer and the pixel layer, the microlens layer comprising a plurality of microlenses;
each of the microlenses corresponds to one of the pixel groups; or
Each of the microlenses corresponds to one of the pixel cells.
7. A camera module, comprising:
the image sensor of any one of claims 1 to 6; and
a lens module, the image sensor being disposed on an image side of the lens module.
8. A terminal, comprising:
the camera module of claim 7; and
a housing, the camera module being disposed on the housing.
9. An imaging method, characterized in that the imaging method comprises:
acquiring pixel values of nine pixel units in each pixel group of the image sensor, wherein the pixel units are used for receiving light rays passing through corresponding polarizing elements and optical filters, and angles of polarization axes of the nine polarizing elements respectively corresponding to the nine pixel units in the pixel group can be different from each other; and
a color polarization image is generated from pixel values of the pixel units that receive polarized light at a target polarization angle.
10. The imaging method according to claim 9, wherein the polarizing element includes a liquid crystal cell, the imaging method further comprising:
generating one or more color polarization images according to the pixel values of the pixel units receiving the polarized light with the same polarization angle;
obtaining a target polarization angle according to the definition of one or more color polarization images; and
and controlling the liquid crystal unit to change the angle of the polarization axis of the polarization element to be the target polarization angle.
11. The imaging method according to claim 10, wherein the pixel value of a pixel unit in each pixel group is determined according to the pixel values of the pixel units, in the adjacent pixel groups, that receive polarized light of the same polarization angle.
12. The imaging method of claim 9, further comprising:
determining the pixel units with pixel values larger than a preset pixel value as overexposed pixel units; and
and determining the pixel value of the overexposed pixel unit according to the pixel value of the pixel unit adjacent to the overexposed pixel unit.
CN201911101741.8A 2019-11-12 2019-11-12 Image sensor, camera module, terminal and imaging method Active CN110784633B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911101741.8A CN110784633B (en) 2019-11-12 2019-11-12 Image sensor, camera module, terminal and imaging method

Publications (2)

Publication Number Publication Date
CN110784633A true CN110784633A (en) 2020-02-11
CN110784633B CN110784633B (en) 2021-07-16

Family

ID=69390513

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911101741.8A Active CN110784633B (en) 2019-11-12 2019-11-12 Image sensor, camera module, terminal and imaging method

Country Status (1)

Country Link
CN (1) CN110784633B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111447376A (en) * 2020-05-06 2020-07-24 Oppo广东移动通信有限公司 Image processing method, camera assembly, mobile terminal and computer readable storage medium
WO2021103872A1 (en) * 2019-11-25 2021-06-03 Oppo广东移动通信有限公司 Image sensor, camera apparatus, electronic device, and imaging method
CN113286067A (en) * 2021-05-25 2021-08-20 Oppo广东移动通信有限公司 Image sensor, image pickup apparatus, electronic device, and imaging method
CN115118859A (en) * 2022-06-27 2022-09-27 联想(北京)有限公司 Electronic device and processing method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020080287A1 (en) * 2000-12-22 2002-06-27 Samsung Electro-Mechanics Co., Ltd. Color separating/synthesizing apparatus
CN102647567A (en) * 2012-04-27 2012-08-22 上海中科高等研究院 CMOS (complementary metal oxide semiconductor) image sensor and a pixel structure thereof
CN107251553A (en) * 2015-02-27 2017-10-13 索尼公司 Image processing apparatus, image processing method and image pick-up element

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant