WO2021103872A1 - Image sensor, camera device, electronic device, and imaging method


Info

Publication number
WO2021103872A1
Authority
WO
WIPO (PCT)
Prior art keywords: polarization, pixel, filter, layer, units
Prior art date
Application number
PCT/CN2020/123226
Other languages
English (en)
Chinese (zh)
Inventor
杨鑫
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2021103872A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS] for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA] characterised by the spectral characteristics of the filter elements
    • H04N25/135 Arrangement of colour filter arrays [CFA] based on four or more different wavelength filter elements

Definitions

  • This application relates to the field of consumer electronics technology, and more specifically, to an image sensor, a camera device, an electronic device, and an imaging method.
  • The embodiments of the present application provide an image sensor, a camera device, an electronic device, and an imaging method.
  • The image sensor of the embodiment of the present application includes a pixel layer, a polarization layer, and a filter layer.
  • The pixel layer includes a plurality of pixel units, and each pixel unit includes four pixels.
  • The polarization layer includes a plurality of polarization units; each polarization unit includes four polarization elements, and the angles of the polarization axes of the four polarization elements can be different from each other.
  • The filter layer includes a plurality of filter units, and each filter unit includes four filter regions.
  • The four filter regions of each filter unit allow the same color of light to pass through.
  • The filter regions, the polarization elements, and the pixels are in one-to-one correspondence, and each pixel is used to receive the polarized light passing through the corresponding filter region and the corresponding polarization element.
  • The camera device of the embodiment of the present application includes an image sensor and a lens module; the image sensor is arranged on the image side of the lens module.
  • The image sensor includes a pixel layer, a polarization layer, and a filter layer. The pixel layer includes a plurality of pixel units, and each pixel unit includes four pixels.
  • The polarization layer includes a plurality of polarization units; each polarization unit includes four polarization elements, and the angles of the polarization axes of the four polarization elements can be different from each other.
  • The filter layer includes a plurality of filter units, and each filter unit includes four filter regions. The four filter regions of each filter unit allow the same color of light to pass through. The filter regions, the polarization elements, and the pixels are in one-to-one correspondence, and each pixel is used to receive the polarized light passing through the corresponding filter region and the corresponding polarization element.
  • The electronic device of the embodiment of the present application includes a housing and a camera device; the camera device is mounted on the housing. The camera device includes an image sensor and a lens module; the image sensor is arranged on the image side of the lens module. The image sensor includes a pixel layer, a polarization layer, and a filter layer. The pixel layer includes a plurality of pixel units, and each pixel unit includes four pixels. The polarization layer includes a plurality of polarization units; each polarization unit includes four polarization elements, and the angles of the polarization axes of the four polarization elements can be different from each other. The filter layer includes a plurality of filter units, and each filter unit includes four filter regions.
  • The four filter regions of each filter unit allow the same color of light to pass through.
  • The filter regions, the polarization elements, and the pixels are in one-to-one correspondence, and each pixel is used to receive the polarized light passing through the corresponding filter region and the corresponding polarization element.
  • The imaging method of the embodiment of the present application is applied to an image sensor that includes a pixel layer, a polarization layer, and a filter layer.
  • The pixel layer includes a plurality of pixel units, and each pixel unit includes four pixels.
  • The polarization layer includes a plurality of polarization units; each polarization unit includes four polarization elements, and the angles of the polarization axes of the four polarization elements can be different from each other.
  • The filter layer includes a plurality of filter units, and each filter unit includes four filter regions. The four filter regions of each filter unit allow the same color of light to pass through, and the filter regions, the polarization elements, and the pixels are in one-to-one correspondence; each pixel is used to receive the polarized light passing through the corresponding filter region and the corresponding polarization element.
  • The imaging method includes: acquiring the pixel values of the four pixels of each pixel unit; and generating a color polarization image based on the pixel values.
  • Because the angles of the polarization axes of the four polarization elements corresponding to the four pixels of a pixel unit can be different from each other, the four pixels of each pixel unit can receive polarized light with different polarization angles after it passes through the corresponding filter regions. The sensor can thus obtain, for the shooting scene, polarized light at four different polarization angles together with its color information, and can generate different color polarization images from the polarized light carrying both color information and polarization angle information, so the image sensor can be used in a wide range of scenes.
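The acquisition step above can be sketched in code. This is purely an illustration: it assumes the raw readout is a 2D mosaic in which every 2x2 pixel unit shares one filter color and its four pixels sit behind polarizers at 0, 45, 90, and 135 degrees (top-left, top-right, bottom-left, bottom-right); the patent does not fix this particular layout.

```python
# A sketch of the acquisition step, assuming (purely for illustration) that the
# raw readout is a 2D mosaic in which every 2x2 pixel unit shares one filter
# color and its four pixels sit behind polarizers at 0, 45, 90 and 135 degrees
# (top-left, top-right, bottom-left, bottom-right; the patent fixes no layout).
def split_polarization_channels(raw):
    """Split a raw mosaic (list of rows, even dimensions) into the four
    per-angle sub-images (0, 45, 90, 135 degrees)."""
    def subsample(row_off, col_off):
        return [row[col_off::2] for row in raw[row_off::2]]
    return subsample(0, 0), subsample(0, 1), subsample(1, 0), subsample(1, 1)

# Example: a 4x4 mosaic (two pixel units per side).
raw = [
    [0, 1, 2, 3],
    [4, 5, 6, 7],
    [8, 9, 10, 11],
    [12, 13, 14, 15],
]
i0, i45, i90, i135 = split_polarization_channels(raw)
# i0 collects the top-left pixel of every 2x2 unit: [[0, 2], [8, 10]]
```

Each of the four sub-images then carries the color of its filter unit, which is what allows a color polarization image to be assembled per angle.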
  • Fig. 1 is a schematic plan view of an electronic device according to some embodiments of the present application.
  • FIG. 2 is a schematic plan view of another view of the electronic device according to some embodiments of the present application.
  • FIG. 3 is a schematic cross-sectional view of an imaging device according to some embodiments of the present application.
  • FIG. 4 is an exploded schematic diagram of an image sensor according to some embodiments of the present application.
  • FIGS. 5 to 8 are schematic plan views of filter unit groups according to some embodiments of the present application.
  • FIGS. 9 and 10 are schematic plan views of polarization units according to some embodiments of the present application.
  • FIG. 11 is an exploded schematic diagram of an image sensor according to some embodiments of the present application.
  • FIGS. 12 and 13 are schematic plan views of pixel layers in some embodiments of the present application.
  • FIG. 14 is a schematic cross-sectional view of an image sensor according to some embodiments of the present application.
  • FIG. 15 is a schematic diagram of the connection between a pixel reading circuit and a pixel in some embodiments of the present application.
  • FIGS. 16 and 17 are schematic flowcharts of imaging methods according to some embodiments of the present application.
  • FIG. 18 is a schematic plan view of the polarization unit in FIG. 10 when it is in another state.
  • FIG. 19 is a schematic flowchart of an imaging method according to some embodiments of the present application.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "multiple" means two or more, unless otherwise specifically defined.
  • Unless otherwise clearly specified and limited, the term "connection" should be understood in a broad sense.
  • It can be a fixed connection, a detachable connection, or an integral connection; it can be a mechanical connection, an electrical connection, or mutual communication; it can be a direct connection or an indirect connection through an intermediary; and it can be the internal communication of two components or an interaction relationship between two components.
  • For those of ordinary skill in the art, the specific meaning of "connection" should be understood according to the specific circumstances.
  • The electronic device 1000 includes a housing 200 and a camera device 100.
  • The camera device 100 is mounted on the housing 200.
  • The camera device 100 of the embodiment of the present application includes an image sensor 10 and a lens module 20.
  • The image sensor 10 is arranged on the image side of the lens module 20.
  • the image sensor 10 includes a pixel layer 11, a polarizing layer 12 and a filter layer 15.
  • the pixel layer 11 includes a plurality of pixel units 111, and each pixel unit 111 includes four pixels 112.
  • the polarization layer 12 includes a plurality of polarization units 121, and each polarization unit 121 includes four polarization elements 122, and the angles of the polarization axes of the four polarization elements 122 can be different from each other.
  • the filter layer 15 includes a plurality of filter units 151, and each filter unit 151 includes four filter regions 1511. The four filter regions 1511 of each filter unit 151 allow the same color of light to pass through.
  • One pixel unit 111 corresponds to one polarization unit 121 and one filter unit 151.
  • Each filter region 1511 in a filter unit 151 corresponds to one polarization element 122 in the corresponding polarization unit 121 and one pixel 112 in the corresponding pixel unit 111.
  • Each pixel 112 is used to receive the polarized light passing through the corresponding filter region 1511 and the corresponding polarization element 122.
  • Since the angles of the polarization axes of the four polarization elements 122 respectively corresponding to the four pixels 112 of a pixel unit 111 can be different from each other, the four pixels 112 of each pixel unit 111 can receive polarized light with different polarization angles through the corresponding filter regions 1511, and can obtain polarized light with different polarization angles in the shooting scene together with the color information of that polarized light.
  • Different color polarization images can therefore be generated from the polarized light carrying both color information and polarization angle information, so the image sensor can be used in a wide range of scenes.
  • In addition, the embodiments of the present application can obtain polarization angle information about the properties of the object itself, which can be used to identify attributes such as the material type and surface shape of the object.
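As an illustration of how the polarization angle information could feed such attribute identification, the four intensities measured behind the 0/45/90/135 degree polarizers allow a standard linear-Stokes estimate. These are textbook optics formulas, not processing prescribed by the patent.

```python
import math

# Standard linear-Stokes estimate from intensities measured behind polarizers
# at 0, 45, 90 and 135 degrees. These are textbook optics formulas, offered as
# an illustration of how the polarization angle information could be used;
# the patent itself does not prescribe this processing.
def stokes(i0, i45, i90, i135):
    s0 = (i0 + i45 + i90 + i135) / 2.0  # total intensity
    s1 = i0 - i90                       # 0 vs 90 degree balance
    s2 = i45 - i135                     # 45 vs 135 degree balance
    return s0, s1, s2

def dolp(s0, s1, s2):
    """Degree of linear polarization, 0 (unpolarized) to 1 (fully polarized)."""
    return math.sqrt(s1 * s1 + s2 * s2) / s0 if s0 > 0 else 0.0

def aolp(s1, s2):
    """Angle of linear polarization in radians."""
    return 0.5 * math.atan2(s2, s1)
```

A high degree of linear polarization in specular highlights is one classic cue for smooth or dielectric surfaces, which is the kind of material and surface-shape inference mentioned above.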
  • the electronic device 1000 may be a mobile phone, a tablet computer, a monitor, a notebook computer, a teller machine, a gate, a smart watch, a head-mounted display device, a game console, and the like.
  • the embodiments of this application are described by taking the electronic device 1000 as a mobile phone as an example. It can be understood that the specific form of the electronic device 1000 is not limited to a mobile phone.
  • the housing 200 can be used to install the camera device 100, or in other words, the housing 200 can be used as a mounting carrier of the camera device 100.
  • the electronic device 1000 includes a front 901 and a back 902.
  • The camera device 100 can be set on the front 901 as a front camera, or on the back 902 as a rear camera. In the embodiment of the present application, the camera device 100 is installed on the back 902 as a rear camera.
  • The housing 200 can also be used to install functional modules of the electronic device 1000 such as the camera device 100, the power supply device, and the communication device, so that the housing 200 provides dustproof, anti-drop, and waterproof protection for these functional modules.
  • the image sensor 10 includes a pixel layer 11, a filter layer 15, a polarizing layer 12, and a micro lens layer 13.
  • the polarizing layer 12 is located between the pixel layer 11 and the microlens layer 13.
  • the filter layer 15 is located between the pixel layer 11 and the polarizing layer 12.
  • the image sensor 10 includes a microlens layer 13, a polarizing layer 12, a filter layer 15 and a pixel layer 11 in sequence.
  • the filter layer 15 is located between the polarizing layer 12 and the microlens layer 13 (that is, the polarizing layer 12 is located between the pixel layer 11 and the filter layer 15).
  • the image sensor 10 includes a microlens layer 13, a filter layer 15, a polarizing layer 12, and a pixel layer 11 in sequence.
  • Both of the above position settings of the filter layer 15 enable the image sensor 10 to achieve a good imaging effect, and the position of the filter layer 15 can be flexibly set as required.
  • When the filter layer 15 is located between the pixel layer 11 and the polarizing layer 12, since the polarizing layer 12 uses spaced metal wire grids (that is, the "microstructure units 1221" described in detail below), placing the filter layer 15 between the pixel layer 11 and the polarizing layer 12 does not cause the filter layer 15 to be laid unevenly; that is, the filter layer 15 arranged on the pixel layer 11 is relatively flat, and the light filtering effect is better.
  • the pixel layer 11 includes a plurality of pixel units 111, and each pixel unit 111 includes four pixels 112. Each pixel 112 is used to receive incident light for photoelectric conversion, thereby converting optical signals into electrical signals.
  • The four pixels 112 can be arranged in a matrix of two rows and two columns; this arrangement is compact and makes it convenient for the pixel units 111 to cover the entire pixel layer 11.
  • The four pixels 112 can also be arranged in other shapes, such as a trapezoid, and are not limited to the above two-row, two-column matrix arrangement.
  • the filter layer 15 includes a plurality of filter units 151, and the filter units 151 correspond to the pixel units 111 in a one-to-one manner.
  • the pixel unit 111 is used to receive the light filtered by the corresponding filter unit 151.
  • each filter unit 151 includes four filter regions 1511.
  • Each filter area 1511 corresponds to a pixel 112, and each pixel 112 is used to receive light filtered by the corresponding filter area 1511.
  • The four filter regions 1511 of each filter unit 151 allow the same color of light to pass through. For example, all four filter regions 1511 transmit red light; or all four filter regions 1511 transmit green light; or all four filter regions 1511 transmit blue light.
  • the four filter regions 1511 may be integrally formed to form a filter unit 151.
  • the four filter regions 1511 can also be formed separately and combined together by a connection method such as gluing to form a filter unit 151.
  • the four filter regions 1511 of the embodiment of the present application can be integrally formed to form a filter unit 151, and the four filter regions 1511 are tightly combined.
  • the filter layer 15 further includes a plurality of filter unit groups 152, and each filter unit group 152 consists of four filter units 151 arranged in a matrix of two rows and two columns.
  • the four filter units 151 in each filter unit group 152 are used to transmit red light R, green light G, blue light B, and green light G, respectively.
  • Each pixel unit 111 receives the light filtered by the corresponding filter unit 151.
  • The light received by the pixel unit 111 corresponding to a filter unit 151 that transmits red light R is red light R; the light received by the pixel unit 111 corresponding to a filter unit 151 that transmits green light G is green light G; and the light received by the pixel unit 111 corresponding to a filter unit 151 that transmits blue light B is blue light B.
  • The light received by the pixels 112 in the pixel units 111 is therefore associated with color information, which can be used to generate a color image.
  • The four filter units 151 in each filter unit group 152 can also be used to transmit red light R, green light G, blue light B, and white light W, respectively, so as to improve the shooting effect in a dark environment.
  • The above-mentioned matrix of two rows and two columns formed by the four filter units 151 may have a first diagonal direction D1 and a second diagonal direction D2, the first diagonal direction D1 being different from the second diagonal direction D2.
  • The first diagonal direction D1 and the second diagonal direction D2 may be perpendicular to each other.
  • In some examples, the two filter units 151 distributed along the first diagonal direction D1 of the matrix are both used to transmit green light G, and the two filter units 151 distributed along the second diagonal direction D2 of the matrix are used to transmit red light R and blue light B, respectively.
  • More specifically, in the filter unit group 152, the four filter units 151, from left to right and from top to bottom, are used to transmit red light R, green light G, green light G, and blue light B (as shown in FIG. 5); or, in the filter unit group 152, the four filter units 151, from left to right and from top to bottom, are used to transmit blue light B, green light G, green light G, and red light R (as shown in FIG. 6).
  • In other examples, the two filter units 151 distributed along the first diagonal direction D1 of the matrix are used to transmit red light R and blue light B, respectively, and the two filter units 151 distributed along the second diagonal direction D2 of the matrix are both used to transmit green light G. More specifically, in the filter unit group 152, the four filter units 151, from left to right and from top to bottom, are used to transmit green light G, red light R, blue light B, and green light G (as shown in FIG. 7); or, in the filter unit group 152, the four filter units 151, from left to right and from top to bottom, are used to transmit green light G, blue light B, red light R, and green light G (as shown in FIG. 8).
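The filter unit groups described above correspond to what is commonly called a quad-Bayer layout: each filter unit is a 2x2 block of one color, and each group is a 2x2 matrix of units. A minimal sketch, assuming the FIG. 5 ordering (R, G on the top row of units and G, B on the bottom row), generates the per-region color plan; the function name and list-of-rows representation are illustrative choices.

```python
# Sketch of the FIG. 5 ordering described above: each filter unit is a 2x2
# block of regions of one color, and each filter unit group is a 2x2 matrix
# of units transmitting R, G (top row) and G, B (bottom row).
def quad_bayer_pattern(groups_y, groups_x, unit_colors=("R", "G", "G", "B")):
    tl, tr, bl, br = unit_colors
    group = [
        [tl, tl, tr, tr],
        [tl, tl, tr, tr],
        [bl, bl, br, br],
        [bl, bl, br, br],
    ]
    # Tile the 4x4 group across the requested number of groups.
    return [row * groups_x for row in group] * groups_y

pattern = quad_bayer_pattern(1, 1)
# pattern[0] -> ['R', 'R', 'G', 'G'], pattern[2] -> ['G', 'G', 'B', 'B']
```

Passing `unit_colors=("B", "G", "G", "R")` or the FIG. 7/8 orderings reproduces the other variants described above.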
  • the polarizing layer 12 includes a plurality of polarizing units 121, and the polarizing unit 121, the filter unit 151 and the pixel unit 111 are in one-to-one correspondence.
  • the pixel unit 111 is used to receive the polarized light passing through the corresponding polarization unit 121 and the filter unit 151.
  • Each polarization unit 121 includes four polarization elements 122, and the angles of the polarization axes of the four polarization elements 122 can be different from each other.
  • For example, the angles of the polarization axes of the four polarization elements 122 are 0°, 45°, 90°, and 135°, respectively; or the angles of the polarization axes of the four polarization elements 122 are 30°, 75°, 120°, and 165°, respectively; and so on.
  • the polarizing element 122, the filter area 1511 and the pixel 112 are in one-to-one correspondence, and each pixel 112 is used to receive the polarized light passing through the corresponding filter area 1511 and the corresponding polarizing element 122.
  • The four polarizing elements 122 can also be arranged in a matrix of two rows and two columns, which makes it convenient to cover the entire polarizing layer 12 and keeps the correspondence between the four pixels 112 and the polarized light passing through the corresponding filter regions 1511 and polarizing elements 122. Similarly, the four polarizing elements 122 can also be arranged in other shapes, such as a trapezoid, and are not limited to the above two-row, two-column matrix arrangement.
  • Each polarizing element 122 may include a plurality of microstructure units 1221 arranged in parallel and spaced apart, and the microstructure units 1221 may be rectangular parallelepipeds or the like. By setting the arrangement angle of the microstructure units 1221, the angle of the polarization axis of the corresponding polarizing element 122 can be determined: light parallel to the arrangement angle of the microstructure units 1221 can pass through them, while light perpendicular to the arrangement angle is completely reflected, thereby achieving polarization of the light.
  • In the example shown in FIG. 9, the microstructure units 1221 of the four polarizing elements 122 are arranged at angles of 0°, 45°, 90°, and 135°, respectively, so the angles of the polarization axes of the four polarizing elements 122 are determined to be 0°, 45°, 90°, and 135°, respectively.
  • the distance between the microstructure units 1221 can be determined according to the wavelength of the light to be received, and it is only necessary to ensure that the distance between adjacent microstructure units 1221 is smaller than the wavelength of the light to be received. For example, during visible light imaging, the distance between adjacent microstructure units 1221 needs to be smaller than the minimum wavelength of visible light (such as 380 nanometers) to ensure that the polarizing element 122 can effectively polarize the received light.
  • the material of the microstructure unit 1221 may be metal.
  • the material of the microstructure unit 1221 is at least one of gold, silver, copper, and aluminum.
  • For example, the material of the microstructure unit 1221 is gold, or silver, or copper, or aluminum, or an alloy of gold and silver, or an alloy of gold and copper, or an alloy of gold and aluminum, and so on, which are not listed here one by one.
  • At least one polarizing element 122 includes a liquid crystal cell 1222.
  • For example, one polarization element 122 includes a liquid crystal cell 1222, or two polarization elements 122 include liquid crystal cells 1222, or three polarization elements 122 include liquid crystal cells 1222, ..., or N polarization elements 122 include liquid crystal cells 1222, where N is a positive integer.
  • the embodiment of the present application is described by taking as an example that all the polarization elements 122 of the polarization layer 12 include the liquid crystal cell 1222.
  • The number of liquid crystal cells 1222 in each polarizing element 122 is one or more, and can be set according to the size of the polarizing element 122 and the size of the liquid crystal cells 1222.
  • For example, the number of liquid crystal cells 1222 in each polarizing element 122 is one, two, three, and so on.
  • the liquid crystal cell 1222 can be deflected under the action of an electric field, thereby changing the angle of the polarization axis of the polarization element 122, so that the angle of the polarization axis of the polarization element 122 can be changed within a predetermined angle range.
  • the predetermined angle range may be [0°, 180°].
  • For example, before any change, the angles of the polarization axes of the four polarization elements 122 of each polarization unit 121 are 0°, 45°, 90°, and 135°, respectively, so one exposure obtains polarized light with these four polarization angles. Then, through the liquid crystal cells 1222, the angles of the polarization axes of the four polarization elements 122 of each polarization unit 121 can be changed (for example, to 30°, 75°, 120°, and 165°, respectively), and another exposure obtains polarized light at 30°, 75°, 120°, and 165°. To obtain polarized light with still more polarization angles, the angles of the polarization axes can be changed again and further exposures performed (two, three, and so on).
  • In this way, the image sensor 10 can obtain polarized light with any polarization angle within a predetermined angle range as required, so as to obtain a polarization image with any polarization angle within the predetermined angle range.
  • For example, by changing the angles of the polarization axes of the four polarization elements 122 of each polarization unit 121 through the liquid crystal cells 1222, eight polarization images with different polarization angles can be obtained: before the angles are changed, a 0° polarization image, a 45° polarization image, a 90° polarization image, and a 135° polarization image; after the angles are changed, a 30° polarization image, a 75° polarization image, a 120° polarization image, and a 165° polarization image.
  • the eight polarization images with different polarization angles can be presented to the user on the display screen of the electronic device 1000, and the user can select the polarization image he likes, or use it for other purposes.
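Under an ideal-polarizer model, the readings of the second exposure are related to the first by the standard relation I(theta) = (S0 + S1 cos 2theta + S2 sin 2theta) / 2. This is textbook optics rather than a formula from the patent, but it shows how intensities at 30°, 75°, 120°, and 165° would follow, under ideal conditions, from the Stokes parameters estimated at 0°, 45°, 90°, and 135°.

```python
import math

# Ideal-polarizer model: the intensity behind a linear polarizer at angle
# theta is I(theta) = (S0 + S1*cos(2*theta) + S2*sin(2*theta)) / 2. This is a
# standard optics relation, not a formula taken from the patent.
def intensity_at(theta_deg, s0, s1, s2):
    t = math.radians(2.0 * theta_deg)
    return 0.5 * (s0 + s1 * math.cos(t) + s2 * math.sin(t))

# Light fully polarized at 0 degrees: S0 = 1, S1 = 1, S2 = 0.
# The 0 degree polarizer passes it all; the 90 degree polarizer blocks it.
```

In practice the liquid-crystal re-exposure measures these intensities directly, which avoids relying on the ideal-polarizer assumption.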
  • the micro lens layer 13 is located on the side of the polarizing layer 12 opposite to the pixel layer 11 (that is, the polarizing layer 12 is located between the micro lens layer 13 and the pixel layer 11).
  • the microlens layer 13 includes a plurality of microlenses 131.
  • the micro lens 131 may be a convex lens for condensing the light emitted from the lens module 20 to the micro lens 131 so that more light is irradiated on the polarizing layer 12.
  • Each microlens 131 corresponds to one pixel 112 (as shown in FIG. 4).
  • Alternatively, each microlens 131 corresponds to one pixel unit 111 (as shown in FIG. 11), and the pixel units 111 correspond to the polarization units 121 one-to-one; that is, the microlens 131, the polarization unit 121, and the pixel unit 111 are in one-to-one-to-one correspondence.
  • In the embodiment where each microlens 131 corresponds to one pixel 112, the pixels 112 near the center of the pixel layer 11 are aligned with their corresponding microlenses 131, while the pixels 112 away from the center of the pixel layer 11 are offset from their corresponding microlenses 131.
  • Specifically, taking the center of the pixel layer 11 as the center, a first circle is formed with radius R1, the radius of the smallest circle covering the four pixels 112 near the center of the pixel layer 11; a second circle is formed with radius R2, half of the diagonal length of the pixel layer 11; and a plurality of third circles are formed with any radius greater than R1 and less than R2.
  • The pixels 112 located between the first circle and the second circle are offset from their corresponding microlenses 131; the offsets between the pixels 112 distributed on the same third circle and their corresponding microlenses 131 are the same; and the offset between a pixel 112 and its corresponding microlens 131 is positively correlated with the radius of the third circle on which the pixel 112 lies.
  • The first circle, the second circle, and the third circles are all virtual circles introduced for convenience of explanation and do not really exist.
  • The offset refers to the distance between the center of the orthographic projection of the microlens 131 on the pixel layer 11 and the center of the corresponding pixel 112.
  • Similarly, in the embodiment where each microlens 131 corresponds to one pixel unit 111, the pixel units 111 near the center of the pixel layer 11 are aligned with their corresponding microlenses 131, while the pixel units 111 away from the center of the pixel layer 11 are offset from their corresponding microlenses 131. In this embodiment, the size of the microlens 131 can be set larger so that it can converge all the light toward the pixel unit 111.
  • Taking the center of the pixel layer 11 as the center, a first circle is formed with radius R1, a second circle is formed with radius R2, and a plurality of third circles are formed with any radius greater than R1 and less than R2. The pixel units 111 located between the first circle and the second circle are offset from their corresponding microlenses 131; the offsets between the pixel units 111 distributed on the same third circle and their corresponding microlenses 131 are the same; and the offset between a pixel unit 111 and its corresponding microlens 131 is positively correlated with the radius of the third circle on which the pixel unit 111 lies.
  • Here, the offset refers to the distance between the center of the orthographic projection of the microlens 131 on the pixel layer 11 and the center of the corresponding pixel unit 111.
  • In other words, the offset between a microlens 131 and its corresponding pixel 112 (or pixel unit 111) is positively correlated with the radius of the circle on which the microlens 131 is located: as that radius gradually increases, the offset between the microlens 131 and the corresponding pixel 112 (or pixel unit 111) also gradually increases.
  • Setting a reasonable offset between each non-central microlens 131 and its corresponding pixel 112 (or pixel unit 111) can improve the light-converging effect of the microlens 131, so that the light condensed by the microlens 131 can be received by the corresponding pixel 112 (or pixel unit 111).
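The shift rule can be sketched with a simple model: zero offset inside the first circle, and an offset that grows with the radius of the circle on which the pixel sits. The linear factor k is purely an assumed illustration; the patent only requires a positive correlation, not any particular functional form.

```python
import math

# Minimal sketch of the shift rule: microlenses inside the first (virtual)
# circle stay aligned with their pixels, and beyond it the offset grows with
# the radius of the circle on which the pixel sits. The linear factor k is an
# assumed illustration; the patent only requires a positive correlation.
def microlens_offset(px, py, cx, cy, r1, k=0.02):
    """Distance between the microlens projection center and the pixel center,
    in the same length unit as the inputs."""
    r = math.hypot(px - cx, py - cy)  # radial distance from the layer center
    return 0.0 if r <= r1 else k * r
```

Pixels on the same virtual circle get the same offset, matching the description above.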
  • the image sensor 10 further includes a metal wiring layer 14.
  • The metal wiring layer 14 is connected to the pixel layer 11 and is located on the side of the pixel layer 11 opposite to the polarizing layer 12.
  • the metal wiring layer 14 is used to read the pixel value generated when each pixel 112 is exposed.
  • the metal wiring layer 14 includes a plurality of pixel reading circuits 141, and each pixel reading circuit 141 is connected to a pixel 112 for reading the pixel value of the pixel 112.
  • the pixel reading circuit 141 includes a floating diffusion region 1411 and a transfer transistor 1412.
  • the floating diffusion region 1411 is used to store charges
  • the transfer transistor 1412 is used to connect the photodiode of the pixel 112 and the floating diffusion region 1411 to transfer the charge generated by the photodiode to the floating diffusion region 1411.
  • the pixel reading circuit 141 is used to determine the pixel value of the corresponding pixel 112 according to the charge of the floating diffusion region 1411.
  • after the pixel 112 receives the light passing through the corresponding polarizing element 122 and the corresponding filter region 1511, the photodiode produces a photoelectric effect; the electron-hole pairs generated by the illumination are separated by the photodiode's electric field, with the electrons moving to the n region and the holes moving to the p region. At the end of the exposure, the reset signal (RST) is activated, and the pixel reading circuit 141 resets the readout region to a high level.
  • the transfer transistor 1412 then transfers the charge in the n region to the floating diffusion region 1411, the level of the floating diffusion region 1411 is read as the signal level, and the pixel value of the pixel 112 is finally calculated from the signal level and the reset level (for example, the difference between the signal level and the reset level is taken as the level corresponding to the pixel value of the pixel 112, from which the pixel value can be calculated). The pixel value of each pixel 112 is associated with the polarization angle information of the corresponding polarization element 122 and the color information of the corresponding filter region 1511, so a color polarization image can be output according to the pixel values of the pixels 112, the polarization angle information of the corresponding polarization elements 122, and the color information of the corresponding filter regions 1511.
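The readout step described above (pixel value derived from the difference between the reset level and the signal level) can be sketched as a correlated-double-sampling calculation. The `gain` factor and the sign convention are illustrative assumptions.

```python
def pixel_value_cds(reset_level, signal_level, gain=1.0):
    """Sketch of the readout described above: the pixel value is derived from
    the difference between the reset level (read just after RST) and the
    signal level (read after charge transfer to the floating diffusion 1411).
    The floating-diffusion level falls as charge arrives, so the difference
    is non-negative; `gain` is an illustrative conversion factor."""
    return (reset_level - signal_level) * gain
```

Taking the difference of the two readings cancels the reset noise common to both samples, which is the usual motivation for this two-step readout.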
  • the lens module 20 includes a substrate 21, a lens barrel 22 and a lens group 23.
  • the lens barrel 22 is provided on the substrate 21.
  • the substrate 21 may be a flexible circuit board, a rigid circuit board, or a rigid-flex circuit board. In the embodiment of the present application, the substrate 21 is a flexible circuit board, which is convenient for installation.
  • the substrate 21 includes a bearing surface 211.
  • the lens barrel 22 can be installed on the bearing surface 211 by screwing, snapping, gluing, or the like.
  • the image sensor 10 is disposed on the bearing surface 211 and located in the lens barrel 22 to correspond to the lens group 23.
  • the lens group 23 can be arranged in the lens barrel 22 by means of snapping, gluing, or the like.
  • the lens group 23 may include one or more lenses 231.
  • for example, the lens group 23 may include one lens 231, which may be a convex lens or a concave lens; for another example, the lens group 23 may include a plurality of lenses 231 (two or more), and the plurality of lenses 231 may all be convex lenses, all be concave lenses, or be partly convex and partly concave.
  • At least one surface of at least one lens 231 in the lens group 23 is a free-form surface. It can be understood that, due to the rotationally symmetric design of the aspheric lens, there is only one axis of symmetry, so its corresponding imaging area is generally circular.
  • the lens group 23 including a free-form surface is a non-rotationally symmetrical design and includes multiple symmetry axes.
  • the design of the imaging area is not restricted to a circle, and the imaging area can be designed as a rectangle, a rhombus, or even an irregular shape (such as a "D" shape), etc.
  • the imaging area corresponding to the lens group 23 may be rectangular, and the imaging area may just cover the entire pixel layer 11.
  • the imaging method of the embodiment of the present application can be used for the image sensor 10.
  • the image sensor 10 includes a pixel layer 11, a polarizing layer 12 and a filter layer 15.
  • the pixel layer 11 includes a plurality of pixel units 111, and each pixel unit 111 includes four pixels 112.
  • the polarization layer 12 includes a plurality of polarization units 121, and each polarization unit 121 includes four polarization elements 122, and the angles of the polarization axes of the four polarization elements 122 can be different from each other.
  • the filter layer 15 includes a plurality of filter units 151, and each filter unit 151 includes four filter regions 1511. The four filter regions 1511 of each filter unit 151 allow the same color of light to pass through.
  • the filter area 1511, the polarizing element 122 and the pixel 112 have a one-to-one correspondence, and each pixel 112 is used to receive the polarized light passing through the corresponding filter area 1511 and the corresponding polarizing element 122.
  • Imaging methods include:
  • the pixel reading circuit 141 can obtain the pixel values of the four pixels 112 in each pixel unit 111 of the image sensor 10, and the four pixels 112 in each pixel unit 111 correspond to four polarization elements 122 whose polarization-axis angles can differ from one another; that is, the four pixels 112 in each pixel unit 111 receive polarized light with different polarization angles, so the pixel values of the four pixels 112 in each pixel unit 111 are associated with different polarization angle information, and the pixel value of each pixel 112 is associated with the color information of the corresponding filter region 1511.
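The association above — four pixels per unit, each carrying one polarization angle and the unit's shared filter color — can be sketched as a small data structure. The angle set (0°, 45°, 90°, 135°) and the row-major ordering are assumptions for illustration.

```python
# Hypothetical 2x2 layout sketch: each pixel unit shares one filter colour and
# its four pixels sit under polarizers at four different polarization-axis
# angles. The angle set and ordering are assumptions.
ANGLES = (0, 45, 90, 135)

def unit_records(unit_row, unit_col, color, values):
    """Associate each of the four pixel values of one pixel unit with its
    polarization-angle information and the unit's shared colour information,
    as the description above associates them for image output."""
    assert len(values) == 4
    return [{
        "row": 2 * unit_row + i // 2,   # sensor row of the pixel
        "col": 2 * unit_col + i % 2,    # sensor column of the pixel
        "angle": ANGLES[i],             # polarization-axis angle of its element
        "color": color,                 # colour passed by the unit's filter
        "value": v,
    } for i, v in enumerate(values)]
```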
  • the electronic device 1000 may include a processor 300, and the processor 300 may generate a color polarization image according to the pixel value of the pixel 112 that receives the polarized light of the target polarization angle. Specifically, the processor 300 may generate a color polarization image according to the pixel value of each pixel 112 that receives the polarized light of the target polarization angle, the polarization angle information of the polarization element 122 corresponding to the pixel 112, and the color information of the filter area 1511.
  • the processor 300 obtains the pixel value of the pixel 112 in each pixel unit 111 that receives polarized light with a polarization angle of 0°, and obtains the color information associated with that pixel 112. Since about half of the light is lost when passing through the polarization axis, the amount of light received is half of the actual amount; therefore, when obtaining the pixel value of the pixel 112, the pixel value can be determined according to twice the amount of light received, making the acquired pixel value more accurate.
  • when the pixel value of a pixel 112 in each pixel unit 111 is calculated by the interpolation algorithm, it can be determined from the pixel values of the pixels 112 in adjacent pixel units 111 that receive polarized light of the same polarization angle. For example, when calculating the pixel value of a pixel 112 in one pixel unit 111 that receives 0°-polarized light whose color is red, the missing color components can be interpolated from the eight adjacent pixel units 111: the average of the pixel values of the pixels 112 in those units that also receive 0°-polarized light whose color is green is taken as the green value of the pixel 112, and the average of the pixel values of the pixels 112 in those units that also receive 0°-polarized light whose color is blue is taken as the blue value of the pixel 112. In this way the full pixel value of the pixel 112 receiving polarized light with a polarization angle of 0° is obtained. This ensures that the pixels 112 used for the interpolation calculation receive polarized light of the same polarization angle but carry different associated color information, so that the pixel value of each pixel 112 can be calculated more accurately.
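The interpolation step above can be sketched as follows. The function and the color keys are illustrative names, not from the patent; the rule it implements is the one described — take the pixel's own color directly and average same-angle neighbours for the other two colors.

```python
def demosaic_polarized(own_color, own_value, neighbors):
    """Sketch of the interpolation described above: a pixel that receives,
    say, 0-degree-polarized red light keeps its red value and fills green and
    blue by averaging the same-angle pixels of those colours in the (up to
    eight) neighbouring pixel units. `neighbors` maps colour -> list of
    same-angle neighbour pixel values."""
    rgb = {}
    for color in ("R", "G", "B"):
        if color == own_color:
            rgb[color] = own_value          # own colour: use the measured value
        else:
            vals = neighbors.get(color, [])
            rgb[color] = sum(vals) / len(vals) if vals else 0  # average same-angle neighbours
    return rgb
```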
  • the processor 300 generates a color polarization image according to the obtained pixel value of the pixel 112, the polarization angle information of the polarization element 122 corresponding to the pixel 112 (that is, the polarization angle is 0°), and the color information of the filter area 1511.
  • the target polarization angle can be artificially set according to user preferences.
  • one exposure can be performed in advance in the current scene, and the processor 300 can generate a color polarization image for each polarization angle (four color polarization images with different polarization angles in total: a 0° color polarization image, a 45° color polarization image, a 90° color polarization image, and a 135° color polarization image), which are presented to the user on the display screen of the electronic device 1000. The user selects the preferred image among them, and the polarization angle corresponding to that color polarization image is then used as the target polarization angle. Alternatively, the user selects the least preferred color polarization image: for example, when shooting a lake surface, the user may not want to see the sky reflected by the lake surface,
  • so the user can select the color polarization images that contain reflections, and the processor 300 uses the polarization angle corresponding to the color polarization images that do not contain reflections as the target polarization angle according to the user's selection.
  • there may be more than one color polarization image that does not contain reflections; for example, if only the color polarization image corresponding to 90° includes the reflection image, the target polarization angles are 0°, 45°, and 135°.
  • the processor 300 may re-determine the pixel value of the pixel 112 according to the pixel value of the pixel 112 adjacent to the pixel 112 that receives the 90° polarized light, for example, taking the average of the sum of the pixel values of the pixel 112 adjacent to the pixel 112 The value is taken as the pixel value of the pixel 112.
  • the processor 300 generates a color polarization image based on the re-determined pixel value of the pixel 112 receiving the 90° polarized light and the pixel values of all other pixels 112.
  • the resulting color polarization image not only removes the information corresponding to the 90° polarization angle (that is, the reflected image the user does not want in the above example), but also has a resolution consistent with the resolution of the entire image sensor 10.
  • At least one polarizing element 122 includes a liquid crystal cell 1222. The imaging method further includes:
  • the target polarization angle can also be set automatically by the processor 300 according to captured color polarization images. For example, an exposure can be performed in advance in the current scene, and the processor 300 can generate a color polarization image for each polarization angle (for example, four color polarization images with different polarization angles in total), then compare the sharpness of the four color polarization images and select the polarization angle corresponding to the sharpest color polarization image as the target polarization angle.
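The automatic selection above can be sketched with a simple gradient-energy sharpness score; the metric itself is an assumption for illustration, since the description only says the sharpest of the trial images is chosen.

```python
def sharpness(img):
    """Gradient-energy sharpness score for a 2D list of intensities (an
    assumed metric; the description only requires comparing image clarity)."""
    h, w = len(img), len(img[0])
    horiz = sum(abs(img[y][x + 1] - img[y][x]) for y in range(h) for x in range(w - 1))
    vert = sum(abs(img[y + 1][x] - img[y][x]) for y in range(h - 1) for x in range(w))
    return horiz + vert

def pick_target_angle(images_by_angle):
    """Return the polarization angle whose trial image scores sharpest."""
    return max(images_by_angle, key=lambda a: sharpness(images_by_angle[a]))
```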
  • the processor 300 controls the liquid crystal unit 1222 of the polarization element 122 to change the angle of the polarization axis of the polarization element 122 to the target polarization angle, so that the four pixels 112 in each pixel unit 111 receive light with the target polarization angle. For example, as shown in FIG. 18, if the target polarization angle is 0°, the processor 300 controls the liquid crystal cells 1222 of the polarization element 122 to rotate to 0°.
  • the color polarization image is generated from the pixels 112 whose polarization angle is the target polarization angle in each pixel unit 111, and the resolution of the color polarization image is the same as the overall resolution of the image sensor 10, so that a high-resolution color polarization image carrying the polarization angle information of the target polarization angle can be acquired.
  • the imaging method further includes:
  • when the pixel value of a certain pixel 112 is over-exposed, the information of that pixel 112 is no longer accurate.
  • such a pixel is defined as an over-exposed pixel 112, that is, a pixel 112 whose pixel value is greater than a predetermined pixel value.
  • the predetermined pixel value can be set according to requirements. For example, the predetermined pixel value can be set relatively small, such as 180, so that a pixel 112 with a pixel value greater than 180 is an overexposed pixel 112; or relatively large, such as 255, so that a pixel 112 with a pixel value greater than 255 is an overexposed pixel 112.
  • the processor 300 may determine the pixel value of the overexposed pixel 112 according to the pixel values of the pixels 112 adjacent to it. Since the polarization angles of the polarizing elements 122 corresponding to adjacent pixels 112 can differ from one another in the present application, when there is too much polarized light of one polarization angle in the environment and the pixel 112 receiving that polarized light is overexposed, there is generally less polarized light at the other angles; that is, the pixels 112 adjacent to that pixel 112 are generally not overexposed.
  • the pixel value of the overexposed pixel 112 can therefore be calculated according to the pixel values of the adjacent pixels 112, for example by taking the average of the sum of the pixel values of the pixels 112 adjacent to it as its pixel value.
  • the processor 300 can recalculate the pixel value of the overexposed pixel 112 to prevent the overexposed pixel 112 from affecting the image quality.
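The repair step above can be sketched as follows: flag any pixel above the predetermined value and replace it with the mean of its non-overexposed neighbours. The function name and the 8-neighbour window are illustrative choices.

```python
def fix_overexposed(img, threshold=250):
    """Sketch of the overexposure repair described above: any pixel whose
    value exceeds the predetermined pixel value is replaced by the average of
    its (typically not overexposed, differently polarized) neighbours. The
    threshold is configurable, as the description notes."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if img[y][x] > threshold:
                # collect the non-overexposed 8-neighbours
                vals = [img[ny][nx]
                        for ny in range(max(0, y - 1), min(h, y + 2))
                        for nx in range(max(0, x - 1), min(w, x + 2))
                        if (ny, nx) != (y, x) and img[ny][nx] <= threshold]
                if vals:
                    out[y][x] = sum(vals) / len(vals)
    return out
```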
  • first and second are only used for descriptive purposes, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Therefore, the features defined with “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present application, "a plurality of” means at least two, such as two, three, etc., unless specifically defined otherwise.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Optical Filters (AREA)
  • Studio Devices (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Polarising Elements (AREA)

Abstract

Disclosed are an image sensor, a camera apparatus, an electronic device, and an imaging method. The image sensor comprises a pixel layer, a polarization layer, and a filter layer. The pixel layer comprises a plurality of pixel units, each of which comprises four pixels. The polarization layer comprises a plurality of polarization units, each of which comprises four polarization elements; the angles of the polarization axes of the four polarization elements can differ from one another. The filter layer comprises a plurality of filter units, each of which comprises four filter regions; the four filter regions of each filter unit allow light of the same color to pass through. The filter regions, the polarization elements, and the pixels are in one-to-one correspondence, and each pixel is used to receive the polarized light passing through the corresponding filter region and the corresponding polarization element. Embodiments of the present application can obtain polarized light at four different polarization angles in a photographed scene, together with the color information of the polarized light, so as to generate different color polarization images according to the polarized light with color information and the information of different polarization angles. The range of applicable scenes is wide.
PCT/CN2020/123226 2019-11-25 2020-10-23 Capteur d'image, appareil de caméra, dispositif électronique, et procédé d'imagerie WO2021103872A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911167081.3A CN110995968B (zh) 2019-11-25 2019-11-25 图像传感器、摄像装置、电子设备和成像方法
CN201911167081.3 2019-11-25

Publications (1)

Publication Number Publication Date
WO2021103872A1 true WO2021103872A1 (fr) 2021-06-03

Family

ID=70086585

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/123226 WO2021103872A1 (fr) 2019-11-25 2020-10-23 Capteur d'image, appareil de caméra, dispositif électronique, et procédé d'imagerie

Country Status (2)

Country Link
CN (1) CN110995968B (fr)
WO (1) WO2021103872A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110995968B (zh) * 2019-11-25 2021-08-20 Oppo广东移动通信有限公司 图像传感器、摄像装置、电子设备和成像方法
US11418762B2 (en) * 2020-09-11 2022-08-16 GM Global Technology Operations LLC Imaging system and method
CN113055575B (zh) * 2021-03-30 2023-02-07 Oppo广东移动通信有限公司 图像传感器、摄像头模组及电子设备
CN113286067B (zh) * 2021-05-25 2023-05-26 Oppo广东移动通信有限公司 图像传感器、摄像装置、电子设备及成像方法
CN115118859A (zh) * 2022-06-27 2022-09-27 联想(北京)有限公司 电子设备以及处理方法

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103219350A (zh) * 2012-01-23 2013-07-24 全视科技有限公司 用于3d成像的有具有交替偏光的滤光器的图像传感器
CN103248900A (zh) * 2012-02-03 2013-08-14 联咏科技股份有限公司 感光装置与光学设定方法
US20130321888A1 (en) * 2012-06-01 2013-12-05 Samsung Electronics Co., Ltd. Complex spatial light modulator and holographic 3d image display device having the same
CN107251553A (zh) * 2015-02-27 2017-10-13 索尼公司 图像处理装置、图像处理方法和图像拾取元件
CN110021612A (zh) * 2018-01-10 2019-07-16 三星电子株式会社 图像传感器
CN110677575A (zh) * 2019-11-12 2020-01-10 Oppo广东移动通信有限公司 图像传感器、相机模组和终端
CN110784633A (zh) * 2019-11-12 2020-02-11 Oppo广东移动通信有限公司 图像传感器、相机模组、终端和成像方法
CN110995968A (zh) * 2019-11-25 2020-04-10 Oppo广东移动通信有限公司 图像传感器、摄像装置、电子设备和成像方法

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013172292A (ja) * 2012-02-21 2013-09-02 Sony Corp 撮像装置及び撮像素子アレイ
CN102647567B (zh) * 2012-04-27 2015-03-25 中国科学院上海高等研究院 Cmos图像传感器及其像素结构
CN104777624B (zh) * 2015-03-25 2017-07-04 清华大学深圳研究生院 一种偏振成像设备及方法
CN105049690B (zh) * 2015-07-06 2017-06-09 中国科学院遥感与数字地球研究所 便携式偏振数字相机及其成像方法
CN206515736U (zh) * 2017-01-09 2017-09-22 广东欧珀移动通信有限公司 显示屏和触摸屏组件
CN107592465B (zh) * 2017-10-10 2020-05-26 联想(北京)有限公司 一种成像系统以及成像方法

Also Published As

Publication number Publication date
CN110995968B (zh) 2021-08-20
CN110995968A (zh) 2020-04-10

Similar Documents

Publication Publication Date Title
WO2021103872A1 (fr) Capteur d'image, appareil de caméra, dispositif électronique, et procédé d'imagerie
US10708492B2 (en) Array camera configurations incorporating constituent array cameras and constituent cameras
US10015416B2 (en) Imaging systems with high dynamic range and phase detection pixels
CN208690261U (zh) 图像传感器
US10419664B2 (en) Image sensors with phase detection pixels and a variable aperture
CN104284106B (zh) 固态摄像装置和电子设备
CN110784633B (zh) 图像传感器、相机模组、终端和成像方法
US20180047185A1 (en) Light field metadata
JP2011176715A (ja) 裏面照射型撮像素子および撮像装置
JP6017322B2 (ja) 固体撮像装置
CN113286067B (zh) 图像传感器、摄像装置、电子设备及成像方法
EP3065395A1 (fr) Traitement de données de champ lumineux
CN108174068A (zh) 光场成像模组
CN207691912U (zh) 小体积的光场成像模组
CN110677575B (zh) 图像传感器、相机模组和终端
TWI676393B (zh) 影像感測器
CN113055575B (zh) 图像传感器、摄像头模组及电子设备
KR20140061234A (ko) 이미지 생성 장치 및 이미지 생성 방법
CN110708453A (zh) 图像传感器、相机模组、终端和成像方法
WO2019049193A1 (fr) Module de capteur pour authentification d'empreinte digitale, et dispositif d'authentification d'empreinte digitale
CN207691911U (zh) 光场成像模组
KR20130128226A (ko) 광학 수단 및 영상 촬영 장치
KR20220067081A (ko) 렌즈리스 파노라마 영상 처리 장치
CN115755514A (zh) 一种成像装置和成像方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20894392

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20894392

Country of ref document: EP

Kind code of ref document: A1