WO2023009838A1 - Sensing with liquid crystal polarization holograms and metasurface - Google Patents

Sensing with liquid crystal polarization holograms and metasurface

Info

Publication number
WO2023009838A1
Authority
WO
WIPO (PCT)
Prior art keywords
subpixel
layer
imaging pixels
microlens
image sensor
Prior art date
Application number
PCT/US2022/038909
Other languages
French (fr)
Inventor
Xinqiao Liu
Lu LU
Qing Chao
Junren WANG
Hao Yu
Original Assignee
Meta Platforms Technologies, LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 17/865,223 (published as US20230037261A1)
Application filed by Meta Platforms Technologies, LLC
Priority to EP22769423.9A (EP4378001A1)
Priority to CN202280052057.9A (CN117897815A)
Publication of WO2023009838A1

Classifications

    • H - ELECTRICITY
    • H01 - ELECTRIC ELEMENTS
    • H01L - SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 27/00 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L 27/14 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L 27/144 - Devices controlled by radiation
    • H01L 27/146 - Imager structures
    • H01L 27/14601 - Structural or functional details thereof
    • H01L 27/14625 - Optical elements or arrangements associated with the device
    • H - ELECTRICITY
    • H01 - ELECTRIC ELEMENTS
    • H01L - SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 27/00 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L 27/14 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L 27/144 - Devices controlled by radiation
    • H01L 27/146 - Imager structures
    • H01L 27/14601 - Structural or functional details thereof
    • H01L 27/1462 - Coatings
    • H01L 27/14621 - Colour filter arrangements
    • H - ELECTRICITY
    • H01 - ELECTRIC ELEMENTS
    • H01L - SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 27/00 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L 27/14 - Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L 27/144 - Devices controlled by radiation
    • H01L 27/146 - Imager structures
    • H01L 27/14601 - Structural or functional details thereof
    • H01L 27/14625 - Optical elements or arrangements associated with the device
    • H01L 27/14627 - Microlenses

Definitions

  • This disclosure relates generally to optics, and in particular to sensing applications.
  • Optical components in devices include refractive lenses, diffractive lenses, color filters, neutral density filters, and polarizers.
  • refractive lenses and microlenses are used to focus image light to a sensor.
  • an image sensor comprising: imaging pixels including a first subpixel configured to sense image light and a second subpixel configured to sense the image light; and a patterned liquid crystal polarization hologram (LCPH) layer having microlens regions disposed over the imaging pixels, wherein the microlens regions are configured to focus the image light to the first subpixel and the second subpixel of the imaging pixels.
  • liquid crystals in the patterned LCPH layer may be doped to: pass a first wavelength band of the image light to the first subpixel and not the second subpixel; and pass a second wavelength band of the image light to the second subpixel and not the first subpixel.
  • the microlens regions may have a one-to-one correspondence with subpixels of the imaging pixels.
  • the microlens regions may have a one-to-one correspondence with the imaging pixels.
  • the microlens regions may be rectangular.
  • the image sensor may further comprise: a wavelength filtering layer disposed between a semiconductor layer of the imaging pixels and the patterned LCPH layer.
  • the patterned LCPH layer may include a liquid crystal Pancharatnam-Berry Phase (LC-PBP) design.
  • the patterned LCPH layer may include a polarized volume hologram (PVH) design.
  • the microlens regions of the LCPH layer may have a longest dimension of less than four microns. In some embodiments, the microlens regions of the LCPH layer may each have a longest dimension of four subpixels of the imaging pixels.
  • an image sensor comprising: imaging pixels including a first subpixel configured to sense image light and a second subpixel configured to sense the image light; and a metasurface lens layer having microlens regions disposed over the imaging pixels, wherein the microlens regions are configured to focus the image light to the first subpixel and the second subpixel of the imaging pixels.
  • nanostructures in the metasurface lens layer may be configured to: pass a first polarization orientation to the imaging pixels and reject a second polarization orientation from becoming incident on the imaging pixels, the first polarization orientation different from the second polarization orientation.
  • the nanostructures in the metasurface lens layer may be configured to: pass a first wavelength band of the image light to the first subpixel and not the second subpixel; and pass a second wavelength band of the image light to the second subpixel and not the first subpixel.
  • the metasurface lens layer may include non-symmetric nanostructures.
  • the metasurface lens layer may be polarization-dependent.
  • nanostructures of the metasurface lens layer may be formed of at least one of silicon, silicon-nitride, or titanium-oxide.
  • the microlens regions may have a one-to-one correspondence with subpixels of the imaging pixels.
  • the microlens regions may have a one-to-one correspondence with the imaging pixels.
  • the microlens regions may be rectangular.
  • a camera comprising: an image sensor including a plurality of imaging pixels configured to sense image light; and a lens assembly having a liquid crystal Pancharatnam-Berry Phase (LC-PBP) lens configured to focus the image light to the imaging pixels of the image sensor, wherein the lens assembly is without a refractive lens formed of glass or plastic.
  • the camera may further comprise: a circular polarizer layer, wherein the LC-PBP lens is disposed between the circular polarizer layer and the imaging pixels.
  • FIG. 1 illustrates a sensing system that includes a patterned liquid crystal polarization holograms (LCPH) layer having microlens regions configured to focus image light to different subpixels of an image sensor, in accordance with aspects of the disclosure.
  • FIGs. 2A-2B illustrate an example imaging pixel that includes subpixels, in accordance with aspects of the disclosure.
  • FIG. 3 illustrates a patterned LCPH layer arranged with microlens regions to be disposed over subpixels, in accordance with aspects of the disclosure.
  • FIG. 4 illustrates a patterned LCPH layer including microlens regions having a one-to-one correspondence with subpixels of the imaging pixels, in accordance with aspects of the disclosure.
  • FIG. 5 illustrates an example imaging system including an image pixel array, in accordance with aspects of the disclosure.
  • FIG. 6 illustrates a sensing system that includes a metasurface lens layer having microlens regions configured to focus image light to different subpixels of an image sensor, in accordance with aspects of the disclosure.
  • FIGs. 7A-7B illustrate an example imaging pixel that includes subpixels, in accordance with aspects of the disclosure.
  • FIG. 8 illustrates a metasurface lens layer arranged with microlens regions to be disposed over subpixels, in accordance with aspects of the disclosure.
  • FIG. 9 illustrates a metasurface lens layer including microlens regions having a one-to-one correspondence with subpixels of the imaging pixels, in accordance with aspects of the disclosure.
  • FIG. 10 illustrates an imaging system that may include a metasurface lens layer, in accordance with aspects of the disclosure.
  • FIGs. 11A-14B illustrate various nanostructures that may be included in microlens regions of a metasurface lens layer, in accordance with aspects of the disclosure.
  • FIGs. 15A-15B illustrate an example imaging pixel having a multi-functional microlens layer, in accordance with aspects of the disclosure.
  • FIG. 16 illustrates a camera system that includes a lens assembly having a diffractive focusing element, in accordance with aspects of the disclosure.
  • FIGs. 17A-17C illustrate an example process for fabricating an LCPH, in accordance with aspects of the disclosure.
  • Embodiments of liquid crystal polarization holograms (LCPH) and metasurfaces in imaging contexts are described herein.
  • numerous specific details are set forth to provide a thorough understanding of the embodiments.
  • One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc.
  • well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
  • visible light may be defined as having a wavelength range of approximately 380 nm - 700 nm.
  • Non-visible light may be defined as light having wavelengths that are outside the visible light range, such as ultraviolet light and infrared light.
  • Infrared light having a wavelength range of approximately 700 nm - 1 mm includes near-infrared light.
  • near-infrared light may be defined as having a wavelength range of approximately 700 nm - 1.6 μm.
  • the term “transparent” may be defined as having greater than 90% transmission of light. In some aspects, the term “transparent” may be defined as a material having greater than 90% transmission of visible light.
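As a concrete illustration of the band definitions above, the short Python sketch below classifies a wavelength into these approximate bands. The helper name and cutoff handling are illustrative assumptions, not part of the disclosure; only the band boundaries come from the definitions above.

```python
def classify_band(wavelength_nm: float) -> str:
    """Classify a wavelength (nm) into the approximate bands defined in
    this disclosure: visible is ~380-700 nm, near-infrared is
    ~700 nm - 1.6 um, and infrared extends from ~700 nm to 1 mm."""
    if wavelength_nm < 380:
        return "ultraviolet (non-visible)"
    if wavelength_nm <= 700:
        return "visible"
    if wavelength_nm <= 1600:
        return "near-infrared (non-visible)"
    if wavelength_nm <= 1_000_000:  # 1 mm = 1,000,000 nm
        return "infrared (non-visible)"
    return "beyond infrared"

print(classify_band(550))  # visible
print(classify_band(940))  # near-infrared (non-visible)
```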
  • metasurfaces configured with the functionality of one or more of polarizers, wavelength filters, and/or microlenses to be used to replace conventional optical components in sensing applications (e.g. an image sensor).
  • FIG. 1 illustrates an imaging system 100 that includes a patterned LCPH layer 141 having microlens regions configured to focus image light to different subpixels of an image pixel array 102, in accordance with implementations of the disclosure.
  • Imaging system 100 includes a focusing element 115 having a focal length 116, and an image pixel array 102.
  • Image pixel array 102 may be implemented as a complementary metal-oxide semiconductor (CMOS) image sensor, in some implementations.
  • Focusing element 115 focuses image light scattered/reflected from object 110 to image pixel array 102.
  • Image pixel array 102 includes a plurality of imaging pixels such as imaging pixel 151.
  • Example imaging pixel 151 includes a first subpixel 111A and a second subpixel 111B.
  • a filter layer 137 may include different wavelength filters (e.g. red, green, blue, infrared, near-infrared, ultraviolet) to filter the wavelength of image light that propagates to a semiconductor layer 145 (e.g. silicon) of imaging pixel 151.
  • a spacer layer 135 may be disposed between patterned LCPH layer 141 and filter layer 137. Spacer layer 135 may include Deep Trench Isolation (DTI) or Buried Shielding Metal (BSM), for example.
  • Patterned LCPH layer 141 includes microlens regions for focusing the image light to subpixels 111A and 111B.
  • a circular polarizer 118 may be disposed between focusing element 115 and image pixel array 102 to provide circularly polarized image light to patterned LCPH layer 141 that covers all or a portion of image pixel array 102.
  • a patterned LCPH layer 141 may be disposed over image pixel array 102 where the patterned LCPH layer 141 includes various microlens regions that are disposed over each imaging pixel with a one-to-one correspondence.
  • patterned LCPH layer 141 includes microlens regions having a one-to-one correspondence with subpixels (e.g. 111A and 111B) of the imaging pixels 151.
  • FIG. 2A illustrates a side view of an example imaging pixel 252 that includes subpixels 252A and 252B, in accordance with implementations of the disclosure.
  • FIG. 2A illustrates image light 290 encountering patterned LCPH layer 241. Patterned LCPH layer 241 is disposed over the imaging pixels (e.g. imaging pixel 252) of an image pixel array (e.g. image pixel array 102).
  • Example imaging pixel 252 includes subpixel 252A and subpixel 252B in FIG. 2A.
  • FIG. 2B illustrates a perspective view of example imaging pixel 252 including four subpixels 252A, 252B, 252C, and 252D.
  • FIG. 2A shows subpixel 252A as a monochrome subpixel that measures/senses the intensity of visible light while subpixel 252B is illustrated as an infrared subpixel.
  • Subpixel 252B may be configured to measure/sense a particular band of infrared light (e.g. 800 nm light) or to measure/sense broad-spectrum infrared light having a wavelength longer than visible red light.
  • Subpixel 252A includes a filter 238 in filter layer 237 that may be configured to pass visible light (while rejecting/blocking non-visible light) to a sensing region 211A in a semiconductor layer 245 of imaging pixel 252.
  • Filter 238 may be an infrared stop filter that blocks/rejects infrared light while passing visible light.
  • Subpixel 252B includes a filter 239 that may be configured to pass broad-spectrum infrared light (e.g. ~700 nm - 3000 nm) or narrow-band infrared light (e.g. 795 nm to 805 nm) to sensing region 211B in a semiconductor layer 245 of imaging pixel 252.
  • Sensing region 211A of subpixel 252A and sensing region 211B of subpixel 252B may be implemented in doped silicon included in semiconductor layer 245. Deep Trench Isolation (DTI) and metal layers may separate the sensing regions of different subpixels.
  • Each subpixel of imaging pixel 252 may be 1-2 microns.
  • FIG. 2A shows that layer 235 may be disposed between LCPH layer 241 and filter layer 237.
  • Layer 235 may include DTI and/or back side metal (BSM).
  • Optional oxide layer 243 is shown disposed between semiconductor layer 245 and filter layer 237.
  • Oxide layer 243 may include silicon dioxide.
  • Oxide layer 243 may include BSM.
  • the BSM may include tungsten and one or more layers of titanium-nitride.
  • Wavelength filtering layer 237 may include example filters 238 and 239 in addition to other wavelength filters disposed above subpixels 252C and 252D.
  • Layer 237 is disposed between semiconductor layer 245 and patterned LCPH layer 241.
  • layers 235, 237, 243, and 245 may extend across many imaging pixels in an image pixel array even though FIG. 2A only illustrates a single imaging pixel 252.
  • FIG. 2B illustrates that example imaging pixel 252 includes four subpixels, where two of the subpixels (252A and 252C) may be mono subpixels configured to measure/sense an intensity of image light 290 and one subpixel (252B) may be configured to measure/sense infrared light.
  • Other wavelength filters and different filter patterns may also be used in imaging pixels, in accordance with implementations of the disclosure.
  • Subpixel 252D may include a red, green, or blue filter that passes red, green, or blue light to a sensing region of subpixel 252D.
  • imaging pixels include a red-green-green-blue (RGGB) filter pattern, for example.
  • patterned LCPH layer 241 includes microlens regions configured to focus image light 290 to subpixels 252A, 252B, 252C, and 252D.
  • patterned LCPH layer 241 is implemented with a liquid crystal Pancharatnam-Berry Phase (LC-PBP) design.
  • patterned LCPH layer 241 includes a polarized volume hologram (PVH) design.
  • the liquid crystals in patterned LCPH layer 241 are doped with color dye to provide a filter for specific wavelength bands or to block specific wavelengths.
  • Certain microlens regions of patterned LCPH layer 241 may be doped to pass a particular wavelength band of image light 290 while other microlens regions of patterned LCPH layer 241 may be doped to pass other (different) wavelength band(s) of image light 290, for example.
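The LC-PBP design mentioned above lends itself to a worked example. In a PBP element, the imparted geometric phase is twice the local in-plane director angle θ, so a paraxial lens phase of πr²/(λf) corresponds to θ(r) = πr²/(2λf). The NumPy sketch below computes such a director pattern for one microlens region; the textbook phase profile and all dimensions (design wavelength, focal length, region size) are assumptions, since this extract does not give them.

```python
import numpy as np

# Minimal sketch (assumptions noted above) of the liquid crystal
# director pattern for one LC-PBP microlens region.  A PBP element
# imparts a geometric phase of +/-2*theta, where theta is the local
# in-plane director angle, so a paraxial lens phase of
# pi*r^2/(lambda*f) maps to theta(r) = pi*r^2 / (2*lambda*f).
wavelength = 550e-9    # assumed design wavelength (550 nm)
focal_length = 5e-6    # assumed microlens focal length (5 um)
region = 2e-6          # assumed region span (~2 um, about one subpixel)

n = 64                 # samples across the region
x = np.linspace(-region / 2, region / 2, n)
xx, yy = np.meshgrid(x, x)
r2 = xx**2 + yy**2

theta = np.pi * r2 / (2 * wavelength * focal_length)  # director angle (rad)
theta = np.mod(theta, np.pi)  # the LC director is pi-periodic

print(theta.shape, float(theta.max()))  # (64, 64) and max angle in radians
```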
  • FIG. 3 illustrates a patterned LCPH layer 301 arranged with microlens regions to be disposed over subpixels, in accordance with aspects of the disclosure.
  • microlens region 341 is illustrated as a round or oval microlens region configured to focus image light to four subpixels 01, 02, 05, and 06 of imaging pixel 352.
  • microlens region 341 has a one-to-one correspondence with imaging pixel 352.
  • patterned LCPH layer 301 is patterned to have a microlens region to focus image light (e.g. image light 290) for each imaging pixel.
  • Subpixel 01 is configured to measure/sense infrared light
  • subpixel 02 is configured to measure/sense the intensity of visible light
  • subpixel 05 is configured to measure/sense the intensity of visible light
  • subpixel 06 is configured to measure/sense red, green, or blue light.
  • the different subpixels may be configured to sense different wavelengths of light by changing the filters in the filter layer of the subpixel.
  • Microlens region 342 is disposed over imaging pixel 353.
  • Microlens region 342 is illustrated as a round or oval microlens region configured to focus image light to four subpixels 03, 04, 07, and 08 of imaging pixel 353.
  • microlens region 342 has a one-to-one correspondence with imaging pixel 353.
  • Subpixel 03 is configured to measure/sense infrared light
  • subpixel 04 is configured to measure/sense the intensity of visible light
  • subpixel 07 is configured to measure/sense the intensity of visible light
  • subpixel 08 is configured to measure/sense red, green, or blue light.
  • an oval, circular, or rectangular microlens region of patterned LCPH layer 301 may be disposed over subpixels 09, 10, 13, and 14 of imaging pixel 354. And, an oval, circular, or rectangular microlens region of patterned LCPH layer 301 may be disposed over subpixels 11, 12, 15, and 16 of imaging pixel 355.
  • Patterned LCPH layer 301 may be a contiguous layer.
  • a contiguous patterned LCPH layer 301 may cover an entire image pixel array where the image pixel array includes thousands or millions of imaging pixels.
  • FIG. 4 illustrates a patterned LCPH layer 401 including microlens regions having a one-to-one correspondence with subpixels of the imaging pixels, in accordance with aspects of the disclosure.
  • microlens region 441 is illustrated as a round or oval microlens region configured to focus image light to one subpixel (subpixel 01).
  • Microlens region 442 is configured to focus image light to subpixel 02
  • microlens region 443 is configured to focus image light to subpixel 05
  • microlens region 444 is configured to focus image light to subpixel 06.
  • microlens regions of patterned LCPH layer 401 have a one-to-one correspondence with subpixels of imaging pixel 452 that includes subpixels 01, 02, 05, and 06.
  • microlens regions 445, 446, 447, and 448 of patterned LCPH layer 401 have a one-to-one correspondence to subpixels 09, 10, 13, and 14, respectively, of imaging pixel 454.
  • patterned LCPH layer 401 is patterned to have a microlens region to focus image light (e.g. image light 290) for each subpixel in FIG. 4.
  • subpixel 01 is configured to measure/sense infrared light
  • subpixel 02 is configured to measure/sense the intensity of visible light
  • subpixel 05 is configured to measure/sense the intensity of visible light
  • subpixel 06 is configured to measure/sense red, green, or blue light.
  • the different subpixels may be configured to sense different wavelengths of light by changing the filters in the filter layer of the subpixel.
  • an oval, circular, or rectangular microlens region of patterned LCPH layer 401 may be disposed (individually) over subpixels 03, 04, 07, and 08 of imaging pixel 453.
  • an oval, circular, or rectangular microlens region of patterned LCPH layer 401 may be disposed (individually) over subpixels 11, 12, 15, and 16 of imaging pixel 455.
  • the microlens regions of patterned LCPH layer 401 are configured to pass different wavelengths of light.
  • microlens region 441 may be doped with a color dye that functions as a filter and microlens region 442 may be doped with a different color dye that functions as a filter that filters different wavelengths of light.
  • a first microlens region may be configured to pass a first wavelength band of image light to a first subpixel and not a second subpixel and a second microlens region may be configured to pass a second (different) wavelength band of image light to the second subpixel and not the first subpixel.
  • Patterned LCPH layer 401 may be a contiguous layer.
  • a contiguous patterned LCPH layer 401 may cover an entire image pixel array where the image pixel array includes thousands or millions of imaging pixels.
  • Patterned LCPH layer 301 and/or patterned LCPH layer 401 may be considered “patterned” because the microlens regions are arranged, sized, and/or configured to focus image light onto subpixels or imaging pixels of an image pixel array.
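The two correspondence schemes of FIGs. 3 and 4 can be summarized with a small indexing sketch. The helper below is illustrative only; the numbering simply mirrors the row-major subpixel labels 01-16 used in the figures, and the function name is an invention for this example.

```python
# Sketch of the per-pixel correspondence of FIG. 3: one microlens region
# per imaging pixel, covering a 2x2 group of subpixels.  (FIG. 4's
# per-subpixel scheme is the identity mapping, one region per subpixel.)
SUBPIXELS_PER_ROW = 4  # matches the 16-subpixel layout in FIGs. 3-4

def subpixels_for_pixel_region(pixel_row: int, pixel_col: int) -> list[int]:
    """Subpixel indices covered by one per-pixel microlens region."""
    base = 2 * pixel_row * SUBPIXELS_PER_ROW + 2 * pixel_col + 1
    return [base, base + 1,
            base + SUBPIXELS_PER_ROW, base + SUBPIXELS_PER_ROW + 1]

# Imaging pixel 352 (top-left): microlens region 341 covers 01, 02, 05, 06.
print(subpixels_for_pixel_region(0, 0))  # [1, 2, 5, 6]
# Imaging pixel 353 (top-right): microlens region 342 covers 03, 04, 07, 08.
print(subpixels_for_pixel_region(0, 1))  # [3, 4, 7, 8]
```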
  • FIG. 5 illustrates an imaging system 500 including an image pixel array 502, in accordance with aspects of the disclosure. All or portions of imaging system 500 may be included in an image sensor, in some implementations.
  • Imaging system 500 includes control logic 508, processing logic 512, and image pixel array 502.
  • Image pixel array 502 may be arranged in rows and columns where integer y is the number of rows and integer x is the number of columns.
  • the image pixel array 502 may have a total of n pixels (P), and integer n may be the product of integer x and integer y. In some implementations, n is over one million imaging pixels.
  • Each imaging pixel may include subpixels described in the disclosure.
  • control logic 508 drives image pixel array 502 to capture an image.
  • Image pixel array 502 may be configured to have a global shutter or a rolling shutter, for example.
  • Each subpixel may be configured in a 3-transistor (3T) or 4-transistor (4T) readout circuit configuration.
  • Processing logic 512 is configured to receive the imaging signals from each subpixel. Processing logic 512 may perform further operations such as subtracting or adding some imaging signals from other imaging signals to generate image 515.
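The disclosure states only that processing logic may subtract or add imaging signals; the sketch below shows one plausible operation of that kind, removing an assumed infrared leakage term from monochrome subpixel signals. The leakage coefficient and array dimensions are invented for illustration (note n = x·y imaging pixels, as described above).

```python
import numpy as np

# Hedged sketch of per-pixel arithmetic of the kind processing logic 512
# might perform.  Removing estimated infrared leakage from monochrome
# subpixels is an assumed example, not a method stated in this extract.
rng = np.random.default_rng(0)
rows, cols = 480, 640                           # y rows, x columns; n = x*y
mono = rng.uniform(0.0, 1.0, (rows, cols))      # mono-subpixel readout
infrared = rng.uniform(0.0, 0.2, (rows, cols))  # IR-subpixel readout

ir_leak = 0.1  # assumed leakage coefficient (placeholder)
image = np.clip(mono - ir_leak * infrared, 0.0, 1.0)
print(image.shape)  # (480, 640): one output value per imaging pixel
```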
  • Aspects of a patterned LCPH layer in accordance with FIGs. 1-4 of this disclosure may be disposed over image pixel array 502 in imaging system 500.
  • patterned LCPH layer 301 is disposed over image pixel array 502 in imaging system 500.
  • patterned LCPH layer 401 is disposed over image pixel array 502 in imaging system 500.
  • FIG. 6 illustrates an imaging system 600 that includes a metasurface lens layer 641 having microlens regions configured to focus image light to different subpixels of an image pixel array 602, in accordance with implementations of the disclosure.
  • Imaging system 600 includes a focusing element 115 having a focal length 116, and an image pixel array 602.
  • Image pixel array 602 may be implemented as a complementary metal-oxide semiconductor (CMOS) image sensor, in some implementations.
  • Focusing element 115 focuses image light scattered/reflected from object 110 to image pixel array 602.
  • Image pixel array 602 includes a plurality of imaging pixels such as imaging pixel 651.
  • Imaging pixel 651 includes a first subpixel 611A and a second subpixel 611B.
  • a filter layer 137 may include different wavelength filters (e.g. red, green, blue, infrared, near-infrared) to filter the wavelength of image light that propagates to a semiconductor layer 645 (e.g. silicon) of imaging pixel 651.
  • a spacer layer 135 may be disposed between metasurface lens layer 641 and filter layer 137. Spacer layer 135 may include Deep Trench Isolation (DTI) or Buried Shielding Metal (BSM), for example.
  • Metasurface lens layer 641 includes microlens regions for focusing the image light to subpixels 611A and 611B.
  • a circular polarizer 118 may be disposed between focusing element 115 and image pixel array 602 to provide circularly polarized image light to metasurface lens layer 641.
  • a metasurface lens layer 641 may be disposed over image pixel array 602 where the metasurface lens layer 641 includes various microlens regions that are disposed over each imaging pixel with a one-to-one correspondence.
  • metasurface lens layer 641 includes microlens regions having a one-to-one correspondence with subpixels (e.g. 611A and 611B) of the imaging pixels 651.
  • FIG. 7A illustrates a side view of an example imaging pixel 752 that includes subpixels 752A and 752B, in accordance with implementations of the disclosure.
  • FIG. 7A illustrates image light 290 encountering metasurface lens layer 741.
  • Metasurface lens layer 741 is disposed over the imaging pixels (e.g. imaging pixel 752) of an image pixel array (e.g. image pixel array 602).
  • Example imaging pixel 752 includes subpixel 752A and subpixel 752B in FIG. 7A.
  • FIG. 7B illustrates a perspective view of example imaging pixel 752 including four subpixels 752A, 752B, 752C, and 752D.
  • FIG. 7A shows subpixel 752A as a monochrome subpixel that measures/senses the intensity of visible light while subpixel 752B is illustrated as an infrared subpixel.
  • Subpixel 752B may be configured to measure/sense a particular band of infrared light (e.g. 800 nm light) or to measure/sense broad-spectrum infrared light having a wavelength longer than visible red light.
  • Subpixel 752A includes a filter 238 in filter layer 237 that may be configured to pass visible light (while rejecting/blocking non-visible light) to a sensing region 711A in a semiconductor layer 745 of imaging pixel 752.
  • Subpixel 752B includes a filter 239 that may be configured to pass broad-spectrum infrared light (e.g. ~700 nm - 3000 nm) or narrow-band infrared light (e.g. 795 nm to 805 nm) to sensing region 711B in a semiconductor layer 745 of imaging pixel 752.
  • Sensing region 711A of subpixel 752A and sensing region 711B of subpixel 752B may be implemented in doped silicon included in semiconductor layer 745. Deep Trench Isolation (DTI) and metal layers may separate the sensing regions of different subpixels.
  • Each subpixel of imaging pixel 752 may be 1-2 microns.
  • FIG. 7A shows that layer 235 may be disposed between metasurface lens layer 741 and filter layer 237.
  • Layer 235 may include DTI and/or back side metal (BSM).
  • Optional oxide layer 243 is shown disposed between semiconductor layer 745 and filter layer 237.
  • Wavelength filtering layer 237 may include example filters 238 and 239.
  • Layer 237 is disposed between semiconductor layer 745 and metasurface lens layer 741.
  • layers 235, 237, 243, and 745 may extend across many imaging pixels in an image pixel array even though FIG. 7A only illustrates a single imaging pixel 752.
  • FIG. 7B illustrates that example imaging pixel 752 includes four subpixels, where two of the subpixels (752A and 752C) may be mono subpixels configured to measure/sense an intensity of image light 290 and one subpixel (752B) may be configured to measure/sense infrared light.
  • Subpixel 752D may include a red, green, or blue filter that passes red, green, or blue light to a sensing region of subpixel 752D.
  • imaging pixels include a red-green-green-blue (RGGB) filter pattern, for example.
  • metasurface lens layer 741 includes microlens regions configured to focus image light 290 to subpixels 752A, 752B, 752C, and 752D.
  • FIGs. 11A-14B illustrate various nanostructures that may be included in microlens regions of a metasurface lens layer, in accordance with aspects of the disclosure.
  • FIG. 11A illustrates a top view of a circular nanostructure 1101 and FIG. 11B illustrates a perspective view of nanostructure 1101 having a height H1.
  • FIG. 12A illustrates a top view of an oval nanostructure 1201 and FIG. 12B illustrates a perspective view of nanostructure 1201 having a height H2.
  • FIG. 13A illustrates a top view of a rectangular nanostructure 1301 and FIG. 13B illustrates a perspective view of nanostructure 1301 having a height H3.
  • FIG. 14A illustrates a top view of a square nanostructure 1401 and FIG. 14B illustrates a perspective view of nanostructure 1401 having a height H4.
  • the microlens regions of the metasurface lens layers may include two-dimensional or one-dimensional nanostructures.
  • the height of the nanostructures may be between 50 nm and approximately two microns, for example.
  • the width or length of the nanostructures may be between 10 nm and approximately 500 nm, in some implementations.
  • the nanostructures may be formed out of a planar substrate (e.g. silicon, silicon-nitride, or titanium-oxide) using a subtractive process (e.g. photolithography and/or etching).
  • the resulting metasurface lens layer having microlens regions may be considered a flat optical component with only minor thickness variations (the height/thickness of the various nanostructures of the metasurface).
  • the metasurface design of each microlens region of the metasurface lens layer may be configured to focus image light to subpixels of an image pixel array.
  • the metasurface design of the microlens regions may be further configured to pass/transmit or block/reject particular wavelengths and/or polarization orientations of image light 290.
  • wavelength filtering features of a metasurface are formed using metasurface-based subtractive color filter fabrication techniques, which may include forming the metasurface on a glass wafer using CMOS processing.
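To make the focusing role of a microlens region concrete, the sketch below computes a standard hyperbolic metalens phase profile for one region and maps it onto nanopillar widths within the 10-500 nm range given above. The phase equation is the common textbook metalens design, not a formula taken from this extract; the wavelength, focal length, pitch, and the linear phase-to-width lookup are all assumptions, since a real mapping would come from full-wave simulation of the chosen material (silicon, silicon-nitride, or titanium-oxide).

```python
import numpy as np

# Sketch of a target focusing phase for one metasurface microlens region,
# mapped onto hypothetical nanopillar widths.
wavelength = 940e-9   # assumed near-infrared design wavelength
focal = 4e-6          # assumed microlens focal length
pitch = 250e-9        # assumed nanopillar pitch (within 10-500 nm)
n = 8                 # pillars across a ~2 um subpixel-sized region

x = (np.arange(n) - (n - 1) / 2) * pitch
xx, yy = np.meshgrid(x, x)

# Standard hyperbolic metalens phase, wrapped to [0, 2*pi).
phase = (2 * np.pi / wavelength) * (focal - np.sqrt(xx**2 + yy**2 + focal**2))
phase = np.mod(phase, 2 * np.pi)

# Entirely hypothetical monotone phase-to-width mapping (100-400 nm).
widths = 100e-9 + (phase / (2 * np.pi)) * 300e-9
print(float(widths.min()), float(widths.max()))
```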
  • FIG. 8 illustrates a metasurface lens layer 801 arranged with microlens regions to be disposed over subpixels, in accordance with aspects of the disclosure.
  • microlens region 841 is illustrated as a round or oval microlens region configured to focus image light to four subpixels 01, 02, 05, and 06 of imaging pixel 852.
  • microlens region 841 has a one-to-one correspondence with imaging pixel 852.
  • Subpixel 01 is configured to measure/sense infrared light
  • subpixel 02 is configured to measure/sense the intensity of visible light
  • subpixel 05 is configured to measure/sense the intensity of visible light
  • subpixel 06 is configured to measure/sense red, green, or blue light.
  • the different subpixels may be configured to sense different wavelengths of light by changing the filters in the filter layer of the subpixel.
  • Microlens region 842 is disposed over imaging pixel 853.
  • Microlens region 842 is illustrated as a round or oval microlens region configured to focus image light to four subpixels 03, 04, 07, and 08 of imaging pixel 853.
  • microlens region 842 has a one-to-one correspondence with imaging pixel 853.
  • Subpixel 03 is configured to measure/sense infrared light
  • subpixel 04 is configured to measure/sense the intensity of visible light
  • subpixel 07 is configured to measure/sense the intensity of visible light
  • subpixel 08 is configured to measure/sense red, green, or blue light.
  • an oval, circular, or rectangular microlens region of metasurface lens layer 801 may be disposed over subpixels 09, 10, 13, and 14 of imaging pixel 854. And, an oval, circular, or rectangular microlens region of metasurface lens layer 801 may be disposed over subpixels 11, 12, 15, and 16 of imaging pixel 855.
  • Metasurface lens layer 801 may be a contiguous layer.
  • a contiguous metasurface lens layer 801 may cover an entire image pixel array where the image pixel array includes thousands or millions of imaging pixels.
  • FIG. 9 illustrates a metasurface lens layer 901 including microlens regions having a one-to-one correspondence with subpixels of the imaging pixels, in accordance with aspects of the disclosure.
  • microlens region 941 is illustrated as a round or oval microlens region configured to focus image light to one subpixel (subpixel 01).
  • Microlens region 942 is configured to focus image light to subpixel 02
  • microlens region 943 is configured to focus image light to subpixel 05
  • microlens region 944 is configured to focus image light to subpixel 06.
  • microlens regions of metasurface lens layer 901 have a one-to-one correspondence with subpixels of imaging pixel 952 that includes subpixels 01, 02, 05, and 06.
  • microlens regions 945, 946, 947, and 948 of metasurface lens layer 901 have a one-to-one correspondence to subpixels 09, 10, 13, and 14, respectively, of imaging pixel 954.
  • metasurface lens layer 901 is configured to have a microlens region to focus image light (e.g. image light 290) for each subpixel in FIG. 9.
  • subpixel 01 is configured to measure/sense infrared light
  • subpixel 02 is configured to measure/sense the intensity of visible light
  • subpixel 05 is configured to measure/sense the intensity of visible light
  • subpixel 06 is configured to measure/sense red, green, or blue light.
  • the different subpixels may be configured to sense different wavelengths of light by changing the filters in the filter layer of the subpixel.
  • an oval, circular, or rectangular microlens region of metasurface lens layer 901 may be disposed (individually) over subpixels 03, 04, 07, and 08 of imaging pixel 953.
  • an oval, circular, or rectangular microlens region of metasurface lens layer 901 may be disposed (individually) over subpixels 11, 12, 15, and 16 of imaging pixel 955.
  • Metasurface lens layer 901 may be a contiguous layer.
  • a contiguous metasurface lens layer 901 may cover an entire image pixel array where the image pixel array includes thousands or millions of imaging pixels.
  • the metasurface lens layers of the disclosure are polarization-dependent.
  • nanostructures in the metasurface lens layers of the disclosure are configured to pass/transmit a first polarization orientation to the imaging pixels and reject/block a second (different) polarization orientation from becoming incident on the imaging pixels.
  • the first polarization orientation is orthogonal to the second polarization orientation.
  • nanostructures in the metasurface lens layer are configured to pass right-hand circularly polarized light and block left-hand circularly polarized light.
  • nanostructures in the metasurface lens layer are configured to pass left-hand circularly polarized light and block right-hand circularly polarized light.
  • nanostructures in the metasurface lens layer are configured to: (1) pass a first wavelength band of image light to a first subpixel and not a second subpixel; and (2) pass a second (different) wavelength band of the image light to the second subpixel and not the first subpixel.
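The polarization selectivity described above can be illustrated with Jones calculus. An ideal LC-PBP or polarization-selective metasurface acts locally like a half-wave plate whose axis angle varies across the aperture: it flips circular handedness and imprints the geometric phase exp(-i2θ) or exp(+i2θ), depending on the input handedness. The sketch below verifies this; the sign convention for circular polarization is one common choice, not the patent's specification.

```python
import numpy as np

# Jones-calculus sketch of geometric-phase polarization behavior.
def half_wave_plate(theta: float) -> np.ndarray:
    """Jones matrix of an ideal half-wave plate with fast axis at theta."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[c, s], [s, -c]], dtype=complex)

rcp = np.array([1, -1j]) / np.sqrt(2)  # right-circular (one convention)
lcp = np.array([1,  1j]) / np.sqrt(2)  # left-circular

theta = 0.3
out = half_wave_plate(theta) @ rcp
# The output is LCP carrying the geometric phase exp(-i*2*theta):
print(np.allclose(out, np.exp(-2j * theta) * lcp))  # True
```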
  • FIG. 10 illustrates an imaging system 1000 that may include a metasurface lens layer, in accordance with aspects of the disclosure. All or portions of imaging system 1000 may be included in an image sensor, in some implementations.
  • Imaging system 1000 includes control logic 1008, processing logic 1012, and image pixel array 1002.
  • Image pixel array 1002 may be arranged in rows and columns where integer y is the number of rows and integer x is the number of columns.
  • the image pixel array 1002 may have a total of n pixels (P), and integer n may be the product of integer x and integer y. In some implementations, n is over one million imaging pixels.
  • Each imaging pixel may include subpixels described in the disclosure.
  • control logic 1008 drives image pixel array 1002 to capture an image.
  • Image pixel array 1002 may be configured to have a global shutter or a rolling shutter, for example.
  • Each subpixel may be configured in a 3-transistor (3T) or 4-transistor (4T) readout circuit configuration.
  • Processing logic 1012 is configured to receive the imaging signals from each subpixel.
  • Processing logic 1012 may perform further operations such as subtracting or adding some imaging signals from other imaging signals to generate image 1015.
  • Aspects of a metasurface lens layer in accordance with FIGs. 6-9 of this disclosure may be disposed over image pixel array 1002 in imaging system 1000.
  • metasurface lens layer 801 is disposed over image pixel array 1002 in imaging system 1000.
  • metasurface lens layer 901 is disposed over image pixel array 1002 in imaging system 1000.
  • FIG. 15A illustrates a side view of an example imaging pixel 1552 having a multi-functional microlens layer 1541, in accordance with implementations of the disclosure.
  • Multi-functional microlens layer 1541 may function as (1) a microlens and polarizer; (2) a microlens and a wavelength filter; or (3) a combination of microlens, polarizer, and wavelength filter.
  • an image sensor may become smaller and less expensive, and may require fewer fabrication steps in the manufacturing process.
  • Multi-functional microlens layer 1541 may be implemented as a patterned LCPH layer.
  • multi-functional microlens layer 1541 is implemented as a patterned LCPH layer designed as an LC-PBP. If the patterned LCPH layer includes wavelength filtering functionality, liquid crystals in the patterned LCPH layer may be doped with color dye to provide a filter for specific wavelength bands or to block specific wavelengths. Multi-functional microlens layer 1541 may be implemented as a metasurface, in accordance with implementations previously described in this disclosure.
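One way to reason about such a multi-functional layer is as a chain of the three roles named above (polarizer, wavelength filter, lens). The toy model below applies an assumed polarization gate and pass-band per microlens region; the specific handedness and band are invented for illustration, and the focusing function is geometric rather than modeled here.

```python
from dataclasses import dataclass

# Toy model (assumptions only) of one multi-functional microlens region.
@dataclass
class Ray:
    wavelength_nm: float
    handedness: str      # "RCP" or "LCP"
    intensity: float

def region_response(ray: Ray, passband=(700, 1600), pass_hand="RCP") -> Ray:
    """Apply an assumed polarizer + wavelength filter for one region."""
    blocked = ray.handedness != pass_hand or not (
        passband[0] <= ray.wavelength_nm <= passband[1])
    return Ray(ray.wavelength_nm, ray.handedness,
               0.0 if blocked else ray.intensity)

print(region_response(Ray(850, "RCP", 1.0)))  # passes: NIR, right-handed
print(region_response(Ray(550, "RCP", 1.0)))  # blocked: visible light
```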
  • FIG. 15A illustrates image light 290 encountering multi-functional microlens layer 1541.
  • Multi-functional microlens layer 1541 is disposed over the imaging pixels (e.g. imaging pixel 1552) of an image pixel array.
  • Example imaging pixel 1552 includes subpixel 1552A and subpixel 1552B in FIG. 15A.
  • FIG. 15B illustrates a perspective view of example imaging pixel 1552 including four subpixels 1552A, 1552B, 1552C, and 1552D.
  • FIG. 15A shows subpixel 1552A as a monochrome subpixel that measures/senses the intensity of visible light while subpixel 1552B is illustrated as an infrared subpixel.
  • the light filtering functionality is provided by a microlens region of multi-functional microlens layer 1541.
  • Subpixel 1552B may be configured to measure/sense a particular band of infrared light (e.g. 800 nm light) or to measure/sense broad-spectrum infrared light having a wavelength longer than visible red light. Again, this light filtering functionality is provided by a microlens region of multi-functional microlens layer 1541.
  • Sensing region 1511A of subpixel 1552A and sensing region 1511B of subpixel 1552B may be implemented in doped silicon included in semiconductor layer 1545. Deep Trench Isolation (DTI) and metal layers may separate the sensing regions of different subpixels.
  • Each subpixel of imaging pixel 1552 may be 1-2 microns.
  • FIG. 15B illustrates that example imaging pixel 1552 includes four subpixels, where two of the subpixels (1552A and 1552C) may be mono subpixels configured to measure/sense an intensity of image light 290 and one subpixel (1552B) may be configured to measure/sense infrared light.
  • Subpixel 1552D may be configured to measure/sense red, green, or blue light propagating to a sensing region of subpixel 1552D.
  • the filtering functionality for each subpixel may be provided by a microlens region of multi-functional microlens layer 1541 disposed above the particular subpixel.
  • polarization filtering for each subpixel may also be provided by a microlens region of multi-functional microlens layer 1541 disposed above the particular subpixel.
  • FIG. 16 illustrates a camera system 1600 that includes a lens assembly 1633 having a diffractive focusing element 1615, in accordance with implementations of the disclosure.
  • Diffractive focusing element 1615 may be implemented as an LC-PBP lens or a metasurface lens.
  • Diffractive focusing element 1615 has a focal length 116 to focus image light onto image pixel array 1602.
  • Lens assembly 1633 may include diffractive focusing element 1615 and circular polarizer 118.
  • Circular polarizer 118 may pass a particular handed polarization orientation (e.g. right-hand polarized or left-hand polarized light) to diffractive focusing element 1615.
  • Image pixel array 1602 may be implemented as a CMOS image sensor, in some implementations.
  • Lens assembly 1633 is configured to focus image light scattered/reflected from object 110 to imaging pixels of image pixel array 1602.
  • Image pixel array 1602 includes a plurality of imaging pixels such as imaging pixel 1651.
  • Imaging pixel 1651 includes a first subpixel 1611A and a second subpixel 1611B.
  • a filter layer 137 may include different wavelength filters (e.g. red, green, blue, infrared, near-infrared) to filter the wavelength of image light that propagates to a semiconductor layer 1645 (e.g. silicon) of imaging pixel 1651.
  • a spacer layer 135 may be disposed between microlens layer 1641 and filter layer 137. Spacer layer 135 may include Deep Trench Isolation (DTI) or Buried Shielding Metal (BSM), for example.
  • Microlens layer 1641 includes microlens regions for focusing the image light to subpixels 1611A and 1611B. Microlens layer 1641 may be implemented as a patterned LCPH or a metasurface layer having the features described above in this disclosure.
  • FIGs. 17A-17C illustrate an example process for fabricating an LCPH, in accordance with implementations of the disclosure.
  • a photoalignment material (PAM) 1712 is formed on a glass layer 1710 using a spin-coating technique to form optical structure 1700.
  • a liquid crystal monomer layer 1714 is formed on PAM layer 1712 using a spin-coating technique.
  • optical structure 1700 is illuminated by ultraviolet (UV) light to photo-polymerize liquid crystal monomer layer 1714 according to the particular configuration of the LCPH under fabrication.
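For reference, the three-step flow of FIGs. 17A-17C can be written as a simple recipe. Only the step sequence comes from the disclosure; the spin speeds, times, and UV dose below are placeholder values, not stated parameters.

```python
# Hedged sketch of the LCPH fabrication flow of FIGs. 17A-17C.
# All numeric parameters are invented placeholders.
process = [
    {"step": "spin-coat photoalignment material (PAM) on glass",
     "spin_rpm": 3000, "time_s": 30},                      # FIG. 17A
    {"step": "spin-coat liquid crystal monomer on the PAM layer",
     "spin_rpm": 2000, "time_s": 30},                      # FIG. 17B
    {"step": "UV exposure to photo-polymerize the LC monomer",
     "uv_dose_mj_cm2": 500},                               # FIG. 17C
]
for i, s in enumerate(process, 1):
    print(f"{i}. {s['step']}")
```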
  • Embodiments of the invention may include or be implemented in conjunction with an artificial reality system.
  • Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof.
  • Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content.
  • the artificial reality content may include video, audio, haptic feedback, or some combination thereof, and any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer).
  • artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality.
  • the artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
  • processing logic in this disclosure may include one or more processors, microprocessors, multi-core processors, application-specific integrated circuits (ASICs), and/or field-programmable gate arrays (FPGAs) to execute operations disclosed herein.
  • memories are integrated into the processing logic to store instructions to execute operations and/or store data.
  • Processing logic may also include analog or digital circuitry to perform the operations in accordance with embodiments of the disclosure.
  • a “memory” or “memories” described in this disclosure may include one or more volatile or non-volatile memory architectures.
  • the “memory” or “memories” may be removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data.
  • Example memory technologies may include RAM, ROM, EEPROM, flash memory, CD-ROM, digital versatile disks (DVD), high-definition multimedia/data storage disks, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device.
  • Networks may include any network or network system such as, but not limited to, the following: a peer-to-peer network; a Local Area Network (LAN); a Wide Area Network (WAN); a public network, such as the Internet; a private network; a cellular network; a wireless network; a wired network; a wireless and wired combination network; and a satellite network.
  • Communication channels may include or be routed through one or more wired or wireless communication channels utilizing IEEE 802.11 protocols, BlueTooth, SPI (Serial Peripheral Interface), I²C (Inter-Integrated Circuit), USB (Universal Serial Bus), CAN (Controller Area Network), cellular data protocols (e.g. 3G, 4G, LTE, 5G), optical communication networks, Internet Service Providers (ISPs), a peer-to-peer network, a Local Area Network (LAN), a Wide Area Network (WAN), a public network (e.g. “the Internet”), a private network, a satellite network, or otherwise.
  • a computing device may include a desktop computer, a laptop computer, a tablet, a phablet, a smartphone, a feature phone, a server computer, or otherwise.
  • a server computer may be located remotely in a data center or located locally.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Power Engineering (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

Imaging systems, cameras, and image sensors of this disclosure include imaging pixels that include subpixels. Diffractive optical elements such as a metasurface lens layer or a liquid crystal polarization hologram (LCPH) are configured to focus image light to the subpixels of the imaging pixels.

Description

SENSING WITH LIQUID CRYSTAL POLARIZATION HOLOGRAMS
AND METASURFACE
TECHNICAL FIELD
[0001] This disclosure relates generally to optics, and in particular to sensing applications.
BACKGROUND INFORMATION
[0002] Optical components in devices include refractive lenses, diffractive lenses, color filters, neutral density filters, and polarizers. In imaging applications such as cameras, refractive lenses and microlenses are used to focus image light to a sensor.
SUMMARY OF THE DISCLOSURE
[0003] In accordance with a first aspect of the present disclosure, there is provided an image sensor comprising: imaging pixels including a first subpixel configured to sense image light and a second subpixel configured to sense the image light; and a patterned liquid crystal polarization hologram (LCPH) layer having microlens regions disposed over the imaging pixels, wherein the microlens regions are configured to focus the image light to the first subpixel and the second subpixel of the imaging pixels.
[0004] In some embodiments, liquid crystals in the patterned LCPH layer may be doped to: pass a first wavelength band of the image light to the first subpixel and not the second subpixel; and pass a second wavelength band of the image light to the second subpixel and not the first subpixel.
[0005] In some embodiments, the microlens regions may have a one-to-one correspondence with subpixels of the imaging pixels.
[0006] In some embodiments, the microlens regions may have a one-to-one correspondence with the imaging pixels.
[0007] In some embodiments, the microlens regions may be rectangular.
[0008] In some embodiments, the image sensor may further comprise: a wavelength filtering layer disposed between a semiconductor layer of the imaging pixels and the patterned LCPH layer.
[0009] In some embodiments, the patterned LCPH layer may include a liquid crystal Pancharatnam-Berry Phase (LC-PBP) design.
[0010] In some embodiments, the patterned LCPH layer may include a polarized volume hologram (PVH) design.
[0011] In some embodiments, the microlens regions of the LCPH layer may have a longest dimension of less than four microns. In some embodiments, the microlens regions of the LCPH layer may each have a longest dimension of four subpixels of the imaging pixels.
[0012] In accordance with a further aspect of the present disclosure, there is provided an image sensor comprising: imaging pixels including a first subpixel configured to sense image light and a second subpixel configured to sense the image light; and a metasurface lens layer having microlens regions disposed over the imaging pixels, wherein the microlens regions are configured to focus the image light to the first subpixel and the second subpixel of the imaging pixels.
[0013] In some embodiments, nanostructures in the metasurface lens layer may be configured to: pass a first polarization orientation to the imaging pixels and reject a second polarization orientation from becoming incident on the imaging pixels, the first polarization orientation different from the second polarization orientation.
[0014] In some embodiments, the nanostructures in the metasurface lens layer may be configured to: pass a first wavelength band of the image light to the first subpixel and not the second subpixel; and pass a second wavelength band of the image light to the second subpixel and not the first subpixel.
[0015] In some embodiments, the metasurface lens layer may include non-symmetric nanostructures.
[0016] In some embodiments, the metasurface lens layer may be polarization-dependent.
[0017] In some embodiments, nanostructures of the metasurface lens layer may be formed of at least one of silicon, silicon-nitride, or titanium-oxide.
[0018] In some embodiments, the microlens regions may have a one-to-one correspondence with subpixels of the imaging pixels.
[0019] In some embodiments, the microlens regions may have a one-to-one correspondence with the imaging pixels.
[0020] In some embodiments, the microlens regions may be rectangular.
[0021] In accordance with a further aspect of the present disclosure, there is provided a camera comprising: an image sensor including a plurality of imaging pixels configured to sense image light; and a lens assembly having a liquid crystal Pancharatnam-Berry Phase (LC-PBP) lens configured to focus the image light to the imaging pixels of the image sensor, wherein the lens assembly is without a refractive lens formed of glass or plastic.
[0022] In some embodiments, the camera may further comprise: a circular polarizer layer, wherein the LC-PBP lens is disposed between the circular polarizer layer and the imaging pixels.
[0023] It will be appreciated that any features described herein as being suitable for incorporation into one or more aspects or embodiments of the present disclosure are intended to be generalizable across any and all aspects and embodiments of the present disclosure. Other aspects of the present disclosure can be understood by those skilled in the art in light of the description, the claims, and the drawings of the present disclosure. The foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
[0025] FIG. 1 illustrates a sensing system that includes a patterned liquid crystal polarization holograms (LCPH) layer having microlens regions configured to focus image light to different subpixels of an image sensor, in accordance with aspects of the disclosure.
[0026] FIGs. 2A-2B illustrate an example imaging pixel that includes subpixels, in accordance with aspects of the disclosure.
[0027] FIG. 3 illustrates a patterned LCPH layer arranged with microlens regions to be disposed over subpixels, in accordance with aspects of the disclosure.
[0028] FIG. 4 illustrates a patterned LCPH layer including microlens regions having a one-to-one correspondence with subpixels of the imaging pixels, in accordance with aspects of the disclosure.
[0029] FIG. 5 illustrates an example imaging system including an image pixel array, in accordance with aspects of the disclosure.
[0030] FIG. 6 illustrates a sensing system that includes a metasurface lens layer having microlens regions configured to focus image light to different subpixels of an image sensor, in accordance with aspects of the disclosure.
[0031] FIGs. 7A-7B illustrate an example imaging pixel that includes subpixels, in accordance with aspects of the disclosure.
[0032] FIG. 8 illustrates a metasurface lens layer arranged with microlens regions to be disposed over subpixels, in accordance with aspects of the disclosure.
[0033] FIG. 9 illustrates a metasurface lens layer including microlens regions having a one-to-one correspondence with subpixels of the imaging pixels, in accordance with aspects of the disclosure.
[0034] FIG. 10 illustrates an imaging system that may include a metasurface lens layer, in accordance with aspects of the disclosure.
[0035] FIGs. 11A-14B illustrate various nanostructures that may be included in microlens regions of a metasurface lens layer, in accordance with aspects of the disclosure.
[0036] FIGs. 15A-15B illustrate an example imaging pixel having a multi-functional microlens layer, in accordance with aspects of the disclosure.
[0037] FIG. 16 illustrates a camera system that includes a lens assembly having a diffractive focusing element, in accordance with aspects of the disclosure.
[0038] FIGs. 17A-17C illustrate an example process for fabricating an LCPH, in accordance with aspects of the disclosure.
DETAILED DESCRIPTION
[0039] Embodiments of liquid crystal polarization holograms (LCPH) and metasurfaces in imaging contexts are described herein. In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
[0040] Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0041] In aspects of this disclosure, visible light may be defined as having a wavelength range of approximately 380 nm - 700 nm. Non-visible light may be defined as light having wavelengths that are outside the visible light range, such as ultraviolet light and infrared light. Infrared light having a wavelength range of approximately 700 nm - 1 mm includes near-infrared light. In aspects of this disclosure, near-infrared light may be defined as having a wavelength range of approximately 700 nm - 1.6 μm. In aspects of this disclosure, the term “transparent” may be defined as having greater than 90% transmission of light. In some aspects, the term “transparent” may be defined as a material having greater than 90% transmission of visible light.
[0042] Conventional image sensors use polarizers, wavelength filters, and refractive microlenses to filter and focus image light to imaging pixels and subpixels of a sensor. However, these conventional components add bulk to sensors and may also cause resolution issues and reduce the signal-to-noise ratio (SNR) of the sensor. Implementations of the disclosure include liquid crystal polarization holograms (LCPH) and metasurfaces configured with the functionality of one or more of polarizers, wavelength filters, and/or microlenses to replace conventional optical components in sensing applications (e.g. an image sensor). These and other embodiments are described in more detail in connection with FIGs. 1-17.
[0043] FIG. 1 illustrates an imaging system 100 that includes a patterned LCPH layer 141 having microlens regions configured to focus image light to different subpixels of an image pixel array 102, in accordance with implementations of the disclosure. Imaging system 100 includes a focusing element 115 having a focal length 116, and an image pixel array 102. Image pixel array 102 may be implemented as a complementary metal-oxide semiconductor (CMOS) image sensor, in some implementations. Focusing element 115 focuses image light scattered/reflected from object 110 to image pixel array 102. Image pixel array 102 includes a plurality of imaging pixels such as imaging pixel 151. Example imaging pixel 151 includes a first subpixel 111A and a second subpixel 111B. A filter layer 137 may include different wavelength filters (e.g. red, green, blue, infrared, near-infrared, ultraviolet) to filter the wavelength of image light that propagates to a semiconductor layer 145 (e.g. silicon) of imaging pixel 151. A spacer layer 135 may be disposed between patterned LCPH layer 141 and filter layer 137. Spacer layer 135 may include Deep Trench Isolation (DTI) or Buried Shielding Metal (BSM), for example.
[0044] Patterned LCPH layer 141 includes microlens regions for focusing the image light to subpixels 111A and 111B. A circular polarizer 118 may be disposed between focusing element 115 and image pixel array 102 to provide circularly polarized image light to patterned LCPH layer 141 that covers all or a portion of image pixel array 102. Although not particularly illustrated, a patterned LCPH layer 141 may be disposed over image pixel array 102 where the patterned LCPH layer 141 includes various microlens regions that are disposed over each imaging pixel with a one-to-one correspondence. In an implementation, patterned LCPH layer 141 includes microlens regions having a one-to-one correspondence with subpixels (e.g. 111A and 111B) of the imaging pixels.
[0045] FIG. 2A illustrates a side view of an example imaging pixel 252 that includes subpixels 252A and 252B, in accordance with implementations of the disclosure. FIG. 2A illustrates image light 290 encountering patterned LCPH layer 241. Patterned LCPH layer 241 is disposed over the imaging pixels (e.g. imaging pixel 252) of an image pixel array (e.g. image pixel array 102). Example imaging pixel 252 includes subpixel 252A and subpixel 252B in FIG. 2A. FIG. 2B illustrates a perspective view of example imaging pixel 252 including four subpixels 252A, 252B, 252C, and 252D.
[0046] FIG. 2A shows subpixel 252A as a monochrome subpixel that measures/senses the intensity of visible light while subpixel 252B is illustrated as an infrared subpixel. Subpixel 252B may be configured to measure/sense a particular band of infrared light (e.g. 800 nm light) or to measure/sense broad-spectrum infrared light having a wavelength longer than visible red light. Subpixel 252A includes a filter 238 in filter layer 237 that may be configured to pass visible light (while rejecting/blocking non-visible light) to a sensing region 211A in a semiconductor layer 245 of imaging pixel 252. Filter 238 may be an infrared stop filter that blocks/rejects infrared light while passing visible light. Subpixel 252B includes a filter 239 that may be configured to pass broad-spectrum infrared light (e.g. approximately 700 nm - 3000 nm) or narrow-band infrared light (e.g. 795 nm to 805 nm) to sensing region 211B in semiconductor layer 245 of imaging pixel 252. Sensing region 211A of subpixel 252A and sensing region 211B of subpixel 252B may be implemented in doped silicon included in semiconductor layer 245. Deep Trench Isolation (DTI) and metal layers may separate the sensing regions of different subpixels. Each subpixel of imaging pixel 252 may be 1-2 microns.
[0047] FIG. 2A shows that layer 235 may be disposed between LCPH layer 241 and filter layer 237. Layer 235 may include DTI and/or back side metal (BSM). Optional oxide layer 243 is shown disposed between semiconductor layer 245 and filter layer 237. Oxide layer 243 may include silicon dioxide. Oxide layer 243 may include BSM. The BSM may include tungsten and one or more layers of titanium-nitride. Wavelength filtering layer 237 may include example filters 238 and 239 in addition to other wavelength filters disposed above subpixels 252C and 252D. Layer 237 is disposed between semiconductor layer 245 and patterned LCPH layer 241. Those skilled in the art appreciate that layers 235, 237, 243, and 245 may extend to many imaging pixels in an image pixel array even though FIG. 2A only illustrates a single imaging pixel 252.
[0048] FIG. 2B illustrates that example imaging pixel 252 includes four subpixels where two of the subpixels (252A and 252C) may be mono subpixels configured to measure/sense an intensity of image light 290 and one subpixel 252B may be configured to measure/sense infrared light. Other wavelength filters and different filter patterns may also be used in imaging pixels, in accordance with implementations of the disclosure. Subpixel 252D may include a red, green, or blue filter that passes red, green, or blue light to a sensing region of subpixel 252D. In some implementations, imaging pixels include a red-green-green-blue (RGGB) filter pattern, for example. Each subpixel of imaging pixel 252 may be 1-2 microns.
[0049] In FIG. 2B, patterned LCPH layer 241 includes microlens regions configured to focus image light 290 to subpixels 252A, 252B, 252C, and 252D. In an implementation, patterned LCPH layer 241 is implemented with a liquid crystal Pancharatnam-Berry Phase (LC-PBP) design. In an implementation, patterned LCPH layer 241 includes a polarized volume hologram (PVH) design. In an implementation, the liquid crystals in patterned LCPH layer 241 are doped with color dye to provide a filter for specific wavelength bands or to block specific wavelengths. Certain microlens regions of patterned LCPH layer 241 may be doped to pass a particular wavelength band of image light 290 while other microlens regions of patterned LCPH layer 241 may be doped to pass other (different) wavelength band(s) of image light 290, for example.
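For context, a textbook paraxial relation for LC-PBP microlenses (stated here only for illustration; this equation is not recited in the disclosure) ties the imparted geometric phase to twice the local liquid crystal director orientation θ(r). A microlens region of focal length f for wavelength λ therefore uses a director pattern of approximately

\[
\phi(r) \;=\; \pm 2\,\theta(r), \qquad \theta(r) \;\approx\; \frac{\pi r^{2}}{2\,\lambda f} \pmod{\pi},
\]

so one circular handedness sees the converging phase \(-\pi r^{2}/(\lambda f)\) while the opposite handedness sees the diverging conjugate, which is why circular polarizer 118 is placed before the patterned LCPH layer.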
[0050] FIG. 3 illustrates a patterned LCPH layer 301 arranged with microlens regions to be disposed over subpixels, in accordance with aspects of the disclosure. In FIG. 3, microlens region 341 is illustrated as a round or oval microlens region configured to focus image light to four subpixels 01, 02, 05, and 06 of imaging pixel 352. Hence, microlens region 341 has a one-to-one correspondence with imaging pixel 352. In other words, patterned LCPH layer 301 is patterned to have a microlens region to focus image light (e.g. image light 290) for each imaging pixel. Subpixel 01 is configured to measure/sense infrared light, subpixel 02 is configured to measure/sense the intensity of visible light, subpixel 05 is configured to measure/sense the intensity of visible light, and subpixel 06 is configured to measure/sense red, green, or blue light. Of course, the different subpixels may be configured to sense different wavelengths of light by changing the filters in the filter layer of the subpixel.
[0051] Microlens region 342 is disposed over imaging pixel 353. Microlens region 342 is illustrated as a round or oval microlens region configured to focus image light to four subpixels 03, 04, 07, and 08 of imaging pixel 353. Hence, microlens region 342 has a one-to-one correspondence with imaging pixel 353. Subpixel 03 is configured to measure/sense infrared light, subpixel 04 is configured to measure/sense the intensity of visible light, subpixel 07 is configured to measure/sense the intensity of visible light, and subpixel 08 is configured to measure/sense red, green, or blue light. While not particularly illustrated, an oval, circular, or rectangular microlens region of patterned LCPH layer 301 may be disposed over subpixels 09, 10, 13, and 14 of imaging pixel 354. And, an oval, circular, or rectangular microlens region of patterned LCPH layer 301 may be disposed over subpixels 11, 12, 15, and 16 of imaging pixel 355.
[0052] Patterned LCPH layer 301 may be a contiguous layer. A contiguous patterned LCPH layer 301 may cover an entire image pixel array where the image pixel array includes thousands or millions of imaging pixels.
[0053] FIG. 4 illustrates a patterned LCPH layer 401 including microlens regions having a one-to-one correspondence with subpixels of the imaging pixels, in accordance with aspects of the disclosure. In FIG. 4, microlens region 441 is illustrated as a round or oval microlens region configured to focus image light to one subpixel (subpixel 01). Microlens region 442 is configured to focus image light to subpixel 02, microlens region 443 is configured to focus image light to subpixel 05, and microlens region 444 is configured to focus image light to subpixel 06. Hence, microlens regions of patterned LCPH layer 401 have a one-to-one correspondence with subpixels of imaging pixel 452 that includes subpixels 01, 02, 05, and 06. Similarly, microlens regions 445, 446, 447, and 448 of patterned LCPH layer 401 have a one-to-one correspondence to subpixels 09, 10, 13, and 14, respectively, of imaging pixel 454. In other words, patterned LCPH layer 401 is patterned to have a microlens region to focus image light (e.g. image light 290) for each subpixel in FIG. 4.
[0054] Still referring to FIG. 4, subpixel 01 is configured to measure/sense infrared light, subpixel 02 is configured to measure/sense the intensity of visible light, subpixel 05 is configured to measure/sense the intensity of visible light, and subpixel 06 is configured to measure/sense red, green, or blue light. Of course, the different subpixels may be configured to sense different wavelengths of light by changing the filters in the filter layer of the subpixel. While not particularly illustrated, an oval, circular, or rectangular microlens region of patterned LCPH layer 401 may be disposed (individually) over subpixels 03, 04, 07, and 08 of imaging pixel 453. And, an oval, circular, or rectangular microlens region of patterned LCPH layer 401 may be disposed (individually) over subpixels 11, 12, 15, and 16 of imaging pixel 455.
[0055] In some implementations, the microlens regions of patterned LCPH layer 401 are configured to pass different wavelengths of light. For example, microlens region 441 may be doped with a color dye that functions as a filter, and microlens region 442 may be doped with a different color dye that filters a different wavelength band. A first microlens region may be configured to pass a first wavelength band of image light to a first subpixel and not a second subpixel, and a second microlens region may be configured to pass a second (different) wavelength band of image light to the second subpixel and not the first subpixel.
[0056] Patterned LCPH layer 401 may be a contiguous layer. A contiguous patterned LCPH layer 401 may cover an entire image pixel array where the image pixel array includes thousands or millions of imaging pixels. Patterned LCPH layer 301 and/or patterned LCPH layer 401 may be considered “patterned” because the microlens regions are arranged, sized, and/or configured to focus image light onto subpixels or imaging pixels of an image pixel array.
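As a minimal sketch of the two correspondence modes described above (an illustration under assumed values, not an implementation recited in the disclosure; the function name and the 3-micron pixel pitch are hypothetical), the centers of the microlens regions can be laid out per imaging pixel (FIG. 3) or per subpixel (FIG. 4), assuming square imaging pixels each split into a 2x2 grid of subpixels:

    def microlens_centers(rows, cols, pixel_pitch_um=3.0, per_subpixel=False):
        """Centers (x, y) in microns for microlens regions over a pixel array.

        per_subpixel=False gives one region per imaging pixel (FIG. 3);
        per_subpixel=True gives one region per subpixel, assuming each
        imaging pixel is split into a 2x2 grid of subpixels (FIG. 4).
        """
        pitch = pixel_pitch_um / 2.0 if per_subpixel else pixel_pitch_um
        n_rows = rows * (2 if per_subpixel else 1)
        n_cols = cols * (2 if per_subpixel else 1)
        return [((c + 0.5) * pitch, (r + 0.5) * pitch)
                for r in range(n_rows) for c in range(n_cols)]

For example, microlens_centers(1, 1, per_subpixel=True) returns four centers, one for each 1.5-micron subpixel of a single imaging pixel.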
[0057] FIG. 5 illustrates an imaging system 500 including an image pixel array 502, in accordance with aspects of the disclosure. All or portions of imaging system 500 may be included in an image sensor, in some implementations. Imaging system 500 includes control logic 508, processing logic 512, and image pixel array 502. Image pixel array 502 may be arranged in rows and columns where integer y is the number of rows and integer x is the number of columns. Image pixel array 502 may have a total of n pixels (P) where integer n is the product of integer x and integer y. In some implementations, n is over one million imaging pixels. Each imaging pixel may include subpixels described in the disclosure.
[0058] In operation, control logic 508 drives image pixel array 502 to capture an image. Image pixel array 502 may be configured to have a global shutter or a rolling shutter, for example. Each subpixel may be configured in a 3-transistor (3T) or 4-transistor (4T) readout circuit configuration. Processing logic 512 is configured to receive the imaging signals from each subpixel. Processing logic 512 may perform further operations such as subtracting or adding some imaging signals from other imaging signals to generate image 515. Aspects of a patterned LCPH layer in accordance with FIGs. 1-4 of this disclosure may be disposed over image pixel array 502 in imaging system 500. In an implementation, patterned LCPH layer 301 is disposed over image pixel array 502 in imaging system 500. In an implementation, patterned LCPH layer 401 is disposed over image pixel array 502 in imaging system 500.
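The disclosure does not specify the signal arithmetic performed by processing logic 512, but a minimal sketch of the kind of per-channel combination it might perform is below; the function name, the averaging of the two mono subpixels, and the infrared-leakage subtraction are illustrative assumptions:

    import numpy as np

    def combine_subpixels(mono_a, mono_c, ir, ir_weight=0.1):
        """Illustrative combination of subpixel readouts into one image.

        Each input is a (rows, cols) array of digitized imaging signals,
        one per subpixel channel (e.g. read out via 3T or 4T circuits).
        """
        # Average the two monochrome subpixels to reduce read noise.
        mono = 0.5 * (mono_a.astype(np.float32) + mono_c.astype(np.float32))
        # "Subtracting some imaging signals from other imaging signals":
        # here, removing a scaled estimate of ambient infrared leakage.
        return np.clip(mono - ir_weight * ir.astype(np.float32), 0.0, None)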
[0059] FIG. 6 illustrates an imaging system 600 that includes a metasurface lens layer 641 having microlens regions configured to focus image light to different subpixels of an image pixel array 602, in accordance with implementations of the disclosure. Imaging system 600 includes a focusing element 115 having a focal length 116, and an image pixel array 602. Image pixel array 602 may be implemented as a complementary metal-oxide semiconductor (CMOS) image sensor, in some implementations. Focusing element 115 focuses image light scattered/reflected from object 110 to image pixel array 602. Image pixel array 602 includes a plurality of imaging pixels such as imaging pixel 651. Imaging pixel 651 includes a first subpixel 611A and a second subpixel 611B. A filter layer 137 may include different wavelength filters (e.g. red, green, blue, infrared, near-infrared) to filter the wavelength of image light that propagates to a semiconductor layer 645 (e.g. silicon) of imaging pixel 651. A spacer layer 135 may be disposed between metasurface lens layer 641 and filter layer 137. Spacer layer 135 may include Deep Trench Isolation (DTI) or Buried Shielding Metal (BSM), for example.
[0060] Metasurface lens layer 641 includes microlens regions for focusing the image light to subpixels 611A and 611B. A circular polarizer 118 may be disposed between focusing element 115 and image pixel array 602 to provide circularly polarized image light to metasurface lens layer 641. Although not particularly illustrated, a metasurface lens layer 641 may be disposed over image pixel array 602 where the metasurface lens layer 641 includes various microlens regions that are disposed over each imaging pixel with a one-to-one correspondence. In an implementation, metasurface lens layer 641 includes microlens regions having a one-to-one correspondence with subpixels (e.g. 611A and 611B) of the imaging pixels.
[0061] FIG. 7A illustrates a side view of an example imaging pixel 752 that includes subpixels 752A and 752B, in accordance with implementations of the disclosure. FIG. 7A illustrates image light 290 encountering metasurface lens layer 741. Metasurface lens layer 741 is disposed over the imaging pixels (e.g. imaging pixel 752) of an image pixel array (e.g. image pixel array 602). Example imaging pixel 752 includes subpixel 752A and subpixel 752B in FIG. 7A. FIG. 7B illustrates a perspective view of example imaging pixel 752 including four subpixels 752A, 752B, 752C, and 752D.
[0062] FIG. 7A shows subpixel 752A as a monochrome subpixel that measures/senses the intensity of visible light while subpixel 752B is illustrated as an infrared subpixel. Subpixel 752B may be configured to measure/sense a particular band of infrared light (e.g. 800 nm light) or to measure/sense broad-spectrum infrared light having a wavelength longer than visible red light. Subpixel 752A includes a filter 238 in filter layer 237 that may be configured to pass visible light (while rejecting/blocking non-visible light) to a sensing region 711A in a semiconductor layer 745 of imaging pixel 752. Subpixel 752B includes a filter 239 that may be configured to pass broad-spectrum infrared light (e.g. approximately 700 nm - 3000 nm) or narrow-band infrared light (e.g. 795 nm to 805 nm) to sensing region 711B in semiconductor layer 745 of imaging pixel 752. Sensing region 711A of subpixel 752A and sensing region 711B of subpixel 752B may be implemented in doped silicon included in semiconductor layer 745. Deep Trench Isolation (DTI) and metal layers may separate the sensing regions of different subpixels. Each subpixel of imaging pixel 752 may be 1-2 microns.
[0063] FIG. 7A shows that layer 235 may be disposed between metasurface lens layer 741 and filter layer 237. Layer 235 may include DTI and/or back side metal (BSM).
Optional oxide layer 243 is shown disposed between semiconductor layer 745 and filter layer 237. Wavelength filtering layer 237 may include example filters 238 and 239. Layer 237 is disposed between semiconductor layer 745 and metasurface lens layer 741. Those skilled in the art appreciate that layers 235, 237, 243, and 745 may extend to many imaging pixels in an image pixel array even though FIG. 7A only illustrates a single imaging pixel 752.
[0064] FIG. 7B illustrates that example imaging pixel 752 includes four subpixels where two of the subpixels (752A and 752C) may be mono subpixels configured to measure/sense an intensity of image light 290 and one subpixel 752B may be configured to measure/sense infrared light. Other wavelength filters and different filter patterns may also be used in imaging pixels, in accordance with implementations of the disclosure. Subpixel 752D may include a red, green, or blue filter that passes red, green, or blue light to a sensing region of subpixel 752D. In some implementations, imaging pixels include a red-green-green-blue (RGGB) filter pattern, for example. Each subpixel of imaging pixel 752 may be 1-2 microns. In FIG. 7B, metasurface lens layer 741 includes microlens regions configured to focus image light 290 to subpixels 752A, 752B, 752C, and 752D.
[0065] FIGs. 11A-14B illustrate various nanostructures that may be included in microlens regions of a metasurface lens layer, in accordance with aspects of the disclosure.
The nanostructures of FIGs. 11A-14B may also be referred to as meta-units of a metasurface lens layer. FIG. 11A illustrates a top view of a circular nanostructure 1101 and FIG. 11B illustrates a perspective view of nanostructure 1101 having a height H1. FIG. 12A illustrates a top view of an oval nanostructure 1201 and FIG. 12B illustrates a perspective view of nanostructure 1201 having a height H2. FIG. 13A illustrates a top view of a rectangular nanostructure 1301 and FIG. 13B illustrates a perspective view of nanostructure 1301 having a height H3. FIG. 14A illustrates a top view of a square nanostructure 1401 and FIG. 14B illustrates a perspective view of nanostructure 1401 having a height H4. Other shapes of nanostructures, including non-symmetric nanostructures, may also be used, in accordance with aspects of the disclosure.
[0066] The microlens regions of the metasurface lens layers may include two-dimensional or one-dimensional nanostructures. The height of the nanostructures may be between 50 nm and approximately two microns, for example. The width or length of the nanostructures may be between 10 nm and approximately 500 nm, in some implementations. To form the metasurface lens layer, the nanostructures may be formed out of a planar substrate (e.g. silicon, silicon-nitride, or titanium-oxide) using a subtractive process (e.g. photolithography and/or etching). Therefore, the resulting metasurface lens layer having microlens regions may be considered a flat optical component with only minor thickness variations (the height/thickness of the various nanostructures of the metasurface). The metasurface design of each microlens region of the metasurface lens layer may be configured to focus image light to subpixels of an image pixel array.
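One common design flow for such a microlens region is sketched below; the hyperbolic target phase is the standard metalens textbook profile, and the width-to-phase library is hypothetical (neither is recited in the disclosure, which does not specify a design method):

    import numpy as np

    def target_phase(x_um, y_um, focal_um, wavelength_um):
        """Standard hyperbolic metalens target phase (radians)."""
        r2 = x_um ** 2 + y_um ** 2
        return (-2.0 * np.pi / wavelength_um
                * (np.sqrt(r2 + focal_um ** 2) - focal_um))

    def pick_meta_unit(phase_rad, library):
        """Pick the nanostructure width whose phase delay best matches.

        `library` maps nanostructure width (nm) to simulated phase delay
        (radians); in practice these values come from electromagnetic
        solver sweeps (e.g. FDTD or RCWA). This table is hypothetical.
        """
        phase_rad = np.mod(phase_rad, 2.0 * np.pi)
        widths = np.array(sorted(library))
        delays = np.mod([library[w] for w in widths], 2.0 * np.pi)
        # A full design would also handle the 2*pi wrap-around here.
        return widths[int(np.argmin(np.abs(delays - phase_rad)))]

Each meta-unit site within a microlens region would then receive the nanostructure returned for the phase sampled at its (x, y) location.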
[0067] Furthermore, in some implementations, the metasurface design of the microlens regions may be further configured to pass/transmit or block/reject particular wavelengths and/or polarization orientations of image light 290. In an implementation, wavelength filtering features of a metasurface are formed using metasurface-based subtractive color filter fabrication techniques that may include forming the metasurface on a glass wafer using CMOS processing techniques.
[0068] FIG. 8 illustrates a metasurface lens layer 801 arranged with microlens regions to be disposed over subpixels, in accordance with aspects of the disclosure. In FIG. 8, microlens region 841 is illustrated as a round or oval microlens region configured to focus image light to four subpixels 01, 02, 05, and 06 of imaging pixel 852. Hence, microlens region 841 has a one-to-one correspondence with imaging pixel 852. Subpixel 01 is configured to measure/sense infrared light, subpixel 02 is configured to measure/sense the intensity of visible light, subpixel 05 is configured to measure/sense the intensity of visible light, and subpixel 06 is configured to measure/sense red, green, or blue light. Of course, the different subpixels may be configured to sense different wavelengths of light by changing the filters in the filter layer of the subpixel.
[0069] Microlens region 842 is disposed over imaging pixel 853. Microlens region 842 is illustrated as a round or oval microlens region configured to focus image light to four subpixels 03, 04, 07, and 08 of imaging pixel 853. Hence, microlens region 842 has a one-to-one correspondence with imaging pixel 853. Subpixel 03 is configured to measure/sense infrared light, subpixel 04 is configured to measure/sense the intensity of visible light, subpixel 07 is configured to measure/sense the intensity of visible light, and subpixel 08 is configured to measure/sense red, green, or blue light. While not particularly illustrated, an oval, circular, or rectangular microlens region of metasurface lens layer 801 may be disposed over subpixels 09, 10, 13, and 14 of imaging pixel 854. And, an oval, circular, or rectangular microlens region of metasurface lens layer 801 may be disposed over subpixels 11, 12, 15, and 16 of imaging pixel 855.
[0070] Metasurface lens layer 801 may be a contiguous layer. A contiguous metasurface lens layer 801 may cover an entire image pixel array where the image pixel array includes thousands or millions of imaging pixels.
[0071] FIG. 9 illustrates a metasurface lens layer 901 including microlens regions having a one-to-one correspondence with subpixels of the imaging pixels, in accordance with aspects of the disclosure. In FIG. 9, microlens region 941 is illustrated as a round or oval microlens region configured to focus image light to one subpixel (subpixel 01). Microlens region 942 is configured to focus image light to subpixel 02, microlens region 943 is configured to focus image light to subpixel 05, and microlens region 944 is configured to focus image light to subpixel 06. Hence, microlens regions of metasurface lens layer 901 have a one-to-one correspondence with subpixels of imaging pixel 952 that includes subpixels 01, 02, 05, and 06. Similarly, microlens regions 945, 946, 947, and 948 of metasurface lens layer 901 have a one-to-one correspondence to subpixels 09, 10, 13, and 14, respectively, of imaging pixel 954. In other words, metasurface lens layer 901 is configured to have a microlens region to focus image light (e.g. image light 290) for each subpixel in FIG. 9.
[0072] Still referring to FIG. 9, subpixel 01 is configured to measure/sense infrared light, subpixel 02 is configured to measure/sense the intensity of visible light, subpixel 05 is configured to measure/sense the intensity of visible light, and subpixel 06 is configured to measure/sense red, green, or blue light. Of course, the different subpixels may be configured to sense different wavelengths of light by changing the filters in the filter layer of the subpixel. While not particularly illustrated, an oval, circular, or rectangular microlens region of metasurface lens layer 901 may be disposed (individually) over subpixels 03, 04, 07, and 08 of imaging pixel 953. And, an oval, circular, or rectangular microlens region of metasurface lens layer 901 may be disposed (individually) over subpixels 11, 12, 15, and 16 of imaging pixel 955.
[0073] Metasurface lens layer 901 may be a contiguous layer. A contiguous metasurface lens layer 901 may cover an entire image pixel array where the image pixel array includes thousands or millions of imaging pixels.
[0074] In an implementation, the metasurface lens layers of the disclosure are polarization-dependent. In one implementation, nanostructures in the metasurface lens layers of the disclosure are configured to pass/transmit a first polarization orientation to the imaging pixels and reject/block a second (different) polarization orientation from becoming incident on the imaging pixels. In one implementation, the first polarization orientation is orthogonal to the second polarization orientation. In an implementation, nanostructures in the metasurface lens layer are configured to pass right-hand circularly polarized light and block left-hand circularly polarized light. In an implementation, nanostructures in the metasurface lens layer are configured to pass left-hand circularly polarized light and block right-hand circularly polarized light.
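Standard Jones calculus (provided for context and not part of the disclosure) makes this handedness selectivity explicit: a geometric-phase meta-unit acting as a half-wave retarder with local fast-axis orientation θ flips the circular handedness while imparting a phase of ±2θ,

\[
J(\theta) \;=\; R(\theta)\begin{pmatrix}1 & 0\\ 0 & -1\end{pmatrix}R(-\theta),
\qquad
J(\theta)\begin{pmatrix}1\\ \pm i\end{pmatrix} \;=\; e^{\pm i 2\theta}\begin{pmatrix}1\\ \mp i\end{pmatrix},
\]

so only the handedness selected by the design (and by circular polarizer 118, when present) accumulates the intended focusing phase across the microlens region.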
[0075] In an implementation, nanostructures in the metasurface lens layer are configured to: (1) pass a first wavelength band of image light to a first subpixel and not a second subpixel; and (2) pass a second (different) wavelength band of the image light to the second subpixel and not the first subpixel.
[0076] FIG. 10 illustrates an imaging system 1000 that may include a metasurface lens layer, in accordance with aspects of the disclosure. All or portions of imaging system 1000 may be included in an image sensor, in some implementations. Imaging system 1000 includes control logic 1008, processing logic 1012, and image pixel array 1002. Image pixel array 1002 may be arranged in rows and columns where integer y is the number of rows and integer x is the number of columns. Image pixel array 1002 may have a total of n pixels (P) where integer n is the product of integer x and integer y. In some implementations, n is over one million imaging pixels. Each imaging pixel may include subpixels described in the disclosure.
[0077] In operation, control logic 1008 drives image pixel array 1002 to capture an image. Image pixel array 1002 may be configured to have a global shutter or a rolling shutter, for example. Each subpixel may be configured in a 3-transistor (3T) or 4-transistor (4T) readout circuit configuration. Processing logic 1012 is configured to receive the imaging signals from each subpixel. Processing logic 1012 may perform further operations such as subtracting or adding some imaging signals from other imaging signals to generate image 1015. Aspects of a metasurface lens layer in accordance with FIGs. 6-9 of this disclosure may be disposed over image pixel array 1002 in imaging system 1000. In an implementation, metasurface lens layer 801 is disposed over image pixel array 1002 in imaging system 1000. In an implementation, metasurface lens layer 901 is disposed over image pixel array 1002 in imaging system 1000.
[0078] FIG. 15A illustrates a side view of an example imaging pixel 1552 having a multi-functional microlens layer 1541, in accordance with implementations of the disclosure. Multi-functional microlens layer 1541 may function as (1) a microlens and polarizer; (2) a microlens and a wavelength filter; or (3) a combination of microlens, polarizer, and wavelength filter. By combining the functionality of the microlens layer and a filtering layer and/or a polarizing layer, an image sensor may become smaller and less expensive, and the number of fabrication steps in the manufacturing process may be reduced. Multi-functional microlens layer 1541 may be implemented as a patterned LCPH layer. In an implementation, multi-functional microlens layer 1541 is implemented as a patterned LCPH layer designed as an LC-PBP. If the patterned LCPH layer includes wavelength filtering functionality, liquid crystals in the patterned LCPH layer may be doped with color dye to provide a filter for specific wavelength bands or to block specific wavelengths. Multi-functional microlens layer 1541 may be implemented as a metasurface, in accordance with implementations previously described in this disclosure.
[0079] FIG. 15A illustrates image light 290 encountering multi-functional microlens layer 1541. Multi-functional microlens layer 1541 is disposed over the imaging pixels (e.g. imaging pixel 1552) of an image pixel array. Example imaging pixel 1552 includes subpixel 1552A and subpixel 1552B in FIG. 15A. FIG. 15B illustrates a perspective view of example imaging pixel 1552 including four subpixels 1552A, 1552B, 1552C, and 1552D.
[0080] FIG. 15A shows subpixel 1552A as a monochrome subpixel that measures/senses the intensity of visible light while subpixel 1552B is illustrated as an infrared subpixel. The light filtering functionality is provided by a microlens region of multi-functional microlens layer 1541. Subpixel 1552B may be configured to measure/sense a particular band of infrared light (e.g. 800 nm light) or to measure/sense broad-spectrum infrared light having a wavelength longer than visible red light. Again, this light filtering functionality is provided by a microlens region of multi-functional microlens layer 1541. Sensing region 1511A of subpixel 1552A and sensing region 1511B of subpixel 1552B may be implemented in doped silicon included in semiconductor layer 1545. Deep Trench Isolation (DTI) and metal layers may separate the sensing regions of different subpixels.
Each subpixel of imaging pixel 1552 may be 1-2 microns.
[0081] FIG. 15B illustrates that example imaging pixel 1552 includes four subpixels where two of the subpixels (1552A and 1552C) may be mono subpixels configured to measure/sense an intensity of image light 290 and one subpixel 1552B may be configured to measure/sense infrared light. Subpixel 1552D may be configured to measure/sense red, green, or blue light propagating to a sensing region of subpixel 1552D. The filtering functionality for each subpixel may be provided by a microlens region of multi-functional microlens layer 1541 disposed above the particular subpixel. Similarly, polarization filtering for each subpixel may also be provided by a microlens region of multi-functional microlens layer 1541 disposed above the particular subpixel. Each subpixel of imaging pixel 1552 may be 1-2 microns.
[0082] FIG. 16 illustrates a camera system 1600 that includes a lens assembly 1633 having a diffractive focusing element 1615, in accordance with implementations of the disclosure. Diffractive focusing element 1615 may be implemented as an LC-PBP lens or a metasurface lens. Diffractive focusing element 1615 has a focal length 116 to focus image light onto image pixel array 1602. Lens assembly 1633 may include diffractive focusing element 1615 and circular polarizer 118. Circular polarizer 118 may pass a particular handed polarization orientation (e.g. right-hand polarized or left-hand polarized light) to diffractive focusing element 1615. Image pixel array 1602 may be implemented as a CMOS image sensor, in some implementations. Lens assembly 1633 is configured to focus image light scattered/reflected from object 110 to imaging pixels of image pixel array 1602. Image pixel array 1602 includes a plurality of imaging pixels such as imaging pixel 1651. Imaging pixel 1651 includes a first subpixel 1611A and a second subpixel 1611B. A filter layer 137 may include different wavelength filters (e.g. red, green, blue, infrared, near-infrared) to filter the wavelength of image light that propagates to a semiconductor layer 1645 (e.g. silicon) of imaging pixel 1651. A spacer layer 135 may be disposed between microlens layer 1641 and filter layer 137. Spacer layer 135 may include Deep Trench Isolation (DTI) or Buried Shielding Metal (BSM), for example.
[0083] Microlens layer 1641 includes microlens regions for focusing the image light to subpixels 1611A and 1611B. Microlens layer 1641 may be implemented as a patterned LCPH or a metasurface layer having the features described above in this disclosure.
Replacing a conventional refractive lens (formed of glass or plastic, for example) with diffractive focusing element 1615 greatly reduces the height and weight of camera system 1600.
[0084] FIGs. 17A-17C illustrate an example process for fabricating an LCPH, in accordance with implementations of the disclosure. In FIG. 17A, a photoalignment material (PAM) 1712 is formed on a glass layer 1710 using a spincoat technique to form optical structure 1700. In FIG. 17B, a liquid crystal monomer layer 1714 is formed on PAM layer 1712 using a spincoat technique. In FIG. 17C, optical structure 1700 is illuminated by ultraviolet (UV) light to photo-polymerize liquid crystal monomer layer 1714 according to the particular configuration of the LCPH under fabrication.
[0085] Embodiments of the invention may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
[0086] The term “processing logic” in this disclosure may include one or more processors, microprocessors, multi-core processors, Application-specific integrated circuits (ASIC), and/or Field Programmable Gate Arrays (FPGAs) to execute operations disclosed herein. In some embodiments, memories (not illustrated) are integrated into the processing logic to store instructions to execute operations and/or store data. Processing logic may also include analog or digital circuitry to perform the operations in accordance with embodiments of the disclosure.
[0087] A “memory” or “memories” described in this disclosure may include one or more volatile or non-volatile memory architectures. The “memory” or “memories” may be removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Example memory technologies may include RAM, ROM, EEPROM, flash memory, CD-ROM, digital versatile disks (DVD), high-definition multimedia/data storage disks, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device.
[0088] Networks may include any network or network system such as, but not limited to, the following: a peer-to-peer network; a Local Area Network (LAN); a Wide Area Network (WAN); a public network, such as the Internet; a private network; a cellular network; a wireless network; a wired network; a wireless and wired combination network; and a satellite network.
[0089] Communication channels may include or be routed through one or more wired or wireless communication channels utilizing IEEE 802.11 protocols, BlueTooth, SPI (Serial Peripheral Interface), I2C (Inter-Integrated Circuit), USB (Universal Serial Bus), CAN (Controller Area Network), cellular data protocols (e.g. 3G, 4G, LTE, 5G), optical communication networks, Internet Service Providers (ISPs), a peer-to-peer network, a Local Area Network (LAN), a Wide Area Network (WAN), a public network (e.g. “the Internet”), a private network, a satellite network, or otherwise.
[0090] A computing device may include a desktop computer, a laptop computer, a tablet, a phablet, a smartphone, a feature phone, a server computer, or otherwise. A server computer may be located remotely in a data center or be stored locally.
[0091] The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
[0092] These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.

Claims

1. An image sensor comprising: imaging pixels including a first subpixel configured to sense image light and a second subpixel configured to sense the image light; and a patterned liquid crystal polarization hologram (LCPH) layer having microlens regions disposed over the imaging pixels, wherein the microlens regions are configured to focus the image light to the first subpixel and the second subpixel of the imaging pixels.
2. The image sensor of claim 1, wherein liquid crystals in the patterned LCPH layer are doped to: pass a first wavelength band of the image light to the first subpixel and not the second subpixel; and pass a second wavelength band of the image light to the second subpixel and not the first subpixel.
3. The image sensor of claim 1 or claim 2, wherein the microlens regions have a one-to-one correspondence with: i. subpixels of the imaging pixels; or ii. the imaging pixels.
4. The image sensor of claim 1, claim 2 or claim 3, wherein the microlens regions are rectangular.
5. The image sensor of any one of the preceding claims further comprising: a wavelength filtering layer disposed between a semiconductor layer of the imaging pixels and the patterned LCPH layer.
6. The image sensor of any one of the preceding claims, wherein the patterned LCPH layer includes: i. a liquid crystal Pancharatnam-Berry Phase (LC-PBP) design; or ii. a polarized volume hologram (PVH) design.
7. The image sensor of any one of the preceding claims, wherein the microlens regions of the patterned LCPH layer have a longest dimension of less than four microns.
8. An image sensor comprising: imaging pixels including a first subpixel configured to sense image light and a second subpixel configured to sense the image light; and a metasurface lens layer having microlens regions disposed over the imaging pixels, wherein the microlens regions are configured to focus the image light to the first subpixel and the second subpixel of the imaging pixels.
9. The image sensor of claim 8, wherein nanostructures in the metasurface lens layer are configured to: pass a first polarization orientation to the imaging pixels and reject a second polarization orientation from becoming incident on the imaging pixels, the first polarization orientation different from the second polarization orientation; and/or preferably, wherein the nanostructures in the metasurface lens layer are configured to: pass a first wavelength band of the image light to the first subpixel and not the second subpixel; and pass a second wavelength band of the image light to the second subpixel and not the first subpixel.
10. The image sensor of claim 8 or claim 9, wherein the metasurface lens layer: i. includes non-symmetric nanostructures; and/or ii. is polarization-dependent.
11. The image sensor of claim 8, claim 9 or claim 10, wherein nanostructures of the metasurface lens layer are formed of at least one of silicon, silicon-nitride, or titanium-oxide.
12. The image sensor of any one of claims 8 to 11, wherein the microlens regions have a one-to-one correspondence with: i. subpixels of the imaging pixels; or ii. the imaging pixels.
13. The image sensor of any one of claims 8 to 12, wherein the microlens regions are rectangular.
14. A camera comprising: an image sensor including a plurality of imaging pixels configured to sense image light; and a lens assembly having a liquid crystal Pancharatnam-Berry Phase (LC-PBP) lens configured to focus the image light to the imaging pixels of the image sensor, wherein the lens assembly is without a refractive lens formed of glass or plastic.
15. The camera of claim 14 further comprising: a circular polarizer layer, wherein the LC-PBP lens is disposed between the circular polarizer layer and the imaging pixels.


