WO2021016839A1 - Image sensor and manufacturing method thereof, chip, and handheld device - Google Patents

Image sensor and manufacturing method thereof, chip, and handheld device

Info

Publication number
WO2021016839A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
image sensor
layer
semiconductor substrate
grid
Prior art date
Application number
PCT/CN2019/098285
Other languages
English (en)
French (fr)
Inventor
杨孟达
Original Assignee
深圳市汇顶科技股份有限公司 (Shenzhen Goodix Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市汇顶科技股份有限公司 (Shenzhen Goodix Technology Co., Ltd.)
Priority to CN201980001351.5A, published as CN110574166A
Priority to PCT/CN2019/098285, published as WO2021016839A1
Priority to EP19919564.5A, published as EP3799123A4
Priority to US17/027,612, published as US20210036042A1
Publication of WO2021016839A1

Links

Images

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/14625 Optical elements or arrangements associated with the device
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4816 Constructional features, e.g. arrangements of optical elements of receivers alone
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147 Details of sensors, e.g. sensor lenses
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/14603 Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/1462 Coatings
    • H01L27/14621 Colour filter arrangements
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/1462 Coatings
    • H01L27/14623 Optical shielding
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/14625 Optical elements or arrangements associated with the device
    • H01L27/14627 Microlenses
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14643 Photodiode arrays; MOS imagers
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14643 Photodiode arrays; MOS imagers
    • H01L27/14645 Colour imagers
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14683 Processes or apparatus peculiar to the manufacture or treatment of these devices or parts thereof
    • H01L27/14685 Process for coatings or optical elements
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14683 Processes or apparatus peculiar to the manufacture or treatment of these devices or parts thereof
    • H01L27/14689 MOS based technologies
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/1464 Back illuminated imager structures

Definitions

  • the present application relates to an image sensor, a manufacturing method thereof, a chip, and a handheld device using the chip, and more particularly to an image sensor with a polarizing layer, a manufacturing method of the image sensor, an image sensor chip, and a handheld device.
  • CMOS image sensors have been mass-produced and widely applied.
  • Traditional image sensors can generate two-dimensional (2D) images and videos. Recently, image sensors and systems that can generate three-dimensional (3D) images have received widespread attention. These three-dimensional image sensors can be applied to face recognition, augmented reality (AR)/virtual reality (VR), drones, and so on.
  • Existing three-dimensional image sensors are mainly implemented in three ways: binocular stereo, structured light, and time of flight (ToF).
  • Time-of-flight sensors use specially designed pixels to measure distance by timing the flight of photons to the target and back. However, current technology cannot yet generate a depth map of sufficient accuracy. To increase modeling accuracy and reduce cost, simply improving the accuracy of the time-of-flight sensor has become an important work item.
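The distance computation underlying time-of-flight sensing is plain round-trip timing. As an illustrative sketch (the constant and function names below are ours, not from the patent):

```python
# Speed of light in vacuum (m/s).
C = 299_792_458.0

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the target given the measured round-trip time of a
    light pulse: the pulse travels out and back, so d = c * t / 2."""
    return C * round_trip_seconds / 2.0

# A round trip of about 6.67 ns corresponds to a target about 1 m away.
print(tof_distance(6.67e-9))
```

Because one meter of range corresponds to only a few nanoseconds of round-trip time, even small amounts of timing noise (for example from stray reflections) degrade the depth map, which is the problem the polarizing layer addresses.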
  • One of the objectives of this application is to disclose an image sensor, a manufacturing method thereof, a chip, and a handheld device using the chip, to solve the above problems.
  • An embodiment of the present application discloses an image sensor including a semiconductor substrate and a plurality of pixels, wherein each pixel of the plurality of pixels includes: a photosensitive sensor provided on the semiconductor substrate; a polarizing layer provided on the semiconductor substrate; and a microlens arranged on the polarizing layer so that the polarizing layer is located between the microlens and the semiconductor substrate.
  • An embodiment of the present application discloses a method for manufacturing an image sensor, including: providing a semiconductor substrate; forming a polarizing layer on the semiconductor substrate; and forming a microlens on the polarizing layer.
  • An embodiment of the present application discloses a chip including the above image sensor.
  • An embodiment of the present application discloses a handheld device for performing time-of-flight sensing, including: a display panel; and the above image sensor.
  • The embodiments of the present application add a polarizing layer to the image sensor, which can improve the accuracy of the time-of-flight sensor.
  • FIG. 1 is a cross-sectional view of an embodiment of one pixel of the image sensor of this application;
  • FIG. 2 is a top view of the image sensor shown in FIG. 1;
  • FIG. 3 is a cross-sectional view of the image sensor shown in FIG. 2 taken along the cross-sectional line;
  • FIG. 4 is a top view of an embodiment of four pixels of the image sensor of this application.
  • FIGS. 5 to 9 are schematic diagrams of the manufacturing process of the image sensor shown in FIG. 3;
  • FIG. 10 is a cross-sectional view of another embodiment of one pixel of the image sensor of this application;
  • FIG. 11 is a schematic diagram of an embodiment of a handheld device of this application.
  • For example, forming a first feature on or over a second feature may include embodiments in which the first and second features are in direct contact, and may also include embodiments in which additional components are formed between the first and second features, so that the first and second features may not be in direct contact.
  • The present disclosure may reuse reference numerals and/or labels in multiple embodiments. Such repetition is for brevity and clarity, and does not in itself indicate a relationship between the different embodiments and/or configurations discussed.
  • Spatially relative terms, such as "beneath", "below", "lower", "above", "upper", and the like, may be used here to describe the relationship of one component or feature to another component or feature as illustrated in the drawings.
  • In addition to the orientation shown in the figures, these spatially relative terms also cover the various orientations of the device in use or operation.
  • The device may be otherwise oriented (for example, rotated 90 degrees or at other orientations), and the spatially relative descriptors used here should be interpreted accordingly.
  • The receiving module of a traditional time-of-flight sensor uses an image sensor to determine when light reflected from the object under measurement returns. Because light reflects in complex ways in the environment, the receiving module often receives a great deal of unwanted noise. This application therefore places a polarizing layer in the image sensor of the optical-signal receiving module of the time-of-flight sensor to filter the received light, thereby increasing the accuracy of the time-of-flight sensor. The details are described below.
  • It should be noted that although the image sensor of this application can improve the accuracy of existing time-of-flight sensors, this use is not a limitation of this application; in other words, the image sensor of this application can also be applied in settings other than time-of-flight sensors.
  • FIG. 1 is a cross-sectional view of an embodiment of one pixel of the image sensor of this application.
  • The image sensor 100 may include multiple pixels; the image sensor 100 in FIG. 1 shows only one of them.
  • In this embodiment, the image sensor 100 is a backside-illuminated (BSI) image sensor and includes a back end of line (BEOL) process stack 110, a semiconductor substrate 108, a polarizing layer 104, and a microlens 102.
  • The back-end process stack 110 is arranged on the front side of the semiconductor substrate 108 in the figure.
  • The back-end process stack 110 includes an interlayer dielectric (ILD) layer and metallization layers stacked in the interlayer dielectric layer.
  • The interlayer dielectric layer may be a low-k dielectric (i.e., a dielectric with a dielectric constant less than about 3.9) or an oxide.
  • The metallization layers are electrically coupled to each other through vias and electrically coupled to the semiconductor substrate 108 through contacts.
  • The metallization layers, vias, and contacts may be formed of, for example, a metal such as aluminum-copper, germanium, copper, or some other metal.
  • The semiconductor substrate 108 may be a bulk semiconductor substrate, such as a bulk silicon substrate or a silicon-on-insulator (SOI) substrate.
  • The photosensitive sensor 106 is provided on the semiconductor substrate 108.
  • The microlens 102 is disposed on the back surface of the semiconductor substrate 108, and the polarizing layer 104 is provided between the microlens 102 and the semiconductor substrate 108.
  • The design of the polarizing layer 104 makes it difficult for light outside a specific polarization direction to pass. That is, light passes through the microlens 102 before entering the polarizing layer 104, and, depending on the design of the polarizing layer 104, only light with a specific polarization enters the photosensitive sensor 106, rather than all of the light that passes through the microlens 102.
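The filtering behavior described above can be modeled, to first order, with the ideal-polarizer relation (Malus's law). This is a minimal sketch assuming an ideal linear polarizer, not a simulation of the patent's specific grid design:

```python
import math

def transmitted_intensity(i0: float, theta_deg: float) -> float:
    """Malus's law for an ideal linear polarizer: I = I0 * cos^2(theta),
    where theta is the angle between the light's polarization direction
    and the polarizer's transmission axis."""
    theta = math.radians(theta_deg)
    return i0 * math.cos(theta) ** 2

# Light aligned with the transmission axis passes; light polarized at
# 90 degrees to it is blocked.
print(transmitted_intensity(1.0, 0.0))
print(transmitted_intensity(1.0, 90.0))
```

A real wire-grid polarizer has a finite extinction ratio, so the 90-degree case is strongly attenuated rather than perfectly blocked.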
  • A color filter may additionally be provided between the microlens 102 and the polarizing layer 104 as required.
  • An anti-reflection layer and/or a buffer layer may be provided between the polarizing layer 104 and the semiconductor substrate 108.
  • FIG. 2 is a top view of a further embodiment of the image sensor of FIG. 1. The polarizing layer 104 of the image sensor 200 in FIG. 2 has a vertical grid structure.
  • Please refer to FIG. 2 and FIG. 3 together; FIG. 3 is a cross-sectional view of the image sensor of FIG. 2 taken along the cross-sectional line AA′. As shown in FIGS. 2 and 3, the polarizing layer 104 includes a grid layer 202 and a capping layer 204.
  • The grid layer 202 includes grid lines of a specific height that surround the pixels of the image sensor 200 to prevent crosstalk between pixels.
  • The grid layer 202 also has parallel grid lines arranged on the back surface of the semiconductor substrate 108, covering the full area of each pixel.
  • The grid layer 202 may include metal. In some embodiments, the grid layer 202 may include titanium (Ti), tungsten (W), aluminum (Al), copper (Cu), and/or a combination thereof. However, this application is not limited thereto; in some implementations, the grid layer 202 may include materials other than metal.
  • The capping layer 204 covers the grid layer 202 and fills the gaps between the grid lines of the grid layer 202.
  • The capping layer 204 may be a dielectric, such as silicon dioxide.
  • The grid layer 202 has a plurality of openings that expose the underlying semiconductor substrate 108, and the openings divide the grid layer 202 into a plurality of grid lines, such as metal grid lines, as shown in FIG. 3.
  • The number of grid lines shown is for illustration only; in practice, the number of grid lines in the grid layer 202 can be adjusted according to the actual design. In this embodiment, the grid lines of the grid layer 202 have substantially the same height and substantially the same spacing.
  • In the grid layer 202, the grid lines surrounding the pixels of the image sensor 200 have a width d1, the grid lines arranged in parallel have a width d2, and the spacing between the parallel grid lines is d3.
  • In this embodiment, d2 and d3 are substantially the same, and each is about twice d1.
  • However, the present application is not limited thereto; the grid lines of the grid layer 202 may have different heights, widths, and spacings.
  • FIG. 4 is a top view of an embodiment of four pixels of the image sensor of this application.
  • The image sensor in FIG. 4 includes the pixel 200 of FIG. 2 and three other pixels 300, 400, and 500, each having a different grid layer 202 pattern.
  • The image sensor may include more than four pixels.
  • The pixels 200, 300, 400, and 500 in FIG. 4 can be used as a minimum repeating pixel group, and the minimum repeating pixel group can be replicated along the horizontal and/or vertical directions of FIG. 4 to obtain an image sensor of the required size.
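The replication of the minimum repeating pixel group can be sketched as follows. The 2x2 layout and the 0/45/90/135-degree orientation labels are illustrative assumptions for pixels 200/300/400/500; the patent does not fix the exact arrangement:

```python
def tile_sensor(group, rows, cols):
    """Replicate a minimum repeating pixel group across a rows x cols
    pixel array by wrapping indices (horizontal/vertical copying)."""
    h, w = len(group), len(group[0])
    return [[group[r % h][c % w] for c in range(cols)] for r in range(rows)]

# Hypothetical grid-line orientations (degrees) for pixels 200/300/400/500.
GROUP = [[0, 45],
         [90, 135]]

sensor = tile_sensor(GROUP, 4, 4)
for row in sensor:
    print(row)
```

The same helper also covers the 8-pixel variant mentioned below: passing a larger group with 22.5-degree steps tiles it in exactly the same way.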
  • The grid layers 202 of the pixels 200, 300, 400, and 500 all have a grid-line portion surrounding the pixel.
  • The difference between the grid layers 202 of the pixels 200, 300, 400, and 500 lies in the direction of the grid lines arranged in parallel within each pixel.
  • The grid line direction of the pixel 300 is the grid line direction of the pixel 200 rotated 45 degrees to the right;
  • The grid line direction of the pixel 400 is the grid line direction of the pixel 300 rotated a further 45 degrees to the right;
  • The grid line direction of the pixel 500 is the grid line direction of the pixel 400 rotated a further 45 degrees to the right. Therefore, the grid line directions of the pixels 200 and 400 are perpendicular to each other, and the grid line directions of the pixels 300 and 500 are perpendicular to each other.
  • The pixel configuration of FIG. 4 enables the pixels 200, 300, 400, and 500 to respectively receive light of different polarization directions, and the light from these four directions helps calculate the time of flight with improved accuracy.
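The patent does not spell out how the four polarization channels are combined. One common reconstruction in division-of-focal-plane polarimetry (our illustration, not the patent's stated method) derives the linear Stokes parameters from the four samples, and from them the degree and angle of linear polarization:

```python
import math

def linear_polarization(i0, i45, i90, i135):
    """Estimate degree (DoLP) and angle (AoLP, degrees) of linear
    polarization from intensities sampled behind 0/45/90/135-degree
    polarizers, via the linear Stokes parameters S0, S1, S2."""
    s0 = (i0 + i45 + i90 + i135) / 2.0
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = math.hypot(s1, s2) / s0 if s0 else 0.0
    aolp = 0.5 * math.degrees(math.atan2(s2, s1))
    return dolp, aolp

# Fully polarized light at 0 degrees: passes the 0-degree pixel fully,
# is blocked at 90 degrees, and is halved at 45 and 135 degrees.
print(linear_polarization(1.0, 0.5, 0.0, 0.5))
```

A high degree of linear polarization marks light that kept the emitter's polarization (a direct return), while multiply scattered environmental light tends to be depolarized, which is one way the four channels can suppress noise.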
  • The complexity of the pixel configuration of FIG. 4 can be increased to further enrich the information available to subsequent applications.
  • For example, the grid line rotation step can be reduced from 45 degrees to 22.5 degrees, so that the minimum repeating pixel group grows to 8 pixels.
  • The pixels 200, 300, 400, and 500 in FIG. 4 need not be arranged in the manner shown; for example, the positions of the pixel 300 and the pixel 400 may be exchanged.
  • FIGS. 5 to 9 illustrate the manufacturing process of the image sensor 200 of FIG. 3.
  • A semiconductor substrate 108 is obtained first, with a back-end process stack 110 provided on the front side of the semiconductor substrate 108.
  • A metal layer 202' is then formed on the back surface of the semiconductor substrate 108.
  • A sputtering process, an electroplating process, or an evaporation process can be used to form the metal layer 202'.
  • In some embodiments, an anti-reflection layer and/or a buffer layer may first be formed on the back surface of the semiconductor substrate 108.
  • An etching process is then used to pattern the metal layer 202' into the shape of the grid layer 202, for example the metal grid patterns shown in FIG. 4.
  • A capping layer 204 is formed on the grid layer 202 to cover the grid layer 202, fill the gaps between its grid lines, and directly contact the back surface of the semiconductor substrate 108.
  • The capping layer 204 may be a dielectric, such as silicon dioxide.
  • A planarization process is then applied to the upper surface of the capping layer 204.
  • The grid layer 202 and the capping layer 204 together form the polarizing layer 104.
  • Finally, the microlens 102 is formed.
  • The shape of the microlens 102 can be adjusted as required.
  • A color filter can be formed between the microlens 102 and the polarizing layer 104.
  • FIG. 10 is a cross-sectional view of another embodiment of one pixel of the image sensor of this application. It should be noted that the image sensor 1000 may include multiple pixels, and the image sensor 1000 in FIG. 10 only shows one of the pixels. In this embodiment, the image sensor 1000 is a front-illuminated image sensor 1000, and includes a semiconductor substrate 1008, a back-end process stack 1010, and a microlens 1002.
  • The back-end process stack 1010 is arranged on the front side of the semiconductor substrate 1008 in the figure.
  • The back-end process stack 1010 includes an interlayer dielectric (ILD) layer and metallization layers stacked in the interlayer dielectric layer, such as the metallization layer 1004.
  • The interlayer dielectric layer may be a low-k dielectric (i.e., a dielectric with a dielectric constant less than about 3.9) or an oxide.
  • The metallization layers are electrically coupled to each other through vias and electrically coupled to the semiconductor substrate 1008 through contacts.
  • The metallization layers, vias, and contacts may be formed of, for example, a metal such as aluminum-copper, germanium, copper, or some other metal.
  • The semiconductor substrate 1008 may be a bulk semiconductor substrate, such as a bulk silicon substrate or a silicon-on-insulator (SOI) substrate.
  • The photosensitive sensor 1006 is disposed on the semiconductor substrate 1008.
  • The microlens 1002 is disposed on the front surface of the semiconductor substrate 1008 so that the back-end process stack 1010 is located between the microlens 1002 and the semiconductor substrate 1008.
  • The metallization layer 1004 in the back-end process stack 1010 is patterned as a grid layer to achieve the effect of a polarizing layer, so that light outside a specific polarization direction does not easily pass. That is, light first passes through the microlens 1002 and then enters the metallization layer (polarizing layer) 1004; depending on the design of the metallization layer (polarizing layer) 1004, only light with a specific polarization enters the photosensitive sensor 1006, rather than all of the light that passes through the microlens 1002.
  • The pattern of the metallization layer (polarizing layer) 1004 can be the same as or similar to the grid layer pattern of the pixel 200, 300, 400, and/or 500, including multiple grid lines arranged in parallel over the semiconductor substrate 1008. In some embodiments, the parallel grid lines are equally spaced.
  • Any metallization layer 1004 of the back-end process stack 1010 can be used as the polarizing layer; this application is not limited in this regard.
  • A color filter can also be provided between the microlens 1002 and the back-end process stack 1010 as required.
  • FIG. 11 is a schematic diagram of an embodiment of the handheld device of this application.
  • The handheld device 1100 includes a display screen assembly 1104 and an image sensor 1102.
  • The handheld device 1100 can be used to perform time-of-flight sensing and/or three-dimensional image sensing for face recognition.
  • The handheld device 1100 may be any handheld electronic device, such as a smartphone, a personal digital assistant, a handheld computer system, or a tablet computer.
  • The display screen assembly 1104 may include a display panel and a protective cover.
  • The protective cover is arranged above the display panel, and the image sensor 1102 is arranged below the display panel.
  • The image sensor 1102 may include the image sensor 100/1000, and the polarizing layer 104/1004 in the image sensor 100/1000 may have the grid layer pattern of the pixel 200, 300, 400, and/or 500.
  • The display panel may be an organic electroluminescent (OLED) display panel, but the application is not limited thereto.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Vascular Medicine (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

An image sensor (100/200/300/400/500), a manufacturing method thereof, a chip, and a handheld device including the image sensor (100/200/300/400/500). The image sensor (100/200/300/400/500) includes a semiconductor substrate (108) and a plurality of pixels, wherein each pixel of the plurality of pixels includes: a photosensitive sensor (106) provided on the semiconductor substrate (108); a polarizing layer (104) provided on the semiconductor substrate (108); and a microlens (102) provided on the polarizing layer (104) such that the polarizing layer (104) is located between the microlens (102) and the semiconductor substrate (108).

Description

Image sensor and manufacturing method thereof, chip, and handheld device

Technical Field

This application relates to an image sensor, a manufacturing method thereof, a chip, and a handheld device using the chip, and more particularly to an image sensor with a polarizing layer, a manufacturing method of the image sensor, an image sensing chip, and a handheld device.

Background

CMOS image sensors have been mass-produced and widely applied. Traditional image sensors can generate two-dimensional (2D) images and videos. Recently, image sensors and systems that can generate three-dimensional (3D) images have received widespread attention; these three-dimensional image sensors can be applied to face recognition, augmented reality (AR)/virtual reality (VR), drones, and so on.

Existing three-dimensional image sensors are mainly implemented in three ways: binocular stereo, structured light, and time of flight (ToF).

Time of flight uses specially designed pixels to measure distance by timing the flight of photons to the target and back, but current technology cannot yet generate a depth map of sufficient accuracy. To increase modeling accuracy and reduce cost, how to simply improve the accuracy of the time-of-flight sensor has become an important work item.
Summary

One of the objectives of this application is to disclose an image sensor, a manufacturing method thereof, a chip, and a handheld device using the chip, to solve the above problems.

An embodiment of this application discloses an image sensor including a semiconductor substrate and a plurality of pixels, wherein each pixel of the plurality of pixels includes: a photosensitive sensor provided on the semiconductor substrate; a polarizing layer provided on the semiconductor substrate; and a microlens provided on the polarizing layer such that the polarizing layer is located between the microlens and the semiconductor substrate.

An embodiment of this application discloses a method for manufacturing an image sensor, including: providing a semiconductor substrate; forming a polarizing layer on the semiconductor substrate; and forming a microlens on the polarizing layer.

An embodiment of this application discloses a chip including the above image sensor.

An embodiment of this application discloses a handheld device for performing time-of-flight sensing, including: a display panel; and the above image sensor.

The embodiments of this application add a polarizing layer to the image sensor, which can improve the accuracy of the time-of-flight sensor.
Brief Description of the Drawings
FIG. 1 is a cross-sectional view of an embodiment of one pixel of an image sensor of the present application;
FIG. 2 is a top view of the image sensor shown in FIG. 1;
FIG. 3 is a cross-sectional view of the image sensor shown in FIG. 2 taken along the section line;
FIG. 4 is a top view of an embodiment of four pixels of an image sensor of the present application;
FIG. 5 to FIG. 9 are schematic diagrams of the manufacturing flow of the image sensor shown in FIG. 3;
FIG. 10 is a cross-sectional view of another embodiment of one pixel of an image sensor of the present application;
FIG. 11 is a schematic diagram of an embodiment of a hand-held device of the present application.
The reference numerals are described as follows:
100, 200, 300, 400, 500, 1000    image sensor
102, 1002                        microlens
104, 1004                        polarizing layer
106, 1006                       photosensitive sensor
108, 1008                        semiconductor substrate
110, 1010                        back-end-of-line (BEOL) stack
202                              grid layer
202'                             metal layer
204                              capping layer
1100                             hand-held device
1102                             image sensor
1104                             display screen assembly
Detailed Description
The following disclosure provides various embodiments or examples that can be used to implement different features of the present disclosure. Specific examples of components and arrangements are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. For instance, in the description below, forming a first feature over or on a second feature may include embodiments in which the first and second features are in direct contact, and may also include embodiments in which additional components are formed between the first and second features such that the first and second features are not in direct contact. In addition, the present disclosure may repeat reference numerals and/or letters in the various embodiments. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Further, spatially relative terms, such as "beneath", "below", "lower", "above", "upper" and the like, may be used herein for ease of description to describe the relationship of one element or feature to another element(s) or feature(s) as illustrated in the figures. The spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The apparatus may be otherwise oriented (rotated 90 degrees or at other orientations), and the spatially relative descriptors used herein should be interpreted accordingly.
Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the present application are approximations, the numerical values set forth in the specific embodiments are reported as precisely as possible. Any numerical value, however, inherently contains a standard deviation resulting from the respective testing method. Herein, "about" usually means that the actual value is within plus or minus 10%, 5%, 1% or 0.5% of a particular value or range. Alternatively, the term "about" means that the actual value falls within an acceptable standard error of the mean, as considered by those of ordinary skill in the art to which the present application pertains. It should be understood that, except in the experimental examples, or unless otherwise expressly specified, all ranges, amounts, values and percentages used herein (for example to describe amounts of material, durations, temperatures, operating conditions, ratios of amounts, and the like) are modified by "about". Accordingly, unless indicated to the contrary, the numerical parameters set forth in the present specification and the appended claims are approximations that may vary as required. At the very least, each numerical parameter should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Herein, a numerical range is expressed as extending from one endpoint to another endpoint or as lying between two endpoints; unless otherwise specified, all numerical ranges described herein include the endpoints.
The receiving module of a conventional time-of-flight sensor uses an image sensor to determine the time at which light reflected from the object under test returns. Because the reflection behavior of light in the environment is complex, the receiving module often picks up a great deal of unwanted noise. The present application therefore disposes a polarizing layer in the image sensor of the light-signal receiving module of a time-of-flight sensor to filter the received light and thereby improve the accuracy of the time-of-flight sensor; the details are described below. It should be noted that although the image sensor of the present application can improve the accuracy of existing time-of-flight sensors, this use is not a limitation of the present application; in other words, the image sensor of the present application can also be applied in scenarios other than time-of-flight sensors.
FIG. 1 is a cross-sectional view of an embodiment of one pixel of an image sensor of the present application. It should be noted that the image sensor 100 may include a plurality of pixels; only one pixel of the image sensor 100 is shown in FIG. 1. In this embodiment, the image sensor 100 is a back-side illuminated (BSI) image sensor including a back-end-of-line (BEOL) stack 110, a semiconductor substrate 108, a polarizing layer 104 and a microlens 102. The BEOL stack 110 is disposed on the front side of the semiconductor substrate 108 in the figure. The BEOL stack 110 includes an interlayer dielectric (ILD) layer and metallization layers stacked within the ILD layer. For example, the ILD layer may be a low-k dielectric (i.e., a dielectric with a dielectric constant less than about 3.9) or an oxide. The metallization layers are electrically coupled to one another through vias and electrically coupled to the semiconductor substrate 108 through contacts. The metallization layers, vias and contacts may be, for example, a metal such as aluminum-copper, germanium, copper or some other metal.
The semiconductor substrate 108 may be a bulk semiconductor substrate, such as a bulk silicon substrate, or a silicon-on-insulator (SOI) substrate. The photosensitive sensor 106 is disposed in the semiconductor substrate 108. The microlens 102 is disposed on the back side of the semiconductor substrate 108, and the polarizing layer 104 is disposed between the microlens 102 and the semiconductor substrate 108. The polarizing layer 104 is designed so that light outside a specific direction does not easily pass through; that is, light first passes through the microlens 102 and then enters the polarizing layer 104, and, depending on the design of the polarizing layer 104, only light with a specific directionality reaches the photosensitive sensor 106, rather than all of the light passing through the microlens 102.
In some embodiments, a color filter may additionally be disposed between the microlens 102 and the polarizing layer 104 as needed. Moreover, in some embodiments, an anti-reflection layer and/or a buffer layer may be disposed between the polarizing layer 104 and the semiconductor substrate 108.
FIG. 2 is a top view of a further embodiment of the image sensor of FIG. 1. The polarizing layer 104 of the image sensor 200 in FIG. 2 has a vertical grating structure. Referring to FIG. 2 and FIG. 3 together, FIG. 3 is a cross-sectional view of the image sensor of FIG. 2 taken along section line A-A'. As can be seen from FIG. 2 and FIG. 3, the polarizing layer 104 includes a grid layer 202 and a capping layer 204. The grid layer 202 has grid lines of a specific height surrounding the periphery of the pixel of the image sensor 200 to prevent optical crosstalk between pixels; in addition, the grid layer 202 also has parallel grid lines covering the back side of the semiconductor substrate 108 within the pixel. The grid layer 202 may include a metal; for example, the grid layer 202 may include titanium (Ti), tungsten (W), aluminum (Al), copper (Cu) and/or combinations thereof. However, the present application is not limited thereto; in some implementations, the grid layer 202 may include materials other than metal. The capping layer 204 covers the grid layer 202 and fills the gaps between the grid lines of the grid layer 202; the capping layer 204 may be a dielectric, such as silicon dioxide.
The grid layer 202 has a plurality of openings exposing the underlying semiconductor substrate 108, and the openings divide the grid layer 202 into a plurality of grid lines, such as metal grid lines. The number of grid lines of the grid layer 202 shown in FIG. 3 is for illustration only; in practice, the number of grid lines of the grid layer 202 can be adjusted according to the actual design. In this embodiment, the grid lines of the grid layer 202 have substantially the same height and are substantially equally spaced. Specifically, the grid lines of the grid layer 202 surrounding the periphery of the pixel of the image sensor 200 have a width d1, the parallel grid lines have a width d2, and the spacing between the parallel grid lines is d3, where d2 and d3 are substantially the same and are about twice d1. However, the present application is not limited thereto; in some embodiments, the grid lines of the grid layer 202 may have different heights, widths or spacings.
FIG. 4 is a top view of an embodiment of four pixels of an image sensor of the present application. The image sensor in FIG. 4 shows the pixel 200 of FIG. 2 together with additional pixels 300, 400 and 500, each having a different grid layer 202 pattern. In practice the image sensor may include more than four pixels; for example, the pixels 200, 300, 400 and 500 of FIG. 4 can serve as a minimal repeating pixel group, which is replicated along the horizontal and/or vertical direction of FIG. 4 to obtain the required image sensor size.
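Tiling a minimal repeating pixel group across an array amounts to indexing modulo the group size. The sketch below illustrates this for a hypothetical 2x2 group; the specific placement of the four angles is an invented example, not the layout prescribed by the patent:

```python
# Map a pixel position (row, col) to its polarizer grid-line angle, assuming
# a hypothetical 2x2 minimal repeating group tiled across the whole array.
# The four directions step by 45 degrees as in the described embodiment; the
# exact arrangement (0/45 over 135/90) is an illustrative choice.

GROUP_ANGLES = [[0.0, 45.0],
                [135.0, 90.0]]  # degrees, one 2x2 repeating group

def polarizer_angle(row: int, col: int) -> float:
    """Grid-line angle of the pixel at (row, col) after tiling the group."""
    return GROUP_ANGLES[row % 2][col % 2]
```

A finer design, such as the 22.5-degree step with an eight-pixel group mentioned later in the description, would simply use a larger angle table and larger moduli.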
Specifically, the grid layers 202 of the pixels 200, 300, 400 and 500 all have grid-line portions surrounding the pixel periphery; the difference between them lies in the direction of the parallel grid lines within each pixel. As shown in FIG. 4, the grid-line direction of the pixel 300 is that of the pixel 200 rotated 45 degrees to the right; the grid-line direction of the pixel 400 is that of the pixel 300 rotated a further 45 degrees to the right; and the grid-line direction of the pixel 500 is that of the pixel 400 rotated yet another 45 degrees to the right. Accordingly, the grid-line directions of the pixels 200 and 400 are perpendicular to each other, and the grid-line directions of the pixels 300 and 500 are perpendicular to each other.
The pixel configuration of FIG. 4 allows the pixels 200, 300, 400 and 500 to each receive light of a different polarization direction, and light of these four directions can be used to help compute the time of flight with improved accuracy. It should be noted that, in some embodiments, the complexity of the pixel configuration of FIG. 4 can be increased to further enrich the information available to subsequent applications; for example, the grid-line rotation step can be reduced from 45 degrees to 22.5 degrees and the number of pixels in the minimal repeating pixel group increased to eight. Moreover, the pixels 200, 300, 400 and 500 of FIG. 4 need not be arranged exactly as shown; for example, in some embodiments, the positions of the pixels 300 and 400 may be swapped.
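For context, a common way to exploit four polarizer orientations at 0, 45, 90 and 135 degrees, as the four grid directions above provide, is to estimate the linear Stokes parameters and from them the degree and angle of linear polarization of the incoming light. The patent does not prescribe this computation; the sketch below is standard polarimetry math with invented function names:

```python
import math

def linear_stokes(i0: float, i45: float, i90: float, i135: float):
    """Stokes parameters S0, S1, S2 from four polarizer-orientation intensities."""
    s0 = (i0 + i45 + i90 + i135) / 2.0  # total intensity
    s1 = i0 - i90                        # preference for 0 vs 90 degrees
    s2 = i45 - i135                      # preference for 45 vs 135 degrees
    return s0, s1, s2

def dolp_aolp(i0: float, i45: float, i90: float, i135: float):
    """Degree (0..1) and angle (radians) of linear polarization."""
    s0, s1, s2 = linear_stokes(i0, i45, i90, i135)
    dolp = math.hypot(s1, s2) / s0
    aolp = 0.5 * math.atan2(s2, s1)
    return dolp, aolp
```

Light scattered by multiple reflections tends to have a low degree of linear polarization, so a quantity like `dolp` gives a receiving module one possible handle for discounting noisy returns.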
FIG. 5 to FIG. 9 show the manufacturing flow of the image sensor 200 of FIG. 3. In FIG. 5, the semiconductor substrate 108 is first obtained, with the BEOL stack 110 on the front side of the semiconductor substrate 108. Next, as shown in FIG. 6, a metal layer 202' is formed on the back side of the semiconductor substrate 108; for example, a sputtering, electroplating or evaporation process may be used to form the metal layer 202'. In some embodiments, before forming the metal layer 202', an anti-reflection layer and/or a buffer layer may first be formed on the back side of the semiconductor substrate 108.
In FIG. 7, an etching process is used to pattern the grid layer 202, for example to form the metal grid layer patterns of FIG. 4. Next, in FIG. 8, the capping layer 204 is formed on the grid layer 202, covering the grid layer 202 and filling the gaps between the grid lines of the grid layer 202 so as to directly contact the back side of the semiconductor substrate 108. The capping layer 204 may be a dielectric, such as silicon dioxide. In some embodiments, a planarization process is applied to the upper surface of the capping layer 204; once complete, the grid layer 202 and the capping layer 204 together form the polarizing layer 104. Finally, in FIG. 9, the microlens 102 is formed; the shape of the microlens 102 can be tailored as appropriate. In addition, in some embodiments, a color filter may further be formed between the microlens 102 and the polarizing layer 104.
It should be noted that using a polarizing layer between the microlens and the photosensitive sensor to improve the accuracy of a time-of-flight sensor is not limited to back-side illuminated image sensors; in some embodiments it can also be implemented with a front-side illuminated (FSI) image sensor. FIG. 10 is a cross-sectional view of another embodiment of one pixel of an image sensor of the present application. It should be noted that the image sensor 1000 may include a plurality of pixels; only one pixel of the image sensor 1000 is shown in FIG. 10. In this embodiment, the image sensor 1000 is a front-side illuminated image sensor including a semiconductor substrate 1008, a BEOL stack 1010 and a microlens 1002. The BEOL stack 1010 is disposed on the front side of the semiconductor substrate 1008 in the figure. The BEOL stack 1010 includes an interlayer dielectric (ILD) layer and metallization layers stacked within the ILD layer, such as the metallization layer 1004. The ILD layer may be a low-k dielectric (i.e., a dielectric with a dielectric constant less than about 3.9) or an oxide. The metallization layers are electrically coupled to one another through vias and electrically coupled to the semiconductor substrate 1008 through contacts. The metallization layers, vias and contacts may be, for example, a metal such as aluminum-copper, germanium, copper or some other metal.
The semiconductor substrate 1008 may be a bulk semiconductor substrate, such as a bulk silicon substrate, or a silicon-on-insulator (SOI) substrate. The photosensitive sensor 1006 is disposed in the semiconductor substrate 1008. The microlens 1002 is disposed on the front side of the semiconductor substrate 1008 such that the BEOL stack 1010 is between the microlens 1002 and the semiconductor substrate 1008.
In this embodiment, the metallization layer 1004 in the BEOL stack 1010 is patterned to serve as the grid layer and thereby achieve the effect of a polarizing layer, so that light outside a specific direction does not easily pass through. That is, light first passes through the microlens 1002 and then enters the metallization (polarizing) layer 1004, and, depending on the design of the metallization (polarizing) layer 1004, only light with a specific directionality reaches the photosensitive sensor 1006, rather than all of the light passing through the microlens 1002. The pattern of the metallization (polarizing) layer 1004 serving as the polarizing layer may be the same as or similar to the grid layer patterns of the image sensors 200, 300, 400 and/or 500, including a plurality of parallel grid lines covering the semiconductor substrate 1008; in some embodiments, the parallel grid lines are equally spaced.
In embodiments, the metallization layer 1004 of any layer of the BEOL stack 1010 may be used as the polarizing layer; the present application places no particular limitation on this. A color filter may also be disposed between the microlens 1002 and the BEOL stack 1010 as needed.
The present application further provides a chip including the image sensor 100/1000, where the polarizing layer 104/1004 in the image sensor 100/1000 may have the pattern of the image sensors 200, 300, 400 and/or 500. The present application further provides a hand-held device; FIG. 11 is a schematic diagram of an embodiment of the hand-held device of the present application. The hand-held device 1100 includes a display screen assembly 1104 and an image sensor 1102. The hand-held device 1100 can be used to perform time-of-flight sensing and/or three-dimensional image sensing for face recognition. The hand-held device 1100 may be any hand-held electronic device, such as a smartphone, a personal digital assistant, a hand-held computer system or a tablet computer. The display screen assembly 1104 may include a display panel and a protective cover, the protective cover being disposed above the display panel and the image sensor 1102 being disposed below the display panel. For example, the image sensor 1102 may include the image sensor 100/1000, and the polarizing layer 104/1004 in the image sensor 100/1000 may have the grid-layer pattern of the image sensors 200, 300, 400 and/or 500. In this embodiment, the display panel may be an organic light-emitting diode (OLED) display panel, but the present application is not limited thereto.
The foregoing briefly sets forth features of certain embodiments of the present application so that those of ordinary skill in the art may better understand the various aspects of the present disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages as the embodiments described herein. Those skilled in the art should also realize that such equivalent constructions remain within the spirit and scope of the present disclosure, and that various changes, substitutions and alterations may be made herein without departing from the spirit and scope of the present disclosure.

Claims (22)

  1. An image sensor, characterized in that the image sensor comprises a semiconductor substrate and a plurality of pixels, wherein each of the plurality of pixels comprises:
    a photosensitive sensor disposed in the semiconductor substrate;
    a polarizing layer disposed on the semiconductor substrate; and
    a microlens disposed on the polarizing layer such that the polarizing layer is located between the microlens and the semiconductor substrate.
  2. The image sensor of claim 1, wherein the polarizing layer comprises a grid layer.
  3. The image sensor of claim 2, wherein the grid layer comprises titanium (Ti), tungsten (W), aluminum (Al), copper (Cu) and/or combinations thereof.
  4. The image sensor of claim 2, wherein the grid layer comprises a plurality of parallel grid lines covering the semiconductor substrate.
  5. The image sensor of claim 4, wherein the polarizing layer further comprises a capping layer covering the grid layer.
  6. The image sensor of claim 4, wherein the parallel grid lines in the grid layer are equally spaced.
  7. The image sensor of claim 1, wherein each of the plurality of pixels further comprises a back-end-of-line stack disposed between the microlens and the semiconductor substrate, and the polarizing layer comprises a metallization layer in the back-end-of-line stack.
  8. The image sensor of claim 7, wherein the metallization layer comprises a plurality of parallel grid lines covering the semiconductor substrate.
  9. The image sensor of any one of claims 1-8, wherein the plurality of pixels comprises a plurality of pixel groups, each of the plurality of pixel groups comprising a first pixel and a second pixel, the polarizing layers of the first pixel and the second pixel having different patterns.
  10. The image sensor of claim 9, wherein the first pixel and the second pixel are adjacent to each other.
  11. The image sensor of claim 9, wherein the grid layers of the polarizing layers of the first pixel and the second pixel have different patterns.
  12. The image sensor of claim 10, wherein the grid layer of the first pixel comprises a plurality of parallel first grid lines and the grid layer of the second pixel comprises a plurality of parallel second grid lines, the first grid lines and the second grid lines extending in different directions.
  13. The image sensor of claim 12, wherein the directions of the first grid lines and the second grid lines differ by 45 degrees.
  14. The image sensor of claim 13, wherein each of the plurality of pixel groups further comprises a third pixel and a fourth pixel,
    wherein the third pixel is adjacent to the second pixel, the fourth pixel is adjacent to the first pixel, and the polarizing layers of the first pixel, the second pixel, the third pixel and the fourth pixel have different patterns.
  15. The image sensor of claim 14, wherein the grid layers of the first pixel, the second pixel, the third pixel and the fourth pixel have different patterns.
  16. The image sensor of claim 15, wherein the grid layer of the third pixel comprises a plurality of parallel third grid lines and the grid layer of the fourth pixel comprises a plurality of parallel fourth grid lines, the first grid lines, the second grid lines, the third grid lines and the fourth grid lines all extending in different directions.
  17. The image sensor of claim 16, wherein the directions of the third grid lines and the fourth grid lines differ by 45 degrees, and the first grid lines and the third grid lines are perpendicular to each other.
  18. A method for manufacturing an image sensor, characterized by comprising:
    providing a semiconductor substrate;
    forming a polarizing layer on the semiconductor substrate; and
    forming a microlens on the polarizing layer.
  19. The method of claim 18, wherein forming the polarizing layer on the semiconductor substrate comprises:
    forming a metal layer on the semiconductor substrate; and
    etching the metal layer to obtain a grid layer.
  20. The method of claim 19, wherein forming the polarizing layer on the semiconductor substrate further comprises:
    forming a capping layer covering the grid layer.
  21. A chip, characterized in that the chip comprises:
    the image sensor of any one of claims 1-17.
  22. A hand-held device for performing time-of-flight sensing, characterized by comprising:
    a display panel; and
    the image sensor of any one of claims 1-17.
PCT/CN2019/098285 2019-07-30 2019-07-30 Image sensor and manufacturing method thereof, chip, and hand-held device WO2021016839A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201980001351.5A CN110574166A (zh) 2019-07-30 2019-07-30 Image sensor and manufacturing method thereof, chip, and hand-held device
PCT/CN2019/098285 WO2021016839A1 (zh) 2019-07-30 2019-07-30 Image sensor and manufacturing method thereof, chip, and hand-held device
EP19919564.5A EP3799123A4 (en) 2019-07-30 2019-07-30 IMAGE SENSOR AND METHOD OF MANUFACTURING THEREOF, CHIP AND HANDHELD DEVICE
US17/027,612 US20210036042A1 (en) 2019-07-30 2020-09-21 Image sensor, manufacturing method and hand-held device of the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/098285 WO2021016839A1 (zh) 2019-07-30 2019-07-30 Image sensor and manufacturing method thereof, chip, and hand-held device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/027,612 Continuation US20210036042A1 (en) 2019-07-30 2020-09-21 Image sensor, manufacturing method and hand-held device of the same

Publications (1)

Publication Number Publication Date
WO2021016839A1 true WO2021016839A1 (zh) 2021-02-04

Family

ID=68786101

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/098285 WO2021016839A1 (zh) 2019-07-30 2019-07-30 图像传感器及其制造方法、芯片及手持装置

Country Status (4)

Country Link
US (1) US20210036042A1 (zh)
EP (1) EP3799123A4 (zh)
CN (1) CN110574166A (zh)
WO (1) WO2021016839A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021016839A1 (zh) * 2019-07-30 2021-02-04 Shenzhen Goodix Technology Co., Ltd. Image sensor and manufacturing method thereof, chip, and hand-held device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103852765A * 2012-11-30 2014-06-11 Infineon Technologies AG Selectivity by polarization
CN106169488A * 2015-05-22 2016-11-30 Taiwan Semiconductor Manufacturing Co., Ltd. Vertical transfer gate structure for a back-side illuminated (BSI) complementary metal-oxide-semiconductor (CMOS) image sensor using global shutter capture
CN106972036A * 2015-11-09 2017-07-21 Taiwan Semiconductor Manufacturing Co., Ltd. Integrated circuit and method of forming the same
CN108780142A * 2016-02-29 2018-11-09 TetraVue, Inc. 3D imaging system and method
CN109644264A * 2016-08-25 2019-04-16 Facebook Technologies, LLC Array detector for depth mapping
US20190174120A1 * 2017-05-16 2019-06-06 Samsung Electronics Co., Ltd. Time-resolving sensor using shared PPD+SPAD pixel and spatial-temporal correlation for range measurement
CN110574166A (zh) 2019-07-30 2019-12-13 Shenzhen Goodix Technology Co., Ltd. Image sensor and manufacturing method thereof, chip, and hand-held device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5682437B2 * 2010-09-07 2015-03-11 Sony Corporation Solid-state imaging element, solid-state imaging device, imaging apparatus, and method for manufacturing polarizing element
JP5603508B2 * 2012-05-22 2014-10-08 Panasonic Corporation Imaging processing device and endoscope
JP6833597B2 * 2017-04-11 2021-02-24 Sony Semiconductor Solutions Corporation Solid-state imaging device
JP6951866B2 * 2017-05-18 2021-10-20 Sony Semiconductor Solutions Corporation Imaging element

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3799123A4 *

Also Published As

Publication number Publication date
EP3799123A1 (en) 2021-03-31
US20210036042A1 (en) 2021-02-04
EP3799123A4 (en) 2021-04-28
CN110574166A (zh) 2019-12-13

Similar Documents

Publication Publication Date Title
US11015927B2 (en) Optical sensor and optical sensor system
US20200312925A1 (en) Display panel, display device and a method for manufacturing a display panel
US9905605B2 (en) Phase detection autofocus techniques
US10866648B2 (en) Display substrate and method for manufacturing the same
KR20180005588A (ko) Fingerprint sensor using light sources of a display panel, fingerprint sensor package, and fingerprint sensing system
CN107039468A (zh) Image sensor and fabrication method thereof
TW201624615A (zh) Semiconductor device and manufacturing method thereof
CN109564925A (zh) System-on-chip camera with integrated light sensor and method of manufacturing a system-on-chip camera
TW202205654A (zh) Imaging device and electronic apparatus
WO2021016839A1 (zh) Image sensor and manufacturing method thereof, chip, and hand-held device
CN104733488A (zh) Organic image sensor and method of forming the same
US11307689B2 (en) Display panel, and array substrate and manufacturing thereof
WO2019052268A1 (zh) Substrate and sensing method therefor, touch panel, and display device
CN104733489A (zh) Organic image sensor and method of forming the same
US20120241209A1 (en) Wafer-level electromagnetic interference shielding structure and manufacturing method thereof
US20170098029A1 (en) Manufacturing method for a semiconductor device, pattern generating method and nontransitory computer readable medium storing a pattern generating program
CN210429815U (zh) Image sensor, chip, and hand-held device
CN113594204B (zh) Display substrate, preparation method thereof, and display device
WO2021016838A1 (zh) Image sensor and manufacturing method thereof, chip, and hand-held device
TWI686940B (zh) Optical sensing structure and forming method thereof
TWI692651B (zh) Optical element and manufacturing method thereof
CN102891159B (zh) Pixel structure of CMOS image sensor and manufacturing method thereof
US11626434B2 (en) Image sensor package
CN105987712B (zh) Optical sensor and optical sensing system
US20240145494A1 (en) Image sensor structure

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019919564

Country of ref document: EP

Effective date: 20200924

NENP Non-entry into the national phase

Ref country code: DE