CN217588937U - Pixel unit, image sensor, camera assembly and electronic equipment

Pixel unit, image sensor, camera assembly and electronic equipment

Info

Publication number
CN217588937U
CN217588937U
Authority
CN
China
Prior art keywords
light
semiconductor substrate
photodiode
image sensor
pixel unit
Prior art date
Legal status
Active
Application number
CN202221460746.7U
Other languages
Chinese (zh)
Inventor
祁春超
孙鹏飞
王保宁
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202221460746.7U
Application granted
Publication of CN217588937U

Landscapes

  • Solid State Image Pick-Up Elements (AREA)

Abstract

The utility model discloses a pixel unit, an image sensor, a camera assembly and an electronic device, and relates to the technical field of image sensors. It addresses the technical problems in the related art that the pixel unit has a low absorption rate for incident light of longer wavelength and that optical crosstalk easily occurs between two adjacent pixel units. The pixel unit includes a semiconductor substrate, a photodiode disposed within the semiconductor substrate, and a circuit structure layer. The photodiode has a light receiving surface. The circuit structure layer is arranged on the side of the semiconductor substrate away from the light receiving surface. The circuit structure layer includes an interlayer dielectric, multilayer metal wiring embedded in the interlayer dielectric, and a light-concentrating structure embedded in the interlayer dielectric. The multilayer metal wiring is electrically connected to the photodiode. The vertical projection of the light-concentrating structure on the semiconductor substrate overlaps at least part of the vertical projection of the photodiode on the semiconductor substrate. The optical refractive index of at least part of the light-concentrating structure is greater than that of the interlayer dielectric and not greater than that of the semiconductor substrate. The pixel unit provided by the utility model is used for converting an optical signal into an electrical signal.

Description

Pixel unit, image sensor, camera assembly and electronic equipment
Technical Field
The utility model relates to the technical field of image sensors, and in particular to a pixel unit, an image sensor, a camera assembly and an electronic device.
Background
The image sensor is widely applied to the technical fields of digital cameras, mobile phones, video monitoring equipment, medical instruments and the like. The image sensor includes a plurality of pixel units arranged in an array, and the pixel units can convert received optical signals into electrical signals.
In the related art, the pixel units have relatively low absorption efficiency for long-wavelength incident light such as near-infrared or infrared light, which makes near-infrared imaging difficult, and optical crosstalk easily occurs between two adjacent pixel units, reducing the reliability of the image sensor.
SUMMARY OF THE UTILITY MODEL
In order to solve the technical problems in the related art that the absorption efficiency of the pixel units for long-wavelength incident light is low and that optical crosstalk easily occurs between two adjacent pixel units, embodiments of the utility model provide a pixel unit, an image sensor, a camera assembly and an electronic device.
In order to achieve the above object, the embodiments of the present invention adopt the following technical solutions:
In a first aspect, embodiments of the present invention provide a pixel unit. The pixel unit includes a semiconductor substrate, a photodiode, and a circuit structure layer. The photodiode is disposed within the semiconductor substrate and has a light receiving surface. The circuit structure layer is arranged on the side of the semiconductor substrate away from the light receiving surface. The circuit structure layer includes an interlayer dielectric, multilayer metal wiring embedded in the interlayer dielectric, and a light-concentrating structure embedded in the interlayer dielectric. The multilayer metal wiring is electrically connected to the photodiode. The vertical projection of the light-concentrating structure on the semiconductor substrate overlaps at least part of the vertical projection of the photodiode on the semiconductor substrate. The optical refractive index of at least part of the light-concentrating structure is greater than that of the interlayer dielectric and not greater than that of the semiconductor substrate.
In the pixel unit provided by the embodiments of the utility model, the light-concentrating structure is embedded in the interlayer dielectric, and the vertical projection of the light-concentrating structure on the semiconductor substrate overlaps at least part of the vertical projection of the photodiode on the semiconductor substrate; that is, the position of the light-concentrating structure corresponds to the position of the photodiode. Moreover, the optical refractive index of at least part of the light-concentrating structure is greater than that of the interlayer dielectric and not greater than that of the semiconductor substrate. Therefore, when light of longer wavelength (such as near-infrared or infrared light) irradiates the photodiode, one part of the light is absorbed by the photodiode and converted into an electrical signal, while another part passes through the photodiode, reaches the light-concentrating structure, and undergoes total reflection at the interface between the light-concentrating structure and the interlayer dielectric. The light that passed through the photodiode is thus reflected back to the corresponding photodiode, which improves the absorption rate of the photodiode for light (especially light of longer wavelength such as near-infrared or infrared light), enhances the quantum efficiency for longer-wavelength light, improves the near-infrared imaging quality, and improves the service performance of the pixel unit.
Meanwhile, because the light-concentrating structure reflects the light passing through the photodiode back to the corresponding photodiode, the intensity of the light entering the interlayer dielectric is reduced, and the light is prevented from reaching other photodiodes after being reflected by the metal wiring or other structures in the interlayer dielectric. The optical crosstalk between two adjacent photodiodes, that is, between two adjacent pixel units, is thereby reduced, and the reliability of the image sensor is improved.
Optionally, the surface of the light-concentrating structure close to the semiconductor substrate is flush with the surface of the circuit structure layer close to the semiconductor substrate. This arrangement avoids the situation in which, because the distance between the light-concentrating structure and the photodiode is too large, most of the light passing through the photodiode is reflected by the metal wiring and cannot enter the light-concentrating structure. The intensity of light entering the interlayer dielectric is thus further reduced, the optical crosstalk between two adjacent pixel units is reduced, and the reliability of the image sensor is improved. In addition, making the surface of the light-concentrating structure close to the semiconductor substrate flush with the surface of the circuit structure layer close to the semiconductor substrate improves the ease of processing the pixel unit and reduces the production cost.
Optionally, the cross section of the light-concentrating structure is trapezoidal or rectangular, and the cross section is perpendicular to the semiconductor substrate and the circuit structure layer. This arrangement can satisfy the reflection requirements for light of different incident angles or different wavelengths, further improving the reliability of the pixel unit.
Optionally, when the cross section of the light-concentrating structure is a trapezoid, the trapezoid includes an upper base and a lower base, the upper base being parallel to the lower base. The length of the upper base is smaller than that of the lower base, the lower base is flush with the surface of the circuit structure layer close to the semiconductor substrate, and the upper base is away from the semiconductor substrate. This arrangement increases the area of the side of the light-concentrating structure close to the photodiode, further ensuring that light can reach the light-concentrating structure after passing through the photodiode and increasing the intensity of the light entering the light-concentrating structure, thereby reducing the optical crosstalk between two adjacent pixel units and improving the reliability of the image sensor. Moreover, the area of the side of the light-concentrating structure away from the photodiode can be smaller than the area of the side close to the photodiode, which improves the reflection effect of the light-concentrating structure on light, increases the intensity of the light re-irradiated onto the corresponding photodiode, and improves the quantum efficiency for longer-wavelength light.
Optionally, the surface of the circuit structure layer close to the semiconductor substrate is provided with an accommodating hole, and the light-concentrating structure includes a metal layer and a light-concentrating structure body. The metal layer covers the inner wall of the accommodating hole. The light-concentrating structure body is embedded in the accommodating hole. The optical refractive index of the light-concentrating structure body is greater than that of the interlayer dielectric and not greater than that of the semiconductor substrate. In this arrangement the metal layer blocks light, preventing light from passing through the light-concentrating structure body into the interlayer dielectric, which improves the reflection effect of the light-concentrating structure on light, further reduces the optical crosstalk between two adjacent pixel units, and improves the reliability of the image sensor.
Optionally, the inner wall of the accommodating hole includes a side wall and a bottom wall, and the metal layer covers at least one of the side wall and the bottom wall. Due to the arrangement, the use flexibility of the pixel unit is improved, and different use requirements are met.
Optionally, when the metal layer covers the bottom wall, the metal layer and the metal wiring are made of the same material in the same layer. By the arrangement, the processing convenience of the image sensor is improved, the processing time of the pixel unit is shortened, and the production cost of the pixel unit is reduced.
Optionally, the pixel unit further includes an optical filter and a micro lens. The optical filter is arranged on the side of the semiconductor substrate close to the light receiving surface and covers the light receiving surface. The micro lens is arranged on the side of the optical filter away from the semiconductor substrate. With this arrangement, light of different colors can reach the photodiode, so that the electronic device can produce a color image, improving the performance of the pixel unit. In addition, the intensity of the light irradiating the light receiving surface of the photodiode can be increased, so that the pixel unit can work normally in a poorly lit environment, improving the applicability of the pixel unit.
In a second aspect, embodiments of the present invention provide an image sensor, including a plurality of pixel units as described above in the first aspect, the plurality of pixel units being arranged in an array.
The image sensor provided by the embodiments of the utility model includes the pixel unit of the first aspect and therefore has all the beneficial effects of the first aspect, which are not repeated here.
Optionally, the image sensor further includes a spacer disposed between the at least two pixel units. By the arrangement, optical crosstalk generated between two adjacent pixel units is further reduced, and the use reliability of the image sensor is improved.
In a third aspect, embodiments of the present invention provide a camera assembly, which includes a lens assembly and an image sensor as described above in the second aspect, wherein the image sensor is disposed on a light-emitting side of the lens assembly.
The camera assembly provided by the embodiments of the utility model includes the image sensor of the second aspect and therefore has all the beneficial effects of the second aspect, which are not repeated here.
In a fourth aspect, embodiments of the present invention provide an electronic device, including a housing and the camera assembly of the third aspect. The shell is provided with a first through hole, and at least part of the camera assembly is embedded in the first through hole.
The electronic device provided by the embodiments of the utility model includes the camera assembly of the third aspect and therefore has all the beneficial effects of the third aspect, which are not repeated here.
Optionally, the housing further has a second through hole, and the electronic device further includes a transmitter disposed in the second through hole. The emitter is used for emitting light to the object to be measured, and the camera assembly is used for receiving the light reflected by the object to be measured. By the arrangement, the accuracy of the camera assembly for object recognition or detection is further improved, and the use reliability of the electronic equipment is improved.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 2 is a schematic block diagram of a camera assembly according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an electronic device according to another embodiment of the present invention;
fig. 4 is a schematic structural diagram of an image sensor according to an embodiment of the present invention;
FIG. 5 is a schematic cross-sectional view of the image sensor of FIG. 4 along the A-A direction;
FIG. 6 is another schematic cross-sectional view of the image sensor of FIG. 4 taken along the A-A direction;
fig. 7 is a schematic block diagram of a processing circuit according to an embodiment of the present invention;
fig. 8 is a schematic vertical projection diagram of an embodiment of the present invention;
fig. 9 is a schematic vertical projection view of another embodiment of the present invention;
FIG. 10 is another schematic cross-sectional view of the image sensor of FIG. 4 taken along the A-A direction;
FIG. 11 is another cross-sectional view of the image sensor of FIG. 4 taken along the line A-A;
fig. 12 is a flowchart illustrating steps of a method for fabricating a light-gathering structure according to an embodiment of the present invention;
FIG. 13 is another schematic cross-sectional view of the image sensor of FIG. 4 taken along the A-A direction;
FIG. 14 is another schematic cross-sectional view of the image sensor of FIG. 4 taken along the A-A direction;
fig. 15 is another schematic cross-sectional view of the image sensor of fig. 4 along the A-A direction.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
In the description of the present invention, it is to be understood that the terms "center", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are merely for convenience of description and for simplicity of description, and do not indicate or imply that the device or element referred to must have a particular orientation, be constructed and operated in a particular orientation, and therefore, are not to be construed as limiting the present invention.
The terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless otherwise specified.
In the description of the present invention, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; they may be connected directly or indirectly through an intermediate medium, or the interiors of two elements may be in communication. The specific meaning of the above terms in the present invention can be understood in specific cases by those skilled in the art.
As shown in fig. 1, an embodiment of the present application provides an electronic device 300. The electronic device 300 may be an electronic product with an image acquisition function, such as a mobile phone, a tablet computer, an intelligent access control device, a product detection apparatus, a television, an intelligent wearable product (e.g., a smart watch or smart bracelet), a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, and the like. The embodiment of the present application does not particularly limit the specific form of the electronic device 300; the following description is provided merely for convenience of explanation.
Based on this, in some embodiments, when the electronic device 300 is a mobile phone, a tablet computer, or a product detection apparatus, a picture of an object or a processed product may be taken. Specifically, as shown in fig. 1, the electronic device 300 may include a housing 310 and a camera assembly 200, the camera assembly 200 being used to capture image information of an object.
As shown in fig. 2, the camera assembly 200 may include a lens assembly 210 and an image sensor 400, the lens assembly 210 includes a light-in side and a light-out side, and the image sensor 400 is disposed on the light-out side of the lens assembly 210. The light enters from the light-entering side of the lens assembly 210, and after exiting from the light-exiting side of the lens assembly 210, the light is received by the image sensor 400 and converted into an electrical signal, so as to realize the image capturing function of the electronic device 300.
Or, in other embodiments, the electronic device 300 may further identify the object to be detected by using an image acquisition manner. For example, when the electronic device 300 is a mobile phone or a smart gate, a human face may be recognized. In this case, as shown in fig. 3, the electronic device 300 further includes a transmitter 320. The emitter 320 is used to emit light, such as invisible light like near-infrared light or infrared light. The light emitted from the emitter 320 is reflected by an object to be detected (e.g., a human face) to the camera assembly 200, and the image sensor 400 in the camera assembly 200 receives the light reflected by the human face and converts the light into an electrical signal, so as to realize the recognition function of the electronic device 300.
As can be seen from the above, the electronic device 300 implements the image capturing function through the image sensor 400. In some embodiments of the present application, the image sensor 400 may be a Complementary Metal-Oxide Semiconductor (CMOS) image sensor, and the structure of the image sensor 400 is illustrated below.
For example, as shown in fig. 4, the image sensor 400 may include a plurality of pixel units 100, and the pixel units 100 may be capable of converting a received optical signal into an electrical signal.
In some examples, the plurality of pixel units 100 are arranged in an array, and for example, as shown in fig. 4, the plurality of pixel units 100 are arranged in a matrix. In other examples, the plurality of pixel units 100 may be arranged in other forms, such as a triangular form or a polygonal form. In still other examples, the plurality of pixel units 100 may be arranged in a scattered manner.
As can be understood, arranging the plurality of pixel units 100 in an array improves the regularity of their arrangement, reduces the space they occupy, increases the number of pixel units 100 that can be placed per unit volume, and facilitates the miniaturization of the image sensor 400.
It is understood that the pixel unit 100 can convert a received optical signal of visible light into an electrical signal, and can also convert a received optical signal of invisible light into an electrical signal. In this way, the electrical signal finally output by each pixel unit 100 is obtained, and analog-to-digital conversion is then performed on the electrical signal, so that the light information corresponding to that pixel unit 100 can finally be obtained. By acquiring the light information corresponding to each pixel unit 100, the image capturing function of the image sensor 400 can be realized.
Based on this, in order to enable the pixel unit 100 to realize photoelectric conversion, as shown in fig. 5 (fig. 5 is a schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), in some embodiments, the pixel unit 100 includes a semiconductor substrate 110, and the semiconductor substrate 110 may be a silicon substrate, a germanium substrate, a silicon-on-insulator substrate, a silicon carbide substrate, or another suitable semiconductor material. The photodiode 120 is disposed within the semiconductor substrate 110, and it is understood that the photodiode 120 is used to convert an optical signal into an electrical signal.
In some examples, as shown in fig. 5, the pixel unit 100 further includes a circuit structure layer 130; the circuit structure layer 130 includes an interlayer dielectric 132, and multilayer metal wiring 134 is embedded in the interlayer dielectric 132. Specifically, as shown by the arrow direction in fig. 5 (fig. 5 is a schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), when light of longer wavelength (e.g., near-infrared or infrared light) irradiates the photodiode 120, a part of the light is absorbed by the photodiode 120 and converted into an electrical signal, while another part of the light (shown by the dashed arrow in fig. 5) can pass through the photodiode 120a, reach the metal wiring 134a, and be reflected by the metal wiring 134a to the adjacent photodiode 120b. Optical crosstalk is thus generated between the adjacent photodiodes 120 (photodiode 120a and photodiode 120b), that is, between the adjacent pixel units 100 (pixel unit 100a and pixel unit 100b), which reduces the reliability of the pixel unit 100. This shortcoming of the related art is illustrated here with reference to fig. 5.
It is understood that the pixel unit 100a and the pixel unit 100b may have the same structure or different structures. The pixel units 100a and 100b in the embodiments of the present invention are only for convenience of describing two adjacent pixel units 100, and are not further limited.
As shown in fig. 6 (fig. 6 is another cross-sectional view of the image sensor in fig. 4 along the A-A direction), the photodiode 120 includes a light receiving surface 122, and the light receiving surface 122 faces the incident direction of the external light. It is understood that light irradiating the light receiving surface 122 can be converted into an electrical signal by the photodiode 120, and light irradiating other surfaces can also be converted into an electrical signal by the photodiode 120.
The principle by which the photodiode 120 converts an optical signal into an electrical signal is illustrated below. In some embodiments, different dopants, such as trivalent boron and pentavalent phosphorus, may be added to the semiconductor substrate 110, forming a P-type doped portion and an N-type doped portion. The P-type doped portion and the N-type doped portion can form a PN junction. When light irradiates the photodiode 120, photons carrying energy enter the PN junction and transfer their energy to bound electrons on the covalent bonds, causing some of the electrons to break free of the covalent bonds and thereby generating freely moving electrons and holes, referred to as photogenerated carriers. The electrons move to the N-type doped portion and the holes move to the P-type doped portion, so that the photodiode 120 converts an optical signal into an electrical signal. In some embodiments, an I-type (intrinsic) semiconductor layer may also be included between the P-type doped portion and the N-type doped portion.
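For context, the following standard semiconductor-optics relation (a textbook fact, not part of the patent text) connects this mechanism to the weak near-infrared absorption noted in the background: a photon can free a bound electron only if its energy is at least the bandgap energy of the semiconductor,

$$E_{\text{photon}} = \frac{hc}{\lambda} \ge E_g, \qquad \lambda_{\max} = \frac{hc}{E_g} \approx \frac{1240\ \text{eV·nm}}{1.12\ \text{eV}} \approx 1100\ \text{nm (silicon)} .$$

Near-infrared wavelengths lie close to this cutoff, where the absorption coefficient of silicon is small, so a significant fraction of such light traverses the photodiode 120 without being absorbed; this is the light that the light-condensing structure described below is intended to return.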
In some embodiments, different photodiodes 120 among the plurality of pixel units 100 are used to absorb light of different wavelengths, such as red, green, and blue light, and to convert the light of different wavelengths into different electrical signals.
As can be seen from the above, the photodiode 120 can convert an optical signal into an electrical signal. In some embodiments, the multilayer metal wiring 134 is electrically connected to the photodiode 120, so that after the photodiode 120 converts an optical signal into an electrical signal, the electrical signal can be transmitted to the outside of the pixel unit 100 through the multilayer metal wiring 134.
As shown in fig. 6 (fig. 6 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), the pixel unit 100 further includes a circuit structure layer 130, the circuit structure layer 130 includes an interlayer dielectric 132, and the multilayer metal wiring 134 is embedded in the interlayer dielectric 132.
In some examples, the circuit structure layer 130 is disposed on the side of the semiconductor substrate 110 away from the light receiving surface 122, so that light is prevented from being blocked by circuit structures such as the metal wiring 134 in the circuit structure layer 130; the intensity of the light reaching the photodiode 120 is thus ensured, and the performance of the pixel unit 100 is improved. Specifically, an image sensor 400 in which the circuit structure layer 130 is disposed on the side of the semiconductor substrate 110 away from the light receiving surface 122 may be referred to as a back-illuminated image sensor. In some embodiments, the interlayer dielectric 132 may be an insulating material such as silicon oxide or an oxynitride, and serves to isolate the multilayer metal wiring 134.
In some embodiments, the material of the metal wire 134 may be copper or aluminum, which ensures the electrical conductivity of the metal wire 134. The material of the multilayer metal wiring 134 may be the same or different. In some embodiments, the multi-layer metal wiring 134 may be embedded within the interlayer dielectric 132 by etching. Specifically, the multilayer metal wiring 134 is electrically connected to the photodiode 120, so that the electrical signals generated by the photodiode 120 under illumination can be transmitted to the outside through the metal wiring 134.
In some embodiments, as shown in fig. 6 (fig. 6 is another cross-sectional view of the image sensor along the A-A direction in fig. 4), the pixel unit 100 may further include a transfer gate 176. The transfer gate 176 is a complementary metal-oxide-semiconductor field-effect transistor (CMOS transistor). The transfer gate 176 includes a signal receiving terminal, a signal input terminal, and a signal output terminal. The signal receiving terminal is used for receiving a trigger signal, the signal input terminal is electrically connected to the photodiode 120, and the signal output terminal is electrically connected to other circuit structures, such as the metal wiring 134. When the signal receiving terminal of the transfer gate 176 receives the trigger signal, conduction is established between the signal input terminal and the signal output terminal, so that the electrical signal generated by the photodiode 120 can be transmitted to the metal wiring 134 through the transfer gate 176 and transmitted to the outside of the pixel unit 100 through the metal wiring 134.
In some embodiments, as shown in fig. 7, the pixel unit 100 may further include a processing circuit 190. The processing circuit 190 includes an analog-to-digital conversion circuit 192 and an amplification circuit 194. The processing circuit 190 may be electrically connected to the metal wiring 134 so as to be able to perform analog-to-digital conversion and amplification on the electrical signal from the photodiode 120.
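As an illustrative sketch only (the patent gives no circuit-level details, and every numerical value below, including the full-well capacity, conversion gain, amplifier gain, ADC depth and reference voltage, is an assumption for illustration), the signal path just described, from photogenerated charge through the transfer gate 176 and the amplification circuit 194 to the analog-to-digital conversion circuit 192, can be modeled as follows:

```python
# Hypothetical single-pixel readout model; the parameter values are assumptions
# for illustration and are not taken from the patent.

def read_pixel(photo_electrons: float,
               full_well: float = 10_000.0,       # assumed full-well capacity [electrons]
               conversion_gain: float = 60e-6,    # assumed charge-to-voltage gain [V per electron]
               analog_gain: float = 4.0,          # assumed gain of amplification circuit 194
               adc_bits: int = 10,                # assumed resolution of ADC circuit 192
               v_ref: float = 1.0) -> int:        # assumed ADC reference voltage [V]
    """Convert a photo-electron count into a digital output code."""
    electrons = min(photo_electrons, full_well)           # photodiode saturates at full well
    v_pixel = electrons * conversion_gain                 # charge transferred and converted to voltage
    v_amp = min(v_pixel * analog_gain, v_ref)             # amplification, clipped at the ADC range
    return round(v_amp / v_ref * (2 ** adc_bits - 1))     # quantization by the ADC

if __name__ == "__main__":
    for n in (100, 1_000, 5_000, 20_000):
        print(f"{n:>6} photo-electrons -> code {read_pixel(n)}")
```

The sketch only illustrates the order of operations (charge transfer, amplification, digitization); a real readout chain also involves reset, correlated double sampling and noise, which are outside the scope of this description.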
In order to reduce the optical crosstalk between two adjacent pixel units 100 (for example, the pixel unit 100a and the pixel unit 100b in fig. 5), as shown in fig. 6 (fig. 6 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), the pixel unit 100 provided by the embodiment of the utility model may further include a light-gathering structure 136. The light-gathering structure 136 is embedded in the interlayer medium 132. It can be understood that, as shown by the dotted arrow in fig. 6, the light-gathering structure 136 can reflect the light passing through the photodiode 120 back to the corresponding photodiode 120, so as to reduce the intensity of the light entering the adjacent pixel unit 100 and reduce the optical crosstalk generated between two adjacent pixel units.
Specifically, as shown in fig. 8, the vertical projection of the light-focusing structure 136 on the semiconductor substrate 110 overlaps at least a portion of the vertical projection of the photodiode 120 on the semiconductor substrate 110 (region D in fig. 8), that is, the arrangement position of the light-focusing structure 136 corresponds to the arrangement position of the photodiode 120, so that the light-focusing structure 136 can reflect the light passing through the photodiode 120 to the corresponding photodiode 120, and the reliability of the use of the pixel unit 100 is improved.
In some embodiments, as shown in fig. 9, the vertical projection of the photodiode 120 on the semiconductor substrate 110 falls within the vertical projection of the light-condensing structure 136 on the semiconductor substrate 110, so as to ensure the light-reflecting effect of the light-condensing structure 136, further reduce the crosstalk between two adjacent photodiodes 120, and improve the light-absorbing efficiency of the photodiodes 120.
Specifically, in some embodiments, as shown in FIG. 6 (FIG. 6 is another cross-sectional view of the image sensor along the A-A direction in FIG. 4), the optical refractive index of the semiconductor substrate 110 is β1, the optical refractive index of the light-condensing structure 136 is β2, and the optical refractive index of the interlayer medium 132 is β3, where β3 < β2 ≤ β1.
As shown by the direction of the dotted arrow in fig. 6 (fig. 6 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), after the light (e.g., near-infrared or infrared light) passing through the photodiode 120a enters the light-focusing structure 136, since the optical refractive index β2 of the light-focusing structure 136 is greater than the optical refractive index β3 of the interlayer medium 132, the light can be totally reflected at the contact surface between the light-focusing structure 136 and the interlayer medium 132. Moreover, since the optical refractive index β2 of the light-focusing structure 136 is not greater than the optical refractive index β1 of the semiconductor substrate 110 (i.e., β2 ≤ β1), the light in the light-focusing structure 136 can be irradiated into the semiconductor substrate 110 again; that is, the light that passed through the photodiode 120a can irradiate the photodiode 120a again and be absorbed, so that the absorption rate of the photodiode 120 for light (especially light of longer wavelength, such as near-infrared or infrared light) is improved, and the quantum efficiency for longer-wavelength light is enhanced.
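For reference, the total-reflection condition above can be stated quantitatively with the standard critical-angle relation (a textbook formula, not given in the patent): at the interface between the light-focusing structure 136 (refractive index β2) and the interlayer medium 132 (refractive index β3), a ray is totally reflected when its angle of incidence θ, measured from the interface normal, satisfies

$$\theta > \theta_c = \arcsin\!\left(\frac{\beta_3}{\beta_2}\right), \qquad \beta_3 < \beta_2 .$$

A larger ratio β2/β3 gives a smaller critical angle, so a larger share of the light transmitted through the photodiode 120a is turned back toward it; and because β2 ≤ β1, there is no corresponding total reflection when the returned light re-enters the semiconductor substrate 110, so it is not trapped inside the light-focusing structure 136.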
In addition, by providing the light-gathering structure 136 to reflect the light passing through the photodiode 120a to the photodiode 120a, the intensity of the light incident into the interlayer medium 132 can be reduced, and the light is prevented from being reflected to other photodiodes 120b by the metal wiring 134 in the circuit structure layer 130, so that the optical crosstalk between two adjacent photodiodes 120 is reduced, that is, the optical crosstalk between two adjacent pixel units 100 is reduced, and the use reliability of the image sensor 400 is improved.
As can be seen from the above description, the light-focusing structure 136 can reflect the light passing through the photodiode 120a to the photodiode 120a again. The material of the light collecting structure 136 will be described below by way of example.
In some embodiments, taking the semiconductor substrate 110 as a silicon substrate and the interlayer dielectric 132 as silicon dioxide as an example, semiconductor silicon (chemical formula Si) has an optical refractive index of about 3.42 and silicon dioxide (chemical formula SiO2) has an optical refractive index of about 1.45, so the optical refractive index of the light-concentrating structure 136 is greater than 1.45 and not greater than 3.42. In some embodiments, the light-concentrating structure 136 may be a metal oxide, such as iron sesquioxide (chemical formula Fe2O3, optical refractive index of about 2.9), titanium dioxide (chemical formula TiO2, optical refractive index of about 2.55), or magnesium oxide (chemical formula MgO, optical refractive index of about 1.76).
In other embodiments, the light-concentrating structure 136 may also be a non-metallic compound, such as calcium oxide (chemical formula CaO, optical refractive index of about 1.83), optical glass (optical refractive index of about 1.64), or calcium carbide (chemical formula CaC2, optical refractive index of about 1.75). In still other embodiments, the light-concentrating structure 136 may be a metal carbide or another compound or mixture whose refractive index satisfies the requirement.
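Purely as an illustrative check (a sketch using the standard critical-angle formula and the approximate refractive indices quoted above; the values are rounded figures for illustration, not measured data), the critical angle for total reflection against a silicon-dioxide interlayer dielectric can be computed for each candidate material:

```python
import math

# Approximate refractive indices quoted in the description above (illustrative only).
N_SIO2 = 1.45  # interlayer dielectric: silicon dioxide

candidates = {
    "Fe2O3 (iron sesquioxide)": 2.90,
    "TiO2 (titanium dioxide)":  2.55,
    "MgO (magnesium oxide)":    1.76,
    "CaO (calcium oxide)":      1.83,
    "optical glass":            1.64,
    "CaC2 (calcium carbide)":   1.75,
}

for name, n in candidates.items():
    # Total internal reflection toward the SiO2 side requires n > N_SIO2;
    # the critical angle is measured from the interface normal.
    theta_c = math.degrees(math.asin(N_SIO2 / n))
    print(f"{name:26s} n = {n:.2f}  critical angle ~ {theta_c:5.1f} deg")
```

All listed candidates have an index above 1.45, so a critical angle exists in each case; the higher-index oxides (Fe2O3, TiO2) give the smallest critical angles and therefore return light over the widest range of incidence angles.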
As can be seen from the above description, as shown in fig. 6 (fig. 6 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), in the pixel unit 100 according to the present invention, the light-focusing structure 136 is embedded in the interlayer medium 132, and the vertical projection of the light-focusing structure 136 on the semiconductor substrate 110 overlaps at least a portion of the vertical projection of the photodiode 120 on the semiconductor substrate 110; that is, the position where the light-focusing structure 136 is disposed corresponds to the position where the photodiode 120 is disposed. Moreover, the optical refractive index of the light-focusing structure 136 is greater than the optical refractive index of the interlayer medium 132 and not greater than that of the semiconductor substrate 110. Therefore, when light of longer wavelength (e.g., near-infrared or infrared light) irradiates the photodiode 120, a portion of the light is absorbed by the photodiode 120 and converted into an electrical signal, while another portion penetrates the photodiode 120, reaches the light-focusing structure 136, and is totally reflected at the contact surface between the light-focusing structure 136 and the interlayer medium 132. The light penetrating the photodiode 120 is thus reflected back to the corresponding photodiode 120, which improves the absorption rate of the photodiode 120 for light (particularly light of longer wavelength, e.g., near-infrared or infrared light), enhances the quantum efficiency for longer-wavelength light, improves the near-infrared imaging quality, and improves the use performance of the pixel unit 100.
Meanwhile, by arranging the light-condensing structure 136 to reflect the light passing through the photodiode 120 to the corresponding photodiode 120, the intensity of the light entering the interlayer medium 132 can be reduced, and the light is prevented from irradiating the rest of the photodiodes 120 under the reflection action of the metal wiring 134 or other structures in the interlayer medium 132, so that the optical crosstalk generated between two adjacent photodiodes 120 is reduced, that is, the optical crosstalk between two adjacent pixel units 100 is reduced, and the use reliability of the pixel units 100 is improved.
Alternatively, as shown in fig. 6 (fig. 6 is another schematic cross-sectional view of the image sensor in fig. 4 along the A-A direction), the surface of the light-focusing structure 136 close to the semiconductor substrate 110 is flush with the surface of the circuit structure layer 130 close to the semiconductor substrate 110.
It is understood that the surface of the light-concentrating structure 136 close to the semiconductor substrate 110 and the surface of the circuit structure layer 130 close to the semiconductor substrate 110 may be completely flush or approximately flush.
Specifically, the surface of the light-gathering structure 136 close to the semiconductor substrate 110 is flush with the surface of the circuit structure layer 130 close to the semiconductor substrate 110, so that the phenomenon that most of light passing through the photodiode 120 is reflected by the metal wiring 134 and cannot enter the light-gathering structure 136 due to an excessively large distance between the light-gathering structure 136 and the photodiode 120 is avoided. Through the arrangement, the intensity of light entering the interlayer medium 132 is further reduced, so that the crosstalk between two adjacent pixel units 100 is reduced, and the use reliability of the pixel units 100 is improved.
Moreover, the surface of the light-gathering structure 136 close to the semiconductor substrate 110 is flush with the surface of the circuit structure layer 130 close to the semiconductor substrate 110, so that the processing convenience of the pixel unit 100 can be improved, and the production cost can be reduced.
Alternatively, as shown in fig. 6 (fig. 6 is another schematic cross-sectional view of the image sensor in fig. 4 along the A-A direction), the cross section of the light-focusing structure 136 is trapezoidal or rectangular, and the cross section of the light-focusing structure 136 is perpendicular to the semiconductor substrate 110 and the circuit structure layer 130.
The cross section of the light-focusing structure 136 is perpendicular to the semiconductor substrate 110 and the circuit structure layer 130, that is, the cross section of the light-focusing structure 136 along the direction from the semiconductor substrate 110 to the circuit structure layer 130 is trapezoidal or rectangular, so that the reflection requirements for light rays with different incident angles and different wavelengths can be met, and the use flexibility of the pixel unit 100 is improved.
In some embodiments, when the cross section of the light-gathering structure 136 is a trapezoid, the area of the trapezoid between the plurality of pixel units 100 may be the same or different. When the cross section of the light collecting structure 136 is rectangular, the areas of the rectangles between the plurality of pixel units 100 may be the same or different.
In some embodiments, when the cross section of the light-focusing structure 136 is a trapezoid, the light-focusing structure 136 may have a circular truncated cone shape or a truncated pyramid shape. When the cross section of the light-condensing structure 136 is rectangular, the light-condensing structure 136 may be a rectangular parallelepiped, a cylinder, or the like.
Alternatively, as shown in fig. 6 (fig. 6 is another schematic cross-sectional view of the image sensor in the A-A direction in fig. 4), in the case that the cross section of the light-gathering structure 136 is a trapezoid, the trapezoid includes an upper base and a lower base, the upper base is parallel to the lower base, and the length of the upper base is smaller than that of the lower base. The lower base is flush with the surface of the circuit structure layer 130 on the side close to the semiconductor substrate 110, and the upper base is away from the semiconductor substrate 110. It will be appreciated that the upper and lower bases of the trapezoid are parallel to one another, the lower base being longer than the upper base.
The lower base of the trapezoid is flush with the surface of the circuit structure layer 130 close to the semiconductor substrate 110, so that the area of the side of the light-gathering structure 136 close to the photodiode 120 is increased; light is further ensured to reach the light-gathering structure 136 after passing through the photodiode 120, and the intensity of the light entering the light-gathering structure 136 is increased, which reduces the crosstalk generated between two adjacent photodiodes 120 and improves the reliability of the pixel unit 100.
Moreover, since the upper base of the trapezoid is away from the semiconductor substrate 110, the area of the side of the light-condensing structure 136 away from the photodiode 120 can be smaller than the area of the side close to the photodiode 120, which improves the reflection effect of the light-condensing structure 136 on light, increases the intensity of the light re-irradiated onto the corresponding photodiode 120, and improves the quantum efficiency for longer-wavelength light.
As can be seen from the above, the light-concentrating structure 136 is embedded in the interlayer medium 132 of the circuit structure layer 130. Based on this, in order to implement the above structure, as shown in fig. 10 (fig. 10 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), optionally, an accommodating hole 138 may be formed on the surface of the circuit structure layer 130 close to the semiconductor substrate 110, and the light-focusing structure 136 may be embedded in the accommodating hole 138.
It is understood that the receiving hole 138 may have a circular truncated cone shape, a cylindrical shape, or the like, and the shape of the receiving hole 138 is adapted to the shape of the light-collecting structure 136. The receiving hole 138 is kept clear of the metal wiring 134, so that the receiving hole 138 does not affect the transmission of the electrical signal, and the reliability of the pixel unit 100 is improved.
In some embodiments, as shown in fig. 10 (fig. 10 is another cross-sectional view of the image sensor along the A-A direction in fig. 4), the depth of the receiving hole 138 may be the same as the depth of the circuit structure layer 130. In other embodiments, the depth of the receiving hole 138 may be smaller than the depth of the circuit structure layer 130. In some embodiments, the depth of the receiving hole 138 may be the same or different between different pixel units 100.
Alternatively, as shown in fig. 11 (fig. 11 is another cross-sectional view of the image sensor along the A-A direction in fig. 4), the light-condensing structure 136 includes a metal layer 142 and a light-condensing structure body 144, the metal layer 142 covers an inner wall 152 of the accommodating hole 138, and the light-condensing structure body 144 is embedded in the accommodating hole 138.
It is understood that, as shown by the direction of the dotted arrow in fig. 11 (fig. 11 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), after the light passing through the photodiode 120 reaches the light-focusing structure body 144, since the optical refractive index of the light-focusing structure body 144 is greater than that of the interlayer medium 132 and not greater than that of the semiconductor substrate 110, the light can be totally reflected at the contact surface between the light-focusing structure body 144 and the interlayer medium 132. However, owing to differences in incident angle, part of the light may still pass through the light-focusing structure body 144. Therefore, the metal layer 142 is disposed to cover the inner wall 152 of the accommodating hole 138, so that the metal layer 142 is located between the light-focusing structure body 144 and the interlayer medium 132 and blocks this light.
It can be understood that at least part of the light irradiated to the metal layer 142 can be blocked and reflected by the metal layer 142, so that the light is further prevented from passing through the light-condensing structure body 144 to irradiate and enter the interlayer medium 132, the light-reflecting effect of the light-condensing structure 136 on the light is improved, the optical crosstalk generated between two adjacent pixel units 100 is further reduced, and the use reliability of the pixel units 100 is improved.
In some embodiments, as shown in fig. 12, the method for manufacturing the light-focusing structure 136 includes steps S1 to S4:
step S1, forming a plurality of layers of metal wiring lines and an interlayer medium positioned between two adjacent layers of metal wiring lines;
specifically, a layer of interlayer dielectric 132 may be formed first, and then a layer of metal may be laid on the interlayer dielectric 132. Next, a metal wiring 134 is formed by removing a portion of the metal through an etching process. By repeating the above steps, a plurality of metal wires 134 and an interlayer dielectric 132 between two adjacent metal wires 134 may be formed. In addition, a via hole penetrating the interlayer dielectric 132 may be formed on the interlayer dielectric 132, and the via hole may electrically connect the adjacent metal wirings 134.
Step S2, forming an accommodating hole at one side of the structure formed in the step S1, which is close to the semiconductor substrate;
specifically, the receiving hole 138 may be opened in the structure formed in step S1 by etching. As can be appreciated, the receiving hole 138 is clear of the metal wiring 134.
S3, coating a metal layer on the inner wall of the accommodating hole;
and S4, embedding the light-gathering structure body into the accommodating hole.
By forming the accommodating hole 138 in the surface of the circuit structure layer 130 close to the semiconductor substrate 110 after the processing circuit 190 and the multilayer metal wiring 134 have been etched, the influence of etching the metal wiring 134 on the light-gathering structure 136 is avoided, further improving the reliability of the pixel unit 100. In some embodiments, the metal layer 142 and the metal wiring 134 may be made of the same material or different materials.
In step S3, when the metal layer 142 is coated on the inner wall 152 of the accommodating hole 138, a part of the inner wall 152 may be coated, as shown in fig. 13 (fig. 13 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), or all of the inner wall 152 may be coated, as shown in fig. 14 (fig. 14 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4).
Alternatively, as shown in fig. 15 (fig. 15 is another schematic cross-sectional view of the image sensor in fig. 4 along the A-A direction), the inner wall 152 of the accommodating hole 138 includes a side wall 154 and a bottom wall 156, and the metal layer 142 covers at least one of the side wall 154 and the bottom wall 156.
It is understood that the sidewalls 154 of the receiving hole 138 are connected to the surface of the circuit structure layer 130 close to the semiconductor substrate 110, and the bottom wall 156 of the receiving hole 138 is far away from the semiconductor substrate 110. The metal layer 142 covers the bottom wall 156 or the side wall 154, or covers the bottom wall 156 and the side wall 154, so that the blocking and reflecting effects of the metal layer 142 on light rays are ensured, and the reflecting performance of the light condensing structure 136 on the light rays is improved. The provision of the metal layer 142 covering at least one of the bottom wall 156 and the side wall 154 improves the flexibility of use of the pixel cell 100 to meet different requirements.
In some embodiments, as shown in fig. 13 (fig. 13 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), when the depth of the receiving hole 138 is the same as the depth of the circuit structure layer 130, the surface of the metal layer 142 covering the bottom wall 156 on the side away from the semiconductor substrate 110 is flush with the surface of the circuit structure layer 130 on the side away from the semiconductor substrate 110, which improves the structural regularity of the pixel unit 100.
Alternatively, as shown in fig. 11 (fig. 11 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), when the metal layer 142 covers the bottom wall 156, the metal layer 142 and the metal wiring 134 are the same material in the same layer.
It is to be understood that the same layer refers to a layer structure formed by forming a film layer for forming a specific pattern using the same film formation process and then performing a patterning process once using the same mask plate. Depending on the specific pattern, the same patterning process may include multiple exposure, development or etching processes, and the specific pattern in the layer structure may be continuous or discontinuous, and the specific patterns may be at different heights or have different thicknesses.
When the metal layer 142 covers the bottom wall 156, the same material is used for the metal layer 142 and the metal wire 134, which improves the processing convenience of the pixel unit 100, shortens the processing time of the pixel unit 100, and reduces the production cost of the pixel unit 100.
As can be seen from the above, the photodiode 120 can convert an optical signal into an electrical signal. In order to enable the electronic device 300 to generate a color image and increase the intensity of the light irradiating the pixel unit 100, optionally, as shown in fig. 13 (fig. 13 is another cross-sectional view of the image sensor in the A-A direction in fig. 4), the pixel unit 100 may further include a filter 162 and a microlens 164. The filter 162 is disposed on the side of the semiconductor substrate 110 close to the light receiving surface 122 and covers the light receiving surface 122. The microlens 164 is disposed on the side of the filter 162 away from the semiconductor substrate 110.
In some embodiments, the optical filters 162 of the plurality of pixel units 100 are used to filter light of different wavelengths respectively. Specifically, the plurality of pixel units 100 may include filters 162 for red light, green light, and blue light, respectively, so that the red light, the green light, and the blue light can each reach a different photodiode 120, thereby realizing the synthesis of light of a plurality of different colors so that the electronic device 300 can generate a color image.
It can be understood that the micro lens 164 is used to gather light, so as to increase the intensity of the light irradiated to the light receiving surface 122 of the photodiode 120, so that the pixel unit 100 can normally operate in an environment with poor light, and the applicability of the pixel unit 100 is improved.
In some embodiments, the number of microlenses 164, photodiodes 120, and filters 162 may or may not be the same. In some embodiments, the microlens 164 may be a combination of various lens elements, such as spherical lenses, aspherical lenses, and prisms, which ensures the light-condensing performance of the microlens 164.
In a second aspect, as shown in fig. 4, an embodiment of the present invention provides an image sensor 400, where the image sensor 400 includes a plurality of pixel units 100 as described above, and the plurality of pixel units 100 are arranged in an array.
The image sensor 400 provided by the embodiment of the present invention includes a plurality of pixel units 100 as described above, so that all the above-mentioned advantages are achieved, and the description thereof is omitted.
In some embodiments, image sensor 400 is a back-illuminated image sensor.
As can be seen from the above, the light-condensing structure 136 can reduce the optical crosstalk generated between two adjacent pixel units 100. Optionally, as shown in fig. 10 (fig. 10 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), the pixel unit 100 may further include a spacer 166. The spacer 166 is disposed at least between the pixel units 100.
It is understood that the spacer 166 is used for isolating light, so that the crosstalk of light generated between adjacent pixel units 100 can be further reduced, and the reliability of the image sensor 400 can be improved. In some embodiments, the spacer 166 may be a metal or a non-metal.
In some embodiments, as shown in fig. 10 (fig. 10 is another cross-sectional view of the image sensor along the A-A direction in fig. 4), the spacer 166 includes a first spacer 172 and a second spacer 174. The first spacer 172 is disposed between two filters 162 to block light. Specifically, the first spacer 172 may be a metal grid. The second spacer 174 is disposed between two photodiodes 120 and serves to block or reflect light. Specifically, a deep isolation trench may be opened in the semiconductor substrate 110 between the two photodiodes 120, and the second spacer 174 may be embedded in the deep isolation trench. In some embodiments, the second spacer 174 may be a metal, a compound, a combination of reflective materials, or the like.
In a third aspect, as shown in fig. 2, an embodiment of the invention provides a camera assembly 200. The camera assembly 200 includes a lens assembly 210 and the image sensor 400 as described above. The image sensor 400 is disposed on the light-exit side of the lens assembly 210.
The camera assembly 200 provided by the embodiment of the present invention includes the image sensor 400, so that all the above advantages are achieved, and are not described herein again.
It will be appreciated that the camera assembly 200 is used to take pictures or video, and the lens assembly 210 is a combination of convex and concave lenses. The lens assembly 210 has a light-entrance side and a light-exit side: light enters the lens assembly 210 from the light-entrance side and exits from the light-exit side. The image sensor 400 is disposed on the light-exit side of the lens assembly 210 so as to receive the light from the lens assembly 210 and convert the optical signal into an electrical signal.
In some embodiments, the number of the image sensors 400 may be one or more. The number of image sensors 400 and the number of lens assemblies 210 may be the same or different.
In a fourth aspect, as shown in fig. 1, an embodiment of the invention provides an electronic device 300. Electronic device 300 includes a housing 310 and a camera assembly 200. The housing 310 defines a first through hole, and at least a portion of the camera assembly 200 is inserted into the first through hole.
The embodiment of the present invention provides an electronic device 300 including the above-mentioned camera assembly 200, so that all the above-mentioned advantages are achieved, and details are not repeated herein.
In some embodiments, the electronic device 300 may be a mobile phone, a computer, or a digital camera. The number of camera assemblies 200 may be one or more, and the number of first through holes is the same as the number of camera assemblies 200.
It will be appreciated that at least a portion of the camera assembly 200 is embedded within the first through hole such that light can illuminate the image sensor 400 through the first through hole to effect conversion of the optical signal to an electrical signal.
Optionally, as shown in fig. 3, the housing 310 is further provided with a second through hole. The electronic device 300 further comprises a transmitter 320. The emitter 320 is disposed in the second through hole, the emitter 320 is configured to emit light to the object to be measured, and the camera assembly 200 is configured to receive the light reflected by the object to be measured.
In some embodiments, the emitter 320 is configured to emit invisible light with a longer wavelength, such as near-infrared light or infrared light. The invisible light reflected by the object to be detected is received by the camera assembly 200, so that the object can be identified or detected even when the light available to the camera assembly 200 is poor. This improves the accuracy with which the camera assembly 200 identifies or detects the object and improves the reliability of the electronic device 300.
In some embodiments, the number of emitters 320 may be one or more, and the number of second through holes is the same as the number of emitters 320. When a plurality of emitters 320 are provided, invisible light can be emitted toward the object to be detected from different positions, which improves the accuracy of identifying or detecting the object. It will be appreciated that the number of camera assemblies 200 and the number of emitters 320 may be the same or different.
In one embodiment, as shown in fig. 4, an image sensor 400 is provided, in particular, the image sensor 400 is a back-illuminated CMOS image sensor.
The image sensor 400 includes a plurality of pixel units 100 arranged in an array, as shown in fig. 6 (fig. 6 is another schematic cross-sectional view of the image sensor in fig. 4 along the A-A direction). The pixel units 100 include a semiconductor substrate 110, and the semiconductor substrate 110 is a silicon substrate. The photodiode 120 is disposed within the silicon substrate, and the photodiode 120 is a PN type photodiode. The circuit structure layer 130 is disposed on a side of the photodiode 120 away from the light receiving surface 122. The circuit structure layer 130 includes an interlayer dielectric 132, and the interlayer dielectric 132 is silicon oxide. Metal wiring 134 is embedded in the interlayer dielectric 132 by means of etching. The photodiode 120 is electrically connected to the metal wiring 134 so that an electrical signal generated by the photodiode 120 can be transmitted to the outside through the metal wiring 134.
The pixel unit 100 further includes a transfer gate 176, and the transfer gate 176 is a CMOS transistor. The transfer gate 176 has a signal receiving terminal for receiving a trigger signal, a signal input terminal electrically connected to the photodiode 120, and a signal output terminal electrically connected to the metal wiring 134. When the signal receiving terminal of the transfer gate 176 receives the trigger signal, the signal input terminal and the signal output terminal are connected, and the electrical signal generated by the photodiode 120 under illumination is transmitted to the metal wiring 134 through the transfer gate 176 and then to the outside of the pixel unit 100 through the metal wiring 134.
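For readers more comfortable with a behavioral description, the readout sequence above can be summarized with a minimal, purely illustrative Python sketch. This model is not part of the patent; the class and method names (PixelReadout, integrate, transfer) are hypothetical, and the transfer gate is idealized as a simple switch.

```python
class PixelReadout:
    """Idealized behavioral model of one pixel unit's readout path."""

    def __init__(self):
        self.photodiode_charge = 0.0    # charge accumulated by the photodiode
        self.metal_wiring_signal = 0.0  # signal placed on the metal wiring

    def integrate(self, light_intensity, exposure_time):
        # The photodiode converts incident light into charge during exposure.
        self.photodiode_charge += light_intensity * exposure_time

    def transfer(self, trigger):
        # When the signal receiving terminal gets the trigger signal, the
        # signal input and output terminals are connected and the charge is
        # handed to the metal wiring; otherwise nothing is transferred.
        if trigger:
            self.metal_wiring_signal = self.photodiode_charge
            self.photodiode_charge = 0.0
        return self.metal_wiring_signal


# Example: expose the pixel, then read it out with a trigger pulse.
pixel = PixelReadout()
pixel.integrate(light_intensity=3.0, exposure_time=0.01)
print(pixel.transfer(trigger=True))  # 0.03 is passed to the metal wiring
```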
The light-condensing structure 136 is embedded within the interlayer medium 132 and is disposed opposite the photodiode 120. Specifically, as shown in fig. 10 (fig. 10 is another schematic cross-sectional view of the image sensor in fig. 4 along the A-A direction), the circuit structure layer 130 has an accommodating hole 138 on a surface thereof close to the silicon substrate, and as shown in fig. 11 (fig. 11 is another schematic cross-sectional view of the image sensor in fig. 4 along the A-A direction), the light-condensing structure 136 includes a metal layer 142 and a light-condensing structure body 144. The metal layer 142 covers the bottom wall 156 of the accommodating hole 138, the light-condensing structure body 144 is embedded in the accommodating hole 138, and the surface of the light-condensing structure body 144 close to the silicon substrate is flush with the surface of the circuit structure layer 130 close to the silicon substrate.
As shown in fig. 13 (fig. 13 is another schematic cross-sectional view of the image sensor in fig. 4 along the A-A direction), the filter 162 is disposed on one side of the light receiving surface 122 and covers the light receiving surface 122. Specifically, the image sensor 400 includes a plurality of filters 162, and the plurality of filters 162 are respectively used to filter light of different wavelengths, so that red light, green light, and blue light can be respectively irradiated to the photodiodes 120 through the filters 162. The microlenses 164 are disposed on a side of the filters 162 away from the silicon substrate, and the microlenses 164 are used to focus light.
Specifically, as shown by the arrows in fig. 6 (fig. 6 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), external light is converged by the microlens 164, filtered by the filter 162, and then irradiates the photodiode 120. A part of the light reaching the photodiode 120 is absorbed, converted into an electrical signal, and transmitted to the outside of the pixel unit 100 through the metal wiring 134. Another part of the light (indicated by the dashed arrow in fig. 6), such as near-infrared or infrared light with a longer wavelength, can pass through the photodiode 120 into the light-condensing structure body 144. Because the optical refractive index of the light-condensing structure body 144 is greater than that of the interlayer medium 132, this light can be totally reflected at the interface between the light-condensing structure body 144 and the interlayer medium 132; and because the refractive index of the light-condensing structure body 144 is not greater than that of the silicon substrate, the reflected light can irradiate the corresponding photodiode 120 again. In addition, the metal layer 142 covering the bottom wall 156 shields light and further prevents the light in the light-condensing structure body 144 from entering the interlayer medium 132.
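As a rough illustration of the total-reflection condition described above (the specific index values are generic material figures, not taken from the patent), rays striking the interface between the light-condensing structure body 144 and the interlayer medium 132 at an angle larger than the critical angle are reflected back toward the photodiode 120:

$$\theta_c = \arcsin\!\left(\frac{n_{\mathrm{ILD}}}{n_{\mathrm{body}}}\right), \qquad n_{\mathrm{ILD}} < n_{\mathrm{body}} \le n_{\mathrm{Si}}.$$

For example, with $n_{\mathrm{ILD}} \approx 1.45$ (silicon oxide) and a hypothetical $n_{\mathrm{body}} \approx 2.0$, the critical angle is about 46 degrees, so much of the obliquely incident light is totally reflected instead of leaking into the interlayer medium 132.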
By arranging the light-gathering structure body 144 and the metal layer 142, the absorption efficiency of the photodiode 120 for long-wavelength light (e.g., near-infrared light or infrared light) is improved, so that the quantum efficiency of light with longer wavelength is improved, the near-infrared imaging quality is improved, and the usability of the pixel unit 100 is ensured.
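One hedged way to quantify why the extra reflected pass helps with long wavelengths (the numbers below are generic silicon ballpark values, not from the patent): the fraction of light absorbed over a path length $d$ in the photodiode follows the Beer-Lambert law,

$$A(d) = 1 - e^{-\alpha(\lambda)\,d},$$

where $\alpha(\lambda)$ is the absorption coefficient of silicon, which drops sharply at near-infrared wavelengths. Taking $\alpha$ on the order of $10^{2}\,\mathrm{cm}^{-1}$ around 940 nm and a photodiode only a few micrometres deep, a single pass absorbs only a few percent of the light; reflecting the transmitted light back through the photodiode 120 roughly doubles the effective path to $2d$ and thus roughly doubles the absorbed fraction, which is the quantum-efficiency gain referred to above.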
Moreover, by providing the light-gathering structure body 144 and the metal layer 142, the intensity of light entering the interlayer medium 132 can be reduced, light is prevented from entering other photodiodes 120 under the reflection action of the metal wiring 134 or other circuit structures in the interlayer medium 132, optical crosstalk between two adjacent photodiodes 120 is reduced, that is, optical crosstalk between two adjacent pixel units 100 is reduced, and the use reliability of the pixel units 100 is improved.
One surface of the light-gathering structure body 144 close to the silicon substrate is flush with one surface of the circuit structure layer 130 close to the silicon substrate. This prevents the distance between the light-gathering structure body 144 and the photodiode 120 from becoming too large, which would otherwise keep light that has passed through the photodiode 120 from entering the light-gathering structure body 144. The flush arrangement therefore further reduces the intensity of the light entering the interlayer medium 132 and reduces the optical crosstalk between two adjacent pixel units 100.
It is understood that the light-gathering structure body 144 can be a metal oxide, a non-metal oxide, a metal nitride, a non-metal compound or a mixture, etc., which improves the flexibility of the pixel unit 100.
As shown in fig. 11 (fig. 11 is another schematic cross-sectional view of the image sensor in fig. 4 along the A-A direction), the cross section of the light-gathering structure 136 in the direction from the silicon substrate to the circuit structure layer 130 is trapezoidal; the lower base of the trapezoid is flush with the surface of the circuit structure layer 130 close to the silicon substrate, and the upper base of the trapezoid faces away from the silicon substrate. This further increases the intensity of the light entering the light-gathering structure 136 and allows the light-gathering structure 136 to reflect that light back toward the photodiode 120, which reduces the optical crosstalk between two adjacent pixel units 100 and improves the reliability of the pixel units 100.
As shown in fig. 10 (fig. 10 is another schematic cross-sectional view of the image sensor along the A-A direction in fig. 4), the image sensor 400 further includes a first spacer 172 and a second spacer 174. Specifically, the first spacer 172 is a metal grid disposed between the two optical filters 162, and reduces optical crosstalk between the two optical filters 162. A backside deep trench isolation trench is formed in the silicon substrate between the two photodiodes 120, and the second spacer 174 is embedded in the backside deep trench isolation trench, so as to reduce optical crosstalk between the two photodiodes 120, thereby further improving the reliability of the image sensor 400. Specifically, the second spacer 174 is made of metal.
The particular features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
The above description is only for the specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto, and any person skilled in the art can easily think of the changes or substitutions within the technical scope of the present invention, and all should be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (13)

1. A pixel cell, comprising:
a semiconductor substrate;
a photodiode disposed within the semiconductor substrate, the photodiode having a light receiving surface;
a circuit structure layer arranged on a side of the semiconductor substrate away from the light receiving surface, the circuit structure layer comprising an interlayer medium, a plurality of layers of metal wires embedded in the interlayer medium, and a light-concentrating structure embedded in the interlayer medium, the plurality of layers of metal wires being electrically connected with the photodiode;
wherein a vertical projection of the light-concentrating structure on the semiconductor substrate overlaps at least a portion of a vertical projection of the photodiode on the semiconductor substrate;
at least a portion of the light concentrating structures have an optical refractive index greater than an optical refractive index of the interlayer medium and not greater than an optical refractive index of the semiconductor substrate.
2. The pixel unit according to claim 1, wherein a surface of the light-concentrating structure on a side close to the semiconductor substrate is flush with a surface of the circuit structure layer on a side close to the semiconductor substrate.
3. The pixel unit according to claim 2, wherein a cross section of the light-focusing structure, taken perpendicular to the semiconductor substrate and the circuit structure layer, is trapezoidal or rectangular.
4. The pixel unit according to claim 3, wherein in a case that a cross section of the light-condensing structure is a trapezoid, the trapezoid includes an upper base and a lower base, the upper base is parallel to the lower base, a length of the upper base is smaller than a length of the lower base, the lower base is flush with a surface of the circuit structure layer on a side close to the semiconductor substrate, and the upper base is far away from the semiconductor substrate.
5. The pixel unit according to any one of claims 1 to 4, wherein a surface of the circuit structure layer on a side close to the semiconductor substrate is provided with a receiving hole, and the light-condensing structure comprises:
a metal layer covering an inner wall of the accommodation hole;
and the light condensation structure body is embedded in the accommodating hole, and the light refractive index of the light condensation structure body is greater than that of the interlayer medium and is not greater than that of the semiconductor substrate.
6. The pixel cell of claim 5, wherein the inner wall of the receiving hole includes a sidewall and a bottom wall, and the metal layer covers at least one of the sidewall and the bottom wall.
7. The pixel cell of claim 6, wherein the metal layer is the same material as the metal wire when the metal layer covers the bottom wall.
8. The pixel cell of claim 1, further comprising:
the optical filter is arranged on one side, close to the light receiving surface, of the semiconductor substrate and covers the light receiving surface;
and the micro lens is arranged on one side of the optical filter, which is far away from the semiconductor substrate.
9. An image sensor comprising a plurality of pixel cells according to any one of claims 1 to 8, the plurality of pixel cells being arranged in an array.
10. The image sensor of claim 9, further comprising:
and the separator is arranged between at least two pixel units.
11. A camera assembly, comprising:
a lens assembly;
the image sensor of claim 9 or 10, disposed on an exit side of the lens assembly.
12. An electronic device, comprising:
the shell is provided with a first through hole;
the camera assembly of claim 11, at least a portion of the camera assembly being embedded within the first through-hole.
13. The electronic device of claim 12, wherein the housing further defines a second through hole, the electronic device further comprising:
the emitter is arranged in the second through hole and used for emitting light to an object to be measured, and the camera assembly is used for receiving the light reflected by the object to be measured.