WO2020116088A1 - Semiconductor device and imaging device - Google Patents


Publication number
WO2020116088A1
Authority
WO
WIPO (PCT)
Prior art keywords
conductor
image
chip
semiconductor chip
substrate
Application number
PCT/JP2019/043904
Other languages
English (en)
Japanese (ja)
Inventor
俊介 矢守
麗 高森
Original Assignee
Sony Semiconductor Solutions Corporation (ソニーセミコンダクタソリューションズ株式会社)
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2020116088A1

Classifications

    • H — ELECTRICITY
    • H01 — ELECTRIC ELEMENTS
    • H01L — SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 — Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 — Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 — Devices controlled by radiation
    • H01L27/146 — Imager structures
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 — Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60 — Noise processing, e.g. detecting, correcting, reducing or removing noise

Definitions

  • The present disclosure relates to a semiconductor device and an imaging device. More specifically, it relates to a semiconductor device on which a semiconductor chip having a light receiving surface for generating an image signal is mounted, and to an imaging device including the semiconductor device.
  • imaging devices such as a digital still camera, a digital video camera (for example, a camera-integrated recorder) that captures an image of a subject and generates image data, a surveillance camera, a smartphone, and a mobile phone with a camera have been widely used.
  • As a semiconductor device included in these image pickup devices, there is, for example, one on which a semiconductor chip having a light receiving surface for generating an image signal is mounted.
  • a driving member such as a lens driving mechanism is often arranged near such a semiconductor device.
  • the operation of the driving member may cause noise in the image signal generated by the semiconductor chip.
  • For example, a lens holder driving device has been proposed that includes an electromagnetic wave shield member which shields the electromagnetic wave, caused by the current supplied to the camera shake correction coil, from being radiated below the flexible printed circuit board (see, for example, Patent Reference 1).
  • a driving member such as a lens driving mechanism may be arranged near a semiconductor chip (having a light receiving surface for generating an image signal).
  • In this case, the magnetic flux from the driving member near the semiconductor chip may cause noise in the image signal generated by the semiconductor chip. Further, noise caused by magnetic flux based on a signal transmitted through a bonding wire used for mounting the semiconductor chip may also become a problem.
  • the present disclosure has been made in view of the above-mentioned problems, and has an object to reduce noise generated in a semiconductor device.
  • a first aspect of the present disclosure is a semiconductor device in which a conductor is arranged on a portion other than the light receiving surface of a semiconductor chip having a light receiving surface for generating an image signal.
  • the conductor may be arranged so as to cover the outer surface of the semiconductor chip.
  • the conductor may be arranged so as to cover the entire outer surface of the semiconductor chip.
  • The semiconductor chip may have a substantially rectangular shape in a top view, and the conductor may be arranged in a square shape in a top view so as to cover the entire outer surface of the semiconductor chip.
  • the conductor may be arranged between the substrate on which the semiconductor chip is mounted and the semiconductor chip.
  • the semiconductor chip may be mounted on the upper side of the conductor.
  • the conductor may be an adhesive used to mount the semiconductor chip.
  • Further, in the first aspect, the semiconductor chip may have a substantially rectangular shape in a top view, and the conductor may include a first conductor arranged between the substrate on which the semiconductor chip is mounted and the semiconductor chip, and a second conductor arranged so as to cover the outer surface of the semiconductor chip. In this case, the first conductor may have a substantially rectangular shape in a top view, the second conductor may be arranged in a square shape in a top view so as to cover all of the outer side surfaces of the semiconductor chip, and the semiconductor chip may be mounted on the upper side of the first conductor.
  • Further, in a top view, the size of the first conductor may be larger than the size of the square-shaped second conductor, and the second conductor may be arranged on the upper side of the first conductor.
  • the semiconductor chip may be flip-chip connected to a substrate on which the semiconductor chip is mounted, and the conductor may be arranged so as to cover an outer surface of the semiconductor chip.
  • the conductor may be arranged between the substrate and the semiconductor chip.
  • the conductor may be arranged so as to avoid the solder that electrically connects the substrate and the semiconductor chip.
  • the conductor may not be connected to the ground potential of the semiconductor chip.
  • A second aspect of the present disclosure is an imaging device including: a semiconductor device in which a conductor is arranged on a portion other than the light receiving surface of a semiconductor chip having a light receiving surface for generating an image signal; and a processing circuit that processes the image signal generated by the semiconductor device.
  • Adopting such an aspect has the effect of reducing noise in the image signal generated by the semiconductor chip, owing to the conductor arranged on the portion other than the light receiving surface of the semiconductor chip.
  • FIG. 3 is a perspective view showing an example of an external configuration of the image sensor 1 according to the first embodiment of the present disclosure.
  • FIG. 4 is a diagram for explaining the effect of conductors 131 and 132 of the image sensor 1 according to the first embodiment of the present disclosure.
  • FIG. 4 is a diagram for explaining the effect of conductors 131 and 132 of the image sensor 1 according to the first embodiment of the present disclosure.
  • A diagram showing an example of a method of manufacturing the image sensor 1 according to the first embodiment of the present disclosure.
  • A perspective view and a cross-sectional view showing a configuration example of an image sensor 200 according to a second embodiment of the present disclosure.
  • A diagram showing an example of a method of manufacturing the image sensor 200 according to the second embodiment of the present disclosure.
  • A diagram showing an example of a method of manufacturing the image sensor 200 according to the second embodiment of the present disclosure.
  • A diagram showing an example of a method of manufacturing an image sensor 300 according to a third embodiment of the present disclosure.
  • A diagram showing an example of a method of manufacturing the image sensor 300 according to the third embodiment of the present disclosure.
  • FIG. 16 is a cross-sectional view showing an example of a configuration of a solder 551, an underfill 552, and conductors 561 and 562 provided on a substrate 510 in the image sensor 500 according to the fourth embodiment of the present disclosure.
  • A cross-sectional view showing a configuration example of an image sensor 600 according to a modified example of the fourth embodiment of the present disclosure.
  • A cross-sectional view showing a configuration example of an image sensor to which the technology according to the present disclosure can be applied.
  • FIG. 16 is a cross-sectional view showing a first configuration example of a stacked solid-state imaging device 23020.
  • FIG. 16 is a cross-sectional view showing a second configuration example of a stacked solid-state imaging device 23020.
  • FIG. 16 is a cross-sectional view showing a third configuration example of a stacked solid-state imaging device 23020.
  • FIG. 14 is a cross-sectional view showing another configuration example of a stacked solid-state imaging device to which the technology according to the present disclosure can be applied.
  • FIG. 1 is a diagram illustrating a configuration example of an image sensor according to the first embodiment of the present disclosure.
  • the image sensor 1 shown in FIG. 1 includes a pixel array section 10, a vertical drive section 20, a column signal processing section 30, and a control section 40.
  • the pixel array unit 10 is configured by arranging the pixels 100 in a two-dimensional lattice shape.
  • The pixel 100 generates an image signal.
  • the pixel 100 has a photoelectric conversion unit that generates electric charges according to the applied light.
  • the pixel 100 further includes a pixel circuit. This pixel circuit generates an image signal based on the charges generated by the photoelectric conversion unit. Generation of the image signal is controlled by a control signal generated by the vertical drive unit 20 described later.
  • In the pixel array unit 10, signal lines 11 and 12 are arranged in an XY matrix.
  • the signal line 11 is a signal line that transmits a control signal of a pixel circuit in the pixel 100, is arranged for each row of the pixel array unit 10, and is commonly wired to the pixels 100 arranged in each row.
  • The signal line 12 is a signal line for transmitting the image signal generated by the pixel circuit of the pixel 100; it is arranged for each column of the pixel array section 10 and is commonly wired to the pixels 100 arranged in each column. These photoelectric conversion units and pixel circuits are formed on a semiconductor substrate.
  • the vertical drive unit 20 generates a control signal for the pixel circuit of the pixel 100.
  • the vertical drive unit 20 transmits the generated control signal to the pixel 100 via the signal line 11 in the figure.
  • the column signal processing unit 30 processes the image signal generated by the pixel 100.
  • the column signal processing unit 30 processes the image signal transmitted from the pixel 100 via the signal line 12 in FIG.
  • the processing in the column signal processing unit 30 corresponds to, for example, analog-digital conversion for converting an analog image signal generated in the pixel 100 into a digital image signal.
  • the image signal processed by the column signal processing unit 30 is output as an image signal of the image sensor 1.
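  • The analog-to-digital conversion performed by the column signal processing unit described above is commonly implemented in CMOS image sensors as a single-slope (ramp) ADC, in which a counter runs while a reference ramp stays below the sampled pixel voltage. The patent does not specify the circuit; the following is a minimal toy model, with full-scale voltage and bit depth chosen purely for illustration.

```python
def single_slope_adc(v_pixel, v_max=1.0, n_bits=10):
    """Toy single-slope ADC: count ramp steps until the linear ramp
    reaches the sampled pixel voltage, and return the counter value.
    v_max (full-scale voltage) and n_bits are assumed parameters."""
    steps = 1 << n_bits          # number of ramp steps (2^n_bits)
    lsb = v_max / steps          # voltage increment per counter step
    count = 0
    while count < steps and count * lsb < v_pixel:
        count += 1               # counter runs while ramp < signal
    return count

# A mid-scale input lands at the middle of the code range.
print(single_slope_adc(0.5))   # → 512
print(single_slope_adc(0.0))   # → 0
```

In a real column-parallel implementation, one comparator and counter pair serves each column, and the same ramp is shared across all columns, which is why this architecture scales well to wide pixel arrays.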
  • the control unit 40 controls the entire image sensor 1.
  • the control unit 40 controls the image sensor 1 by generating and outputting a control signal for controlling the vertical drive unit 20 and the column signal processing unit 30.
  • the control signal generated by the control unit 40 is transmitted to the vertical drive unit 20 and the column signal processing unit 30 via the signal lines 41 and 42, respectively.
  • the electronic circuits that make up the pixel circuit and the vertical drive unit 20 described above operate with reference to the ground potential (GND).
  • FIG. 2 is a perspective view showing an example of the external configuration of the image sensor 1 according to the first embodiment of the present disclosure.
  • the X direction, the Y direction, and the Z direction are three directions orthogonal to each other.
  • the image sensor 1 has a substrate 110, an image sensor chip 120, conductors 131 and 132, and a bonding wire 140.
  • the first embodiment shows an example in which the image pickup device chip 120 is mounted on the flat substrate 110.
  • the image sensor 1 is an example of the semiconductor device described in the claims.
  • The substrate 110 is a package substrate on whose upper surface (the surface on the upper side in the Z direction) the imaging element chip 120 is mounted. Inside the substrate 110, a plurality of wirings (not shown) electrically connect external terminals (not shown) provided on the substrate 110 to the image pickup device chip 120. Each of these wirings is electrically connected to the imaging element chip 120 by a plurality of bonding wires 140 via pads (not shown) formed on the upper surface of the substrate 110. Note that each of these wirings can be provided between the layers forming the substrate 110. As the material of these wirings, for example, tungsten or copper can be used.
  • the substrate 110 may be made of a material such as synthetic resin or ceramic.
  • The image pickup device chip 120 is a semiconductor chip that receives light incident on its light receiving surface (the pixel area, that is, the surface on which the pixels 100 shown in FIG. 1 are arranged) through an optical system (not shown), and outputs an image signal corresponding to the amount of light received at each pixel.
  • the image sensor chip 120 is mounted on the upper side (upper side in the Z direction) of the conductor 131 arranged on the upper surface of the substrate 110.
  • As the image pickup device chip 120, for example, a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor can be used.
  • the image pickup device chip 120 is an example of the semiconductor chip described in the claims.
  • The conductor 131 is a conductive member arranged between the substrate 110 and the imaging element chip 120.
  • The conductor 131 is formed by applying a paste-like conductor (for example, a conductive paste) to the upper surface (the surface on the upper side in the Z direction) of the substrate 110 so as to be larger than the size of the imaging element chip 120. That is, in a top view (viewed from the upper side in the Z direction), the rectangular area of the conductor 131 is made larger than the rectangular area of the imaging element chip 120. This prevents the generation of gaps and improves the magnetic flux shielding effect.
  • Note that, in a top view, the size of the conductor 131 may be the same (or substantially the same) as the size of the imaging element chip 120, or the same (or substantially the same) as the size of the light receiving surface of the imaging element chip 120.
  • As the conductive paste, for example, a paste in which metal particles are dispersed in a resin such as an adhesive, such as a silver paste or a copper paste, can be used.
  • the conductor 131 can also be used as a die bond resin for mounting the image pickup device chip 120 on the substrate 110 by bonding.
  • the conductor 132 is a conductor arranged so as to surround the image pickup device chip 120 on the upper surface (upper surface in the Z direction) of the substrate 110.
  • The conductor 132 can be formed, for example, by stacking applied layers of a paste-like conductor (for example, a conductive paste) so as to surround the periphery of the imaging element chip 120, as shown in FIGS. 6A and 6B described later. Further, the conductor 132 is arranged in a square shape in a top view so as to surround the periphery of the imaging element chip 120.
  • The height of the conductor 132 (its length in the Z direction) is made the same as (or substantially the same as) the height of the image sensor chip 120 (its length in the Z direction), or higher than the height of the image sensor chip 120. This improves the magnetic flux shielding effect described later.
  • the conductor 132 is formed so that the image sensor chip 120 and the conductor 132 are adjacent to each other without providing a gap between the image sensor chip 120 and the conductor 132.
  • That is, the conductor 132 is installed so as to be attached to the outer surface of the image sensor chip 120. Alternatively, the imaging element chip 120 and the conductor 132 need not be adjacent: a gap may be provided between them so that a space is formed between the imaging element chip 120 and the conductor 132.
  • the conductors 131 and 132 are not connected to the ground potential of the image sensor chip 120. Eddy currents are induced in the conductors 131 and 132 by the magnetic flux described below. This eddy current causes a potential difference inside the conductors 131 and 132. When the conductors 131 and 132 are connected to the ground potential of the image sensor chip 120, the ground potential changes due to the potential difference generated inside the conductors 131 and 132, which causes noise. By separating the conductors 131 and 132 from the ground potential of the image sensor chip 120, it is possible to prevent noise from entering.
  • FIG. 3A schematically shows the loop wiring 121 in the image sensor 1.
  • Note that the loop wiring 121 shown in FIG. 3A is a schematic representation, for ease of description, of a circuit (for example, a pixel-system circuit) formed by the wiring in the substrate 110 and the wiring in the imaging element chip 120; it does not show the actual wiring.
  • FIG. 4A schematically shows the loop wiring 821 in the image sensor 800 according to the comparative example.
  • the image sensor 800 has a substrate 810, an image sensor chip 820, and a bonding wire 840.
  • The loop wiring 821 shown in FIG. 4A is likewise a schematic representation, for ease of description, of a circuit formed by the wiring in the substrate 810 and the wiring in the imaging element chip 820; it does not show the actual wiring.
  • In the image sensor 800, when a current (indicated by an arrow 841) flows through the bonding wire 840, a magnetic flux (indicated by an arrow 842) is generated around the bonding wire 840 with the wire as its axis.
  • In this comparative example, there is nothing between the imaging element chip 820 and the bonding wire 840 that blocks this magnetic flux. Therefore, the magnetic flux (indicated by the arrow 842) generated by the current (indicated by the arrow 841) flowing through the bonding wire 840 interlinks with the loop wiring 821.
  • the conductor 131 is arranged on the surface (bottom surface) on the lower side (lower side in the Z direction) of the imaging element chip 120. Further, around the image pickup element chip 120, a conductor 132 is arranged in a square shape so as to cover a side surface (a surface in the X direction and the Y direction) of the image pickup element chip 120. That is, in the image pickup element chip 120, the surfaces other than the light receiving surface are covered with the conductors 131 and 132.
  • As shown in FIG. 3A, when a current (indicated by an arrow 141) flows through the bonding wire 140, a magnetic flux (indicated by an arrow 142) is generated around the bonding wire 140 with the wire as its axis, as in the comparative example shown in FIG. 4A.
  • However, in the image sensor 1, the conductor 132 is arranged between the imaging element chip 120 and the bonding wire 140. Therefore, the magnetic flux (indicated by the arrow 142) generated by the current (indicated by the arrow 141) flowing through the bonding wire 140 can be blocked by the conductor 132.
  • As a result, the magnetic flux interlinking with the loop wiring 121 can be attenuated. Accordingly, it is possible to suppress the electromotive force generated in the loop wiring 121 by the magnetic flux (indicated by the arrow 142), and to reduce noise (for example, horizontal streak noise).
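  • The mechanism described above (flux from the bonding wire interlinking the loop wiring and inducing an electromotive force) can be rough-sized with textbook magnetostatics. The sketch below is not from the patent: it uses the infinite-straight-wire approximation B = μ0·I/(2πr) and Faraday's law, and all current, frequency, and geometry values are assumptions chosen only for illustration.

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability [H/m]

def flux_from_wire(i_amp, loop_len, r_near, r_far):
    """Magnetic flux [Wb] through a rectangular loop coplanar with a long
    straight wire carrying current i_amp, spanning distances r_near to
    r_far from the wire (infinite-wire approximation, B = MU0*I/(2*pi*r))."""
    return MU0 * i_amp * loop_len / (2 * math.pi) * math.log(r_far / r_near)

def emf_amplitude(i_amp, freq, loop_len, r_near, r_far):
    """Peak EMF [V] induced in the loop by a sinusoidal current of
    amplitude i_amp at frequency freq (Faraday's law: EMF = omega * flux)."""
    return 2 * math.pi * freq * flux_from_wire(i_amp, loop_len, r_near, r_far)

# Assumed, illustrative numbers: 10 mA signal at 100 MHz in the bonding
# wire; a 5 mm loop edge located 0.5 mm to 5 mm from the wire.
emf = emf_amplitude(10e-3, 100e6, 5e-3, 0.5e-3, 5e-3)
print(f"peak induced EMF without shielding: {emf * 1e3:.2f} mV")
```

Even a millivolt-level induced EMF is significant next to pixel-level analog signals, which is why attenuating the interlinking flux with a conductor such as the conductor 132 reduces the resulting streak noise.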
  • FIG. 3B schematically shows the loop wiring 421 in the image sensor 400.
  • the image sensor 400 includes a substrate 410, an image sensor chip 420, and conductors 431 and 432.
  • An actuator 450 is provided near the image sensor chip 420.
  • the actuator 450 drives the photographing lens in the image pickup apparatus in which the image pickup element 400 is arranged. By adjusting the position of the photographing lens with the actuator 450, the subject can be focused.
  • Note that the loop wiring 421 illustrated in FIG. 3B is a schematic representation, for ease of description, of a circuit (for example, a pixel-system circuit) formed by the wiring in the substrate 410 and the wiring in the imaging element chip 420; it does not show the actual wiring.
  • FIG. 4B schematically shows the loop wiring 921 in the image sensor 900 according to the comparative example.
  • the image sensor 900 has a substrate 910 and an image sensor chip 920.
  • The loop wiring 921 is likewise a schematic representation, for ease of explanation, of a circuit formed by the wiring in the substrate 910 and the wiring in the image pickup device chip 920; it does not show the actual wiring.
  • An actuator 950 is provided near the image sensor chip 920.
  • the conductor 431 is arranged on the surface (bottom surface) on the lower side (the lower side in the Z direction) of the imaging element chip 420. Further, around the image pickup element chip 420, a conductor 432 is arranged in a square shape so as to cover the side surfaces (surfaces in the X direction and the Y direction) of the image pickup element chip 420.
  • When the actuator 450 operates, a magnetic flux (indicated by an arrow 451) is generated from the actuator 450, as in the comparative example shown in FIG. 4B.
  • the conductor 432 is arranged between the imaging element chip 420 and the actuator 450. Therefore, the magnetic flux (indicated by the arrow 451) generated from the actuator 450 can be blocked by the conductor 432. This can prevent the occurrence of electromotive force due to the magnetic flux (indicated by the arrow 451) in the loop wiring 421 and reduce the occurrence of noise (for example, the occurrence of horizontal streak noise).
  • FIGS. 5 and 6 are diagrams showing an example of a method for manufacturing the image sensor 1 according to the first embodiment of the present disclosure.
  • the substrate 110 on which the image pickup device chip 120 is arranged is prepared.
  • a paste-like conductor (for example, a conductive paste) is applied to the upper surface of the substrate 110 using the dispenser 50 to form the conductor 131.
  • the conductor 131 is formed on the upper surface of the substrate 110 so that the size of the conductor 131 is larger than the size of the imaging element chip 120.
  • the imaging element chip 120 is mounted on the conductor 131 formed on the substrate 110.
  • the imaging element chip 120 can be fixed by curing the conductor 131 instead of the adhesive (for example, die bond resin). That is, in the image sensor 1, the conductor 131 can be configured to also serve as an adhesive. However, the image pickup device chip 120 may be fixed to the conductor 131 using an adhesive (for example, a die bond resin).
  • Next, a paste-like conductor (for example, a conductive paste) is further applied, using the dispenser 50, onto the conductor 131 formed on the upper surface of the substrate 110 to form the conductor 132.
  • The conductor 132 can be formed by stacking applied layers of the paste-like conductor so as to surround the imaging element chip 120 arranged on the conductor 131. Further, the conductor 132 is arranged in a square shape in a top view so as to surround the periphery of the imaging element chip 120. The conductor 132 can be formed so that the image sensor chip 120 and the conductor 132 are adjacent to each other without a gap between them. After the conductor 132 is applied, the paste-like conductor is cured.
  • wire bonding is performed, and the wiring inside the substrate 110 and the imaging element chip 120 are electrically connected by a plurality of bonding wires 140.
  • Note that, as the conductors 131 and 132, a conductor other than the paste-like conductor may be used.
  • a conductive seal or an electromagnetic shield (magnetic shield or electrostatic shield) made of a metal film can be used as the conductors 131 and 132.
  • the conductor 132 can be attached to the outer surface of the imaging element chip 120 to be installed.
  • For these, a conductor such as copper (for example, copper foil) or aluminum can be used.
  • the conductors 131 and 132 may be formed by plating or vapor deposition.
  • In this case, the image sensor chip 120 is bonded to the substrate 110 with a die bond resin.
  • As the conductors 131 and 132, various conductors can be used. Note that the higher the electrical conductivity of the conductor, the greater its magnetic flux shielding effect; therefore, it is preferable to use a conductor with high conductivity. Copper, for example, can be adopted as such a conductor.
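  • The relation between conductivity and shielding effect noted above can be illustrated with the standard skin-depth formula δ = √(1/(π·f·μ·σ)) and the corresponding absorption loss of a shield of thickness t. This is a rough sketch, not a calculation from the patent: the frequency, shield thickness, and conductive-paste conductivity below are assumed values for illustration only.

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability [H/m]

def skin_depth(conductivity, freq, mu_r=1.0):
    """Skin depth [m]: delta = sqrt(1 / (pi * f * mu * sigma)),
    for a non-magnetic conductor when mu_r = 1."""
    return math.sqrt(1.0 / (math.pi * freq * MU0 * mu_r * conductivity))

def absorption_loss_db(thickness, conductivity, freq):
    """Absorption (penetration) loss of a shield in dB:
    A = 20*log10(e) * t / delta ~= 8.686 * t / delta."""
    return 8.686 * thickness / skin_depth(conductivity, freq)

FREQ = 100e6          # assumed disturbance frequency [Hz]
T = 50e-6             # assumed shield thickness: 50 um
SIGMA_CU = 5.8e7      # conductivity of copper [S/m]
SIGMA_PASTE = 1e6     # assumed conductivity of a conductive paste [S/m]

for name, sigma in [("copper", SIGMA_CU), ("paste", SIGMA_PASTE)]:
    print(f"{name}: skin depth = {skin_depth(sigma, FREQ) * 1e6:.1f} um, "
          f"absorption loss = {absorption_loss_db(T, sigma, FREQ):.1f} dB")
```

With these assumed numbers, the same 50 µm layer attenuates the penetrating field far more strongly when made of copper than of a lower-conductivity paste, which matches the preference for high-conductivity conductors stated above.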
  • a copper foil tape may be attached to each surface (surfaces other than the light receiving surface) of the image sensor chip 120 for use.
  • In the example described above, the conductors 131 and 132 are arranged on every surface of the image pickup element chip 120 other than the light receiving surface; however, the conductor may instead be arranged on only some of those surfaces.
  • the conductor may be arranged only on the side surface of the image pickup device chip 120 (the surface other than the light receiving surface) close to the bonding wire 140.
  • the conductors may be arranged only on the side surface adjacent to the bonding wire 140 and the side surface adjacent to the side surface of the surface of the image pickup device chip 120 (the surface other than the light receiving surface).
  • Alternatively, conductors may be arranged on the side surface close to the bonding wire 140 (or the side surfaces adjacent to it) and on the bottom surface.
  • <Second Embodiment> In the first embodiment, an example in which the image pickup device chip is mounted on a flat substrate was shown. However, the present disclosure is not limited to this. Therefore, in the second embodiment, an example in which an image pickup device chip is mounted on a substrate having a cavity structure is shown. The second embodiment is substantially the same as the first embodiment except for the structure of the image sensor. Therefore, in the following, points different from the first embodiment will be mainly described, and other description will be omitted.
  • FIG. 7 is a perspective view and a cross-sectional view showing a configuration example of the image sensor 200 according to the second embodiment of the present disclosure.
  • FIG. 7A shows a perspective view of the image sensor 200.
  • FIG. 7B shows a cross-sectional view of the image sensor 200.
  • the image sensor 200 includes a substrate 210, an image sensor chip 220, conductors 231 and 232, and a bonding wire 240.
  • the substrate 210 is a package substrate having a cavity structure having a recess 211 (shown in FIG. 8A) for mounting the image pickup device chip 220. That is, the imaging element chip 220 is mounted on the bottom surface of the recess 211 in the substrate 210. Further, inside the substrate 210, a plurality of wirings (not shown) that electrically connect an external terminal (not shown) provided on the substrate 210 and the image pickup device chip 220 are provided. Each of these wirings is electrically connected to the imaging element chip 220 by a plurality of bonding wires 240 via pads (not shown) formed on the upper surface of the substrate 210. Note that each of these wirings can be provided between each layer of the substrate 210.
  • The conductor 231 is a conductive member arranged between the substrate 210 and the imaging element chip 220.
  • the conductor 231 can be formed by, for example, applying a paste-like conductor (for example, a conductive paste) to the entire bottom surface of the recess 211 in the substrate 210, as shown in FIG. 8B.
  • the conductor 231 can also be used as, for example, a die bond resin for mounting the imaging element chip 220 on the substrate 210 by bonding.
  • the conductor 232 is a conductor arranged between the side surface of the concave portion 211 of the substrate 210 and the outer surface of the imaging element chip 220. That is, the conductor 232 is arranged so as to surround the image pickup element chip 220.
  • The conductor 232 can be formed, for example, as shown in FIGS. 9A and 9B, by applying and filling a paste-like conductor (for example, a conductive paste) between the side surface of the recess 211 of the substrate 210 and the outer surface of the imaging element chip 220. In this way, no gap is provided between the imaging element chip 220 and the conductor 232, and the two are formed adjacent to each other.
  • the conductor 232 is arranged in a square shape so as to surround the periphery of the imaging element chip 220 in a top view.
  • The height of the conductor 232 (its length in the Z direction) is made the same as (or substantially the same as) the height of the imaging element chip 220 (its length in the Z direction), or higher than the height of the imaging element chip 220.
  • the conductors 231 and 232 are not connected to the ground potential of the image sensor chip 220. This is to prevent noise from entering.
  • FIGS. 8 and 9 are diagrams illustrating an example of a manufacturing method of the imaging device 200 according to the second embodiment of the present disclosure.
  • a substrate 210 having a cavity structure in which the image sensor chip 220 is arranged is prepared. It should be noted that the substrate 210 having a cavity structure is a substrate having a recess 211 for mounting the image pickup device chip 220.
  • a conductor 231 is formed by applying a paste-like conductor (for example, a conductive paste) to the bottom surface of the recess 211 of the substrate 210 using the dispenser 50.
  • the conductor 231 is formed such that the size of the conductor 231 is larger than the size of the imaging element chip 220 in a top view.
  • the imaging element chip 220 is placed on the conductor 231 formed on the bottom surface of the recess 211 of the substrate 210.
  • As in the first embodiment, the imaging element chip 220 can be fixed by curing the conductor 231 in place of an adhesive (for example, a die bond resin).
  • Next, a paste-like conductor (for example, a conductive paste) is further applied, using the dispenser 50, onto the conductor 231 formed on the bottom surface of the recess 211 of the substrate 210 to form the conductor 232.
  • The conductor 232 is formed by applying and filling the paste-like conductor so as to surround the imaging element chip 220 arranged on the conductor 231. After the conductor 232 is applied, the paste-like conductor is cured.
  • In this way, the manufacturing process can be simplified compared with the method of stacking and applying the paste-like conductor described in the first embodiment.
  • wire bonding is performed, and the wiring inside the substrate 210 and the imaging element chip 220 are electrically connected by a plurality of bonding wires 240.
  • Third Embodiment> In the third embodiment, an example of an image sensor (an image sensor in which an image sensor chip is mounted on a flat substrate) to which a conductor is applied during dicing is shown.
  • the third embodiment is substantially the same as the first embodiment except for the structure of the image sensor and the manufacturing method. Therefore, in the following, the manufacturing method of the image sensor will be described focusing on the points different from the first embodiment, and the other description will be omitted.
  • FIGS. 10 and 11 are diagrams showing an example of a method of manufacturing the image sensor 300 according to the third embodiment of the present disclosure.
  • a blade 60 is used to perform dicing to separate chips (imaging element chips) on the silicon wafer 350.
  • a film is attached to the back surface of the silicon wafer 350.
  • although the silicon wafer 350 is cut so as to singulate each chip (imaging element chip), the film attached to the back surface of the silicon wafer 350 is not cut. Therefore, the chips (imaging element chips) do not come apart, and a groove reaching down to the film is formed between adjacent chips (imaging element chips) by the cut portion.
  • a paste-like conductor (for example, a conductive paste) is applied to the cut portions (grooves) 351 between the chips (imaging element chips) formed by cutting the silicon wafer 350, to form a conductor on the side surface of each chip.
  • the conductor applied to the cut portion 351 between the chips (imaging element chips) is indicated by a thick line.
  • the conductor applied to the cut portions 351 between the chips (imaging element chips) is cut, and the conductor is separated for each chip (imaging element chip).
  • a plurality of chips (imaging element chips) having conductors formed on the side surfaces are generated.
  • the formation process of the conductor 332 can be simplified.
  • the image sensor chip 320 thus generated is placed on the substrate 310, as shown in FIG. 11B.
  • a conductor 331 is formed by applying a paste-like conductor (for example, a conductive paste) on the upper surface of the substrate 310 using the dispenser 50.
  • the conductor 331 is formed so that the size of the conductor 331 is larger than the size of the imaging element chip 320 (the imaging element chip 320 on which the conductor 332 is formed) on the upper surface of the substrate 310.
  • the imaging element chip 320 having the conductor 332 formed thereon is placed on the conductor 331 formed on the substrate 310.
  • the imaging element chip 320 can be fixed by curing the conductor 331 instead of the adhesive (for example, die bond resin).
  • wire bonding is performed, and the wiring inside the substrate 310 and the imaging element chip 320 are electrically connected by a plurality of bonding wires 340.
  • a conductor is applied to the side surface of the image sensor chip 320 when performing the dicing process. That is, an example is shown in which a paste-like conductor is applied to the side surface of the image sensor chip 320 at the wafer stage. However, the conductor may be applied to the bottom surface of the imaging element chip 320 when performing the dicing process. Further, in the third embodiment, an example in which a conductor is applied to the image sensor chip 320 when performing the dicing process has been shown. However, when performing the dicing process, the conductor may be arranged by another method. For example, processing such as forming a conductor (conductor film) on the image sensor chip 320 by vapor deposition or plating the image sensor chip 320 with a conductor may be performed.
  • FIG. 12 is a cross-sectional view showing a configuration example of the image sensor 500 according to the fourth embodiment of the present disclosure.
  • FIG. 13 is a diagram showing an example of the configuration of the solder 551 and the conductors 561 and 562 provided on the back surface of the image sensor 500 according to the fourth embodiment of the present disclosure. Note that, in FIGS. 12 and 13, the solder 551 is indicated by a black circle for ease of explanation.
  • the image sensor 500 has a substrate 510, a frame 520, a seal glass 530, an image sensor chip 540, a solder 551, an underfill 552, and conductors 561 and 562.
  • an example in which the image sensor chip 540 is sealed by the frame 520 and the seal glass 530 is shown.
  • the substrate 510, the image sensor chip 540, and the conductors 561 and 562 correspond to the substrate, the image sensor chip, and the conductor described in the first to third embodiments. Therefore, with respect to the substrate 510, the image pickup element chip 540, and the conductors 561 and 562, the description of the portions common to the first to third embodiments will be omitted.
  • the frame 520 is a square-shaped frame when viewed from above, and is installed on the upper surface of the substrate 510.
  • the substrate 510 and the frame 520 form a recess for accommodating the image sensor chip 540.
  • the seal glass 530 is fixed to the frame 520 so as to cover the light receiving surface (pixel area) of the image sensor chip 540, and hermetically seals the space 521 in which the image sensor chip 540 is arranged.
  • the seal glass 530 is bonded to the frame 520 with an adhesive or the like to close the recess formed by the substrate 510 and the frame 520.
  • the seal glass 530 has a light-transmitting property, and has a function of preventing scratches, dust, and the like from adhering to the imaging element chip 540.
  • as the material of the seal glass 530, for example, borosilicate glass, quartz glass, non-alkali glass, Pyrex (registered trademark), or the like can be used.
  • as the seal glass 530, an IR (infrared) cut filter that blocks infrared light, a crystal low-pass filter, or the like may also be used.
  • the solder 551 is a solder ball for electrically connecting the wiring inside the substrate 510 and the imaging element chip 540. Further, the solder 551 can be arranged in a two-dimensional grid pattern between the substrate 510 and the image pickup device chip 540 as shown in FIG. 13, for example. In this case, the solder 551 can be arranged in a square shape so as to surround the conductor 561. By melting the solder 551 and performing the soldering, the wiring inside the substrate 510 and the wiring of the imaging element chip 540 can be electrically joined.
  • the underfill 552 is a liquid curable resin (sealing resin) used to protect the solder 551, and is arranged on the back surface of the imaging element chip.
  • the underfill 552 is poured into the gap of the solder 551 and hardened by heat treatment.
  • the conductor 561 can be arranged on at least a part of the bottom surface of the image sensor chip 540. Specifically, the conductor 561 can be arranged on the bottom surface of the image sensor chip 540 in a region other than the portion where the solder 551 is arranged. In this way, when the conductor 561 is arranged on the bottom surface of the image sensor chip 540, interference with the solder 551 can be avoided.
  • the conductor 562 can be arranged on the upper side of the underfill 552 and on the side surface of the image pickup element chip 540. Specifically, the conductor 562 can be arranged in a square shape in a top view so as to cover the side surface of the imaging element chip 540.
  • as the conductors 561 and 562, the paste-like conductors described in the first to third embodiments may be used. Alternatively, the conductors 561 and 562 may be formed on the side surface or the bottom surface of the image sensor chip 540 by vapor deposition or plating.
  • in the fourth embodiment, an example in which the conductors 561 and 562 are arranged on the bottom surface and the side surface of the image sensor chip 540 has been shown. However, the present disclosure is not limited to this. Therefore, in the following, an example in which a conductor is arranged only on the side surface of the image sensor chip in the case of flip chip connection for electrically connecting the substrate and the image sensor chip will be shown as a modification.
  • FIG. 14 is a cross-sectional view showing a configuration example of an image sensor 600 according to a modified example of the fourth embodiment of the present disclosure. Note that, in FIG. 14, the solder 651 is indicated by a black circle for ease of explanation.
  • the image sensor 600 includes a substrate 610, a frame 620, a seal glass 630, an image sensor chip 640, a solder 651, an underfill 652, and a conductor 661.
  • the image sensor 600 is a modification of the image sensor 500 shown in FIGS. 12 and 13, and differs in that the conductor 561, which is arranged on the bottom surface of the image sensor chip 540 in the image sensor 500, is omitted. Except for this point, the image sensor 600 is the same as the image sensor 500 shown in FIGS. 12 and 13, and therefore the differences from the image sensor 500 will be mainly described here.
  • the solder 651 is arranged between the substrate 610 and the image sensor chip 640 in a two-dimensional grid pattern. Further, the solder 651 is arranged in a square shape, for example.
  • a pixel system net (wiring relating to an image signal generated by a pixel of the image pickup device) is often arranged immediately below the image pickup device chip so that the loop area becomes small.
  • the logic net (wiring relating to the control signal for controlling the image sensor) is often routed on the outer periphery of the board, avoiding the area directly under the image sensor chip, so as to weaken the magnetic flux penetrating the loop. Therefore, it is important to reduce the influence of the magnetic flux from the wires (for example, bonding wires) routed on the outer periphery of the substrate on the image sensor chip.
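The loop-area argument above can be made concrete with Faraday's law: the voltage induced in a wiring loop by a sinusoidal stray field is proportional to the loop area. The field strength, frequency, and loop sizes in the sketch below are illustrative assumptions, not values from this disclosure.

```python
import math

def induced_emf(b_peak_tesla, freq_hz, loop_area_m2):
    """Peak EMF induced in a wiring loop by a sinusoidal magnetic field.

    For B(t) = B_peak * sin(2*pi*f*t) and flux = B(t) * A,
    the peak of |dPhi/dt| is 2*pi*f * B_peak * A (Faraday's law).
    """
    return 2 * math.pi * freq_hz * b_peak_tesla * loop_area_m2

# Illustrative values: a motor coil leaking a 1 uT field at 100 kHz.
B_PEAK = 1e-6   # tesla
FREQ = 100e3    # hertz

# A pixel-system net routed directly under the chip (small loop)
# versus a net routed around the board periphery (large loop).
small_loop = induced_emf(B_PEAK, FREQ, loop_area_m2=1e-6)   # 1 mm^2
large_loop = induced_emf(B_PEAK, FREQ, loop_area_m2=1e-4)   # 100 mm^2

# The induced noise voltage scales linearly with loop area.
print(f"small loop: {small_loop * 1e6:.3f} uV")
print(f"large loop: {large_loop * 1e6:.3f} uV")
```

Shrinking the loop area by a factor of 100 shrinks the induced noise voltage by the same factor, which is why minimizing the loop area of pixel-system nets matters.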
  • actuators and the like are often arranged near the image sensor chip of the image sensor. For example, various motors (for example, a motor for moving a zoom lens or a focus lens, or a motor for camera shake correction) may be arranged near the image sensor chip of the image sensor.
  • when a current flows through the electromagnetic coil of such a motor, a magnetic flux is generated. Therefore, it is important to reduce the influence of this magnetic flux on the image sensor chip.
  • a conductor is arranged on each surface (each surface other than the light receiving surface) of the image sensor chip. This conductor prevents the magnetic flux from the outside (for example, from an actuator) and the magnetic flux from the logic power source of the image sensor chip or the substrate from penetrating the image sensor chip, so that image noise (for example, horizontal streak noise) can be suppressed.
  • the manufacturing process of the image sensor 600 can be simplified. Further, it is possible to prevent interference between a conductor on the bottom surface of the image sensor chip 640 and the solder 651, and to prevent occurrence of a defect such as a short circuit due to the solder 651.
  • Image sensor configuration> A cross-sectional configuration example of the image sensor to which the technology according to the present disclosure can be applied will be described.
  • FIG. 15 is a cross-sectional view showing a configuration example of an image sensor to which the technology according to the present disclosure can be applied.
  • a PD (photodiode) 20019 receives incident light 20001 incident from the back surface (upper surface in the figure) side of the semiconductor substrate 20018.
  • a flattening film 20013, a CF (color filter) 20012, and a microlens 20011 are provided above the PD 20019, and the incident light 20001 that sequentially passes through these parts is received by the light-receiving surface 20017 and photoelectrically converted.
  • the n-type semiconductor region 20020 is formed as a charge storage region for storing charges (electrons).
  • the n-type semiconductor region 20020 is provided inside the p-type semiconductor regions 20016 and 20041 of the semiconductor substrate 20018.
  • on the front surface (lower surface) side of the semiconductor substrate 20018, a p-type semiconductor region 20041 having a higher impurity concentration than the back surface (upper surface) side is provided below the n-type semiconductor region 20020.
  • the PD 20019 has a HAD (Hole-Accumulation Diode) structure, in which the p-type semiconductor regions 20016 and 20041 are formed so as to suppress generation of dark current at the interfaces on the upper surface side and the lower surface side of the n-type semiconductor region 20020.
  • a pixel separation unit 20030 that electrically separates the plurality of pixels 20010 is provided, and the PD 20019 is provided in a region partitioned by the pixel separation unit 20030.
  • the pixel separation unit 20030 is formed in a grid shape so as to be interposed between the plurality of pixels 20010, and the PD 20019 is formed in a region partitioned by the pixel separation unit 20030.
  • in each PD 20019, the anode is grounded. In the image sensor, the signal charge (for example, electrons) accumulated in the PD 20019 is read out via a transfer Tr (MOS FET) or the like (not shown) and is output as an electric signal to a VSL (vertical signal line), not shown.
  • the wiring layer 20050 is provided on the front surface (lower surface) of the semiconductor substrate 20018 opposite to the back surface (upper surface) on which the respective parts such as the light-shielding film 20014, CF 20012, and microlens 20011 are provided.
  • the wiring layer 20050 includes a wiring 20051 and an insulating layer 20052, and the wiring 20051 is formed in the insulating layer 20052 so as to be electrically connected to each element.
  • the wiring layer 20050 is a so-called multilayer wiring layer, and is formed by alternately stacking an interlayer insulating film forming an insulating layer 20052 and a wiring 20051 a plurality of times.
  • a wiring to the Tr such as a transfer Tr for reading out electric charges from the PD 20019 and each wiring such as VSL are laminated via an insulating layer 20052.
  • a support substrate 20061 is provided on the surface of the wiring layer 20050 opposite to the side where the PD 20019 is provided.
  • a substrate made of a silicon semiconductor having a thickness of several hundred μm is provided as the supporting substrate 20061.
  • the light shielding film 20014 is provided on the back surface (upper surface in the figure) side of the semiconductor substrate 20018.
  • the light shielding film 20014 is configured to shield a part of the incident light 20001 traveling from above the semiconductor substrate 20018 to the back surface of the semiconductor substrate 20018.
  • the light shielding film 20014 is provided above the pixel separation unit 20030 provided inside the semiconductor substrate 20018.
  • the light-shielding film 20014 is provided on the back surface (upper surface) of the semiconductor substrate 20018 so as to project in a convex shape through an insulating film 20055 such as a silicon oxide film.
  • above each PD 20019, the light shielding film 20014 is not provided but is opened so that the incident light 20001 enters the PD 20019.
  • the planar shape of the light-shielding film 20014 is a lattice shape, and an opening through which the incident light 20001 passes to the light-receiving surface 20017 is formed.
  • the light shielding film 20014 is formed of a light blocking material that blocks light.
  • the light-shielding film 20014 is formed by sequentially stacking a titanium (Ti) film and a tungsten (W) film.
  • the light-shielding film 20014 can be formed, for example, by sequentially stacking a titanium nitride (TiN) film and a tungsten (W) film.
  • the light shielding film 20014 is covered with a planarization film 20013.
  • the planarization film 20013 is formed using an insulating material which transmits light.
  • the pixel separation portion 20030 has a groove 20031, a fixed charge film 20032, and an insulating film 20033.
  • the fixed charge film 20032 is formed on the rear surface (upper surface) side of the semiconductor substrate 20018 so as to cover the groove 20031 partitioning the plurality of pixels 20010.
  • the fixed charge film 20032 is provided so as to cover the inner surface of the groove 20031 formed on the back surface (upper surface) side of the semiconductor substrate 20018 with a constant thickness.
  • the insulating film 20033 is provided (filled) so as to fill the inside of the groove 20031 covered with the fixed charge film 20032.
  • the fixed charge film 20032 is formed of a high dielectric material having a negative fixed charge so that a positive charge (hole) accumulation region is formed at the interface with the semiconductor substrate 20018 and the generation of dark current is suppressed. Because the fixed charge film 20032 has a negative fixed charge, that charge applies an electric field to the interface with the semiconductor substrate 20018, and the positive charge (hole) accumulation region is formed.
  • the fixed charge film 20032 can be formed of, for example, a hafnium oxide film (HfO 2 film). Further, the fixed charge film 20032 can be formed so as to include at least one of oxides such as hafnium, zirconium, aluminum, tantalum, titanium, magnesium, yttrium, and a lanthanoid element.
  • FIG. 16 is a diagram illustrating an outline of a configuration example of a stacked solid-state imaging device to which the technology according to the present disclosure can be applied.
  • A of FIG. 16 shows a schematic configuration example of a non-stacked solid-state imaging device.
  • the solid-state imaging device 23010 has one die (semiconductor substrate) 23011 as shown in A of FIG.
  • the die 23011 has a pixel region 23012 in which pixels are arranged in an array, a control circuit 23013 for driving the pixels and other various controls, and a logic circuit 23014 for signal processing.
  • B and C of FIG. 16 show schematic configuration examples of a stacked solid-state imaging device.
  • as shown in B and C of FIG. 16, the solid-state imaging device 23020 is configured as one semiconductor chip by stacking two dies, a sensor die 23021 and a logic die 23024, and electrically connecting them.
  • in B of FIG. 16, the pixel region 23012 and the control circuit 23013 are mounted on the sensor die 23021, and the logic circuit 23014 including a signal processing circuit for performing signal processing is mounted on the logic die 23024.
  • in C of FIG. 16, the pixel region 23012 is mounted on the sensor die 23021, and the control circuit 23013 and the logic circuit 23014 are mounted on the logic die 23024.
  • FIG. 17 is a cross-sectional view showing a first configuration example of the stacked solid-state imaging device 23020.
  • in the sensor die 23021, PDs (photodiodes), FDs (floating diffusions), and Trs (MOS FETs) that form the pixels serving as the pixel region 23012, as well as Trs serving as the control circuit 23013, are formed. Further, the sensor die 23021 is formed with a wiring layer 23101 having a plurality of layers, in this example, three layers of wiring 23110. Note that the control circuit 23013 (its Trs) can be formed in the logic die 23024 instead of the sensor die 23021.
  • a Tr forming the logic circuit 23014 is formed on the logic die 23024. Further, the logic die 23024 is formed with a wiring layer 23161 having a plurality of layers, in this example, three layers of wiring 23170. Further, the logic die 23024 is provided with a connection hole 23171 having an insulating film 23172 formed on the inner wall surface thereof, and a connection conductor 23173 connected to the wiring 23170 and the like is embedded in the connection hole 23171.
  • the sensor die 23021 and the logic die 23024 are attached so that their wiring layers 23101 and 23161 face each other, whereby a laminated solid-state imaging device 23020 in which the sensor die 23021 and the logic die 23024 are laminated is configured.
  • a film 23191 such as a protective film is formed on a surface where the sensor die 23021 and the logic die 23024 are attached to each other.
  • the sensor die 23021 is formed with a connection hole 23111 that penetrates the sensor die 23021 from the back surface side (the side on which light is incident on the PD) (upper side) of the sensor die 23021 and reaches the uppermost wiring 23170 of the logic die 23024. Further, in the sensor die 23021, a connection hole 23121 is formed near the connection hole 23111 to reach the first layer wiring 23110 from the back surface side of the sensor die 23021. An insulating film 23112 is formed on the inner wall surface of the connection hole 23111, and an insulating film 23122 is formed on the inner wall surface of the connection hole 23121. Then, the connection conductors 23113 and 23123 are embedded in the connection holes 23111 and 23121, respectively.
  • the connection conductor 23113 and the connection conductor 23123 are electrically connected on the back surface side of the sensor die 23021, whereby the sensor die 23021 and the logic die 23024 are electrically connected via the wiring layer 23101, the connection hole 23121, the connection hole 23111, and the wiring layer 23161.
  • FIG. 18 is a sectional view showing a second configuration example of the stacked solid-state imaging device 23020.
  • in FIG. 18, the sensor die 23021 (the wiring 23110 of the wiring layer 23101) and the logic die 23024 (the wiring 23170 of the wiring layer 23161) are electrically connected by one connection hole 23211 formed in the sensor die 23021.
  • connection hole 23211 is formed so as to penetrate the sensor die 23021 from the back surface side of the sensor die 23021 to reach the uppermost layer wiring 23170 of the logic die 23024 and reach the uppermost layer wiring 23110 of the sensor die 23021.
  • An insulating film 23212 is formed on the inner wall surface of the connection hole 23211, and a connection conductor 23213 is embedded in the connection hole 23211.
  • that is, in FIG. 17, the sensor die 23021 and the logic die 23024 are electrically connected by the two connection holes 23111 and 23121, whereas in FIG. 18, the sensor die 23021 and the logic die 23024 are electrically connected by the single connection hole 23211.
  • FIG. 19 is a cross-sectional view showing a third configuration example of the stacked solid-state imaging device 23020.
  • the third configuration example differs from the configuration of FIG. 17 in that a film 23191 such as a protective film is not formed on the surface on which the sensor die 23021 and the logic die 23024 are bonded, whereas in FIG. 17, the film 23191 such as a protective film is formed on that surface.
  • FIG. 20 is a cross-sectional view showing another configuration example of the stacked solid-state imaging device to which the technology according to the present disclosure can be applied.
  • the solid-state imaging device 23401 has a three-layer laminated structure in which three dies including a sensor die 23411, a logic die 23412, and a memory die 23413 are laminated.
  • the memory die 23413 includes, for example, a memory circuit that stores data that is temporarily necessary for signal processing performed by the logic die 23412.
  • in FIG. 20, the logic die 23412 and the memory die 23413 are stacked in that order below the sensor die 23411, but the memory die 23413 and the logic die 23412 may instead be stacked in that order below the sensor die 23411.
  • the sensor die 23411 is formed with a PD serving as a photoelectric conversion unit of a pixel and a source/drain region of the pixel Tr.
  • a gate electrode is formed around the PD via a gate insulating film, and a pixel Tr 23421 and a pixel Tr 23422 are formed by a source/drain region paired with the gate electrode.
  • the pixel Tr 23421 adjacent to the PD is the transfer Tr, and one of the pair of source/drain regions forming the pixel Tr 23421 is the FD.
  • An interlayer insulating film is formed on the sensor die 23411, and a connection hole is formed in the interlayer insulating film.
  • connection conductors 23431 connected to the pixel Tr 23421 and the pixel Tr 23422 are formed in the connection holes.
  • the sensor die 23411 is formed with a wiring layer 23433 having a plurality of layers of wiring 23432 connected to each connection conductor 23431.
  • an aluminum pad 23434 serving as an electrode for external connection is formed on the lowermost layer of the wiring layer 23433 of the sensor die 23411. That is, in the sensor die 23411, the aluminum pad 23434 is formed at a position closer to the bonding surface 23440 to the logic die 23412 than the wiring 23432.
  • the aluminum pad 23434 is used as one end of wiring for inputting/outputting signals to/from the outside.
  • the sensor die 23411 is formed with a contact 23441 used for electrical connection with the logic die 23412.
  • the contact 23441 is connected to the contact 23451 of the logic die 23412 and also to the aluminum pad 23442 of the sensor die 23411.
  • the sensor die 23411 is formed with a pad hole 23443 so as to reach the aluminum pad 23442 from the back side (upper side) of the sensor die 23411.
  • the technology according to the present disclosure can be applied to the image sensor as described above.
  • the technology according to the present disclosure (this technology) can be applied to various products.
  • the present technology may be realized as an image pickup device mounted in an image pickup apparatus such as a camera.
  • FIG. 21 is a block diagram showing a schematic configuration example of a camera which is an example of an imaging device to which the present technology can be applied.
  • the camera 1000 in the figure includes a lens 1001, an image sensor 1002, an imaging control unit 1003, a lens driving unit 1004, an image processing unit 1005, an operation input unit 1006, a frame memory 1007, and a display unit 1008. And a recording unit 1009.
  • the lens 1001 is a taking lens of the camera 1000.
  • the lens 1001 collects light from a subject and makes it incident on an image sensor 1002 described later to form an image of the subject.
  • the image pickup element 1002 is a semiconductor element that picks up light from a subject condensed by the lens 1001.
  • the image sensor 1002 generates an analog image signal according to the incident light, converts it into a digital image signal, and outputs it.
  • the image capturing control unit 1003 controls image capturing by the image sensor 1002.
  • the imaging control unit 1003 controls the imaging element 1002 by generating a control signal and outputting the control signal to the imaging element 1002.
  • the imaging control unit 1003 can also perform autofocus in the camera 1000 based on the image signal output from the image sensor 1002.
  • the auto focus is a system that detects the focal position of the lens 1001 and automatically adjusts it.
  • a method (image plane phase difference autofocus) of detecting a focus position by detecting an image plane phase difference by a phase difference pixel arranged in the image sensor 1002 can be used. It is also possible to apply a method (contrast autofocus) of detecting the position where the contrast of the image is highest as the focus position.
  • the imaging control unit 1003 adjusts the position of the lens 1001 via the lens driving unit 1004 based on the detected focus position, and performs autofocus.
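As a rough sketch of the contrast autofocus method mentioned above, the code below scans candidate lens positions, scores each captured frame with a simple gradient-based contrast metric, and picks the position with the highest score. The metric and the toy `fake_capture` blur model are assumptions for illustration only; a real camera would score frames read from the image sensor while the lens driving unit steps the lens.

```python
def contrast_score(image):
    """Sum of squared differences between horizontally adjacent pixels.

    A sharply focused image has strong local gradients, so this
    score peaks near the in-focus lens position.
    """
    score = 0
    for row in image:
        for x in range(len(row) - 1):
            diff = row[x + 1] - row[x]
            score += diff * diff
    return score

def contrast_autofocus(capture, lens_positions):
    """Return the lens position whose captured frame scores highest."""
    return max(lens_positions, key=lambda pos: contrast_score(capture(pos)))

# Hypothetical capture model: blur grows with distance from focus at 5.
def fake_capture(pos):
    blur = abs(pos - 5) + 1
    edge = [0] * 8 + [255] * 8  # one-edge test pattern
    # crude box blur: wider windows smear the edge over more pixels
    row = [sum(edge[max(0, x - blur):x + blur + 1]) // (2 * blur + 1)
           for x in range(len(edge))]
    return [row]

best = contrast_autofocus(fake_capture, range(10))
print(best)  # → 5 (the sharpest frame under the assumed blur model)
```

Image plane phase difference autofocus, by contrast, estimates the focus position from a single readout of dedicated phase difference pixels rather than by scanning lens positions.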
  • the imaging control unit 1003 can be configured by, for example, a DSP (Digital Signal Processor) equipped with firmware.
  • the lens driving unit 1004 drives the lens 1001 under the control of the imaging control unit 1003.
  • the lens driving unit 1004 can drive the lens 1001 by changing the position of the lens 1001 using a built-in motor.
  • the image processing unit 1005 processes the image signal generated by the image sensor 1002. This processing includes, for example, demosaicing, which generates the image signals of the missing colors among the red, green, and blue image signals for each pixel, noise reduction, which removes noise from the image signals, and encoding of the image signals.
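As a minimal sketch of the demosaic step just described (noise reduction and encoding are omitted), the code below reconstructs the missing color samples of a Bayer mosaic by averaging nearby same-color samples. The 2x2 RGGB layout and the averaging rule are generic textbook assumptions, not details of the image processing unit 1005.

```python
def demosaic_rggb(mosaic):
    """Bilinear demosaic of an RGGB Bayer mosaic.

    mosaic[y][x] holds one sample per pixel; the color it represents
    depends on position: R at (even row, even col), G at (even, odd)
    and (odd, even), B at (odd, odd). Missing colors are reconstructed
    by averaging the samples of that color in the 3x3 neighborhood.
    """
    h, w = len(mosaic), len(mosaic[0])

    def color_at(y, x):
        if y % 2 == 0:
            return 'R' if x % 2 == 0 else 'G'
        return 'G' if x % 2 == 0 else 'B'

    def neighbors_avg(y, x, wanted):
        vals = [mosaic[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))
                if color_at(ny, nx) == wanted]
        return sum(vals) // len(vals)

    out = []
    for y in range(h):
        row = []
        for x in range(w):
            row.append(tuple(mosaic[y][x] if color_at(y, x) == c
                             else neighbors_avg(y, x, c)
                             for c in ('R', 'G', 'B')))
        out.append(row)
    return out

# A flat gray scene: every sample is 128, so every reconstructed
# pixel should come out as (128, 128, 128).
flat = [[128] * 4 for _ in range(4)]
rgb = demosaic_rggb(flat)
print(rgb[0][0])  # → (128, 128, 128)
```

Production pipelines use edge-aware interpolation rather than plain averaging, but the structure — one sample per pixel in, three per pixel out — is the same.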
  • the image processing unit 1005 can be configured by, for example, a microcomputer equipped with firmware.
  • the operation input unit 1006 receives an operation input from the user of the camera 1000.
  • for this operation input unit 1006, for example, a push button or a touch panel can be used.
  • the operation input received by the operation input unit 1006 is transmitted to the imaging control unit 1003 and the image processing unit 1005. After that, a process according to the operation input, for example, a process of capturing an image of a subject is started.
  • the frame memory 1007 is a memory that stores a frame that is an image signal for one screen.
  • the frame memory 1007 is controlled by the image processing unit 1005 and holds frames in the process of image processing.
  • the display unit 1008 displays the image processed by the image processing unit 1005.
  • a liquid crystal panel can be used for the display unit 1008.
  • the recording unit 1009 records the image processed by the image processing unit 1005.
  • for the recording unit 1009, for example, a memory card or a hard disk can be used.
  • the present technology can be applied to the image sensor 1002 among the configurations described above.
  • the image pickup devices 1, 200, 300, 400, 500, and 600 described in FIGS. 1 to 14 can be applied to the image pickup device 1002.
  • by applying the technology according to the present disclosure to the image sensor 1002, image noise (for example, horizontal streak noise) can be reduced.
  • the image processing unit 1005 is an example of the processing circuit described in the claims.
  • the camera 1000 is an example of the imaging device described in the claims.
  • the camera has been described as an example, but the technology according to the present disclosure may be applied to other devices such as a monitoring device. Further, the present disclosure can be applied to a semiconductor device in the form of a semiconductor module as well as an electronic device such as a camera. Specifically, the technology according to the present disclosure can be applied to an imaging module which is a semiconductor module in which the imaging device 1002 and the imaging control unit 1003 of FIG. 21 are enclosed in one package.
  • FIG. 22 is a diagram showing an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure can be applied.
  • FIG. 22 illustrates a situation in which an operator (doctor) 11131 is operating on a patient 11132 on a patient bed 11133 using the endoscopic surgery system 11000.
  • the endoscopic surgery system 11000 includes an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, and a support arm device 11120 that supports the endoscope 11100.
  • a cart 11200 on which various devices for endoscopic surgery are mounted.
  • the endoscope 11100 includes a lens barrel 11101 into which a region of a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the base end of the lens barrel 11101.
  • in the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having the rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
  • An opening in which an objective lens is fitted is provided at the tip of the lens barrel 11101.
  • a light source device 11203 is connected to the endoscope 11100, and the light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101, and is emitted through the objective lens toward the observation target in the body cavity of the patient 11132.
  • the endoscope 11100 may be a direct-viewing endoscope, or may be a perspective or side-viewing endoscope.
  • An optical system and an image pickup device are provided inside the camera head 11102, and the reflected light (observation light) from the observation target is condensed on the image pickup device by the optical system.
  • the observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to a camera control unit (CCU: Camera Control Unit) 11201.
  • the CCU 11201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and integrally controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives the image signal from the camera head 11102, and performs various image processing such as development processing (demosaic processing) on the image signal for displaying an image based on the image signal.
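As a rough illustration of what development (demosaic) processing does, the sketch below reconstructs a color image from an RGGB Bayer mosaic using a crude per-cell scheme. The mosaic layout and the averaging of the two green samples are assumptions for illustration only, not the actual algorithm of the CCU 11201.

```python
import numpy as np

def demosaic_nearest(raw):
    """Crude demosaic of an RGGB Bayer mosaic: each 2x2 cell yields one
    (R, G, B) value that is replicated over the cell (illustration only)."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=np.uint8)
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            r = int(raw[y, x])                                   # top-left: red sample
            g = (int(raw[y, x + 1]) + int(raw[y + 1, x])) // 2   # average of two green samples
            b = int(raw[y + 1, x + 1])                           # bottom-right: blue sample
            rgb[y:y + 2, x:x + 2] = (r, g, b)
    return rgb
```

Production demosaicing interpolates per pixel rather than per 2x2 cell; this block only shows the mosaic-to-color step conceptually.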
  • the display device 11202 displays an image based on the image signal subjected to the image processing by the CCU 11201 under the control of the CCU 11201.
  • the light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), and supplies the endoscope 11100 with irradiation light when imaging a surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • the user can input various kinds of information and instructions to the endoscopic surgery system 11000 via the input device 11204.
  • for example, the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 11100.
  • the treatment instrument control device 11205 controls the driving of the energy treatment instrument 11112 for tissue cauterization, incision, blood vessel sealing, or the like.
  • the pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 in order to inflate the body cavity for the purpose of securing the field of view of the endoscope 11100 and the working space of the operator.
  • the recorder 11207 is a device capable of recording various information regarding surgery.
  • the printer 11208 is a device capable of printing various information regarding surgery in various formats such as text, images, and graphs.
  • the light source device 11203 that supplies irradiation light to the endoscope 11100 when imaging a surgical site can be configured by, for example, an LED, a laser light source, or a white light source configured by a combination thereof.
  • when a white light source is formed by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted by the light source device 11203.
  • the laser light from each of the RGB laser light sources may be irradiated onto the observation target in a time-division manner, and the driving of the image pickup element of the camera head 11102 may be controlled in synchronization with the irradiation timing, so that images corresponding to each of R, G, and B are captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter on the image pickup element.
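The time-division capture described above can be sketched as follows: three monochrome frames, each taken in a successive time slot under R, G, or B laser illumination, are stacked into one color image. The frame contents below are hypothetical values for illustration.

```python
import numpy as np

def combine_time_division_frames(frame_r, frame_g, frame_b):
    """Stack three monochrome frames, captured in successive time slots
    under R, G, and B illumination, into one H x W x 3 color image."""
    return np.stack([frame_r, frame_g, frame_b], axis=-1)

# Hypothetical 2x2 monochrome frames from three consecutive time slots
r = np.full((2, 2), 10, dtype=np.uint8)
g = np.full((2, 2), 50, dtype=np.uint8)
b = np.full((2, 2), 90, dtype=np.uint8)
color = combine_time_division_frames(r, g, b)
```

Because each frame already isolates one primary color, no per-pixel color filter (and hence no demosaicing) is needed.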
  • the drive of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals.
  • by controlling the drive of the image pickup element of the camera head 11102 in synchronization with the timing of the change of the light intensity to acquire images in a time-division manner and synthesizing those images, an image with a high dynamic range, free from so-called blocked-up shadows (blackout) and blown-out highlights (whiteout), can be generated.
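A minimal sketch of that synthesis, assuming two frames taken under low and high illumination and a known 2x exposure ratio (both assumptions for illustration, not part of this disclosure): saturated regions of the bright frame are replaced by the rescaled dark frame.

```python
import numpy as np

def fuse_light_modulated(frame_low, frame_high, sat_threshold=200, exposure_ratio=2.0):
    """Naive high-dynamic-range synthesis: keep the bright frame where it
    is well exposed, and fall back to the rescaled dark frame where the
    bright frame is saturated (blown out)."""
    low = frame_low.astype(np.float32)
    high = frame_high.astype(np.float32)
    saturated = high >= sat_threshold
    fused = np.where(saturated, low * exposure_ratio, high)
    return np.clip(fused, 0, 255).astype(np.uint8)
```

Real HDR pipelines weight every pixel smoothly rather than hard-switching at a threshold; the hard mask here just makes the principle visible.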
  • the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • in special light observation, for example, so-called narrow band imaging is performed, in which the wavelength dependence of light absorption in body tissue is utilized: by irradiating light of a narrower band than the irradiation light used in normal observation (that is, white light), predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast.
  • alternatively, fluorescence observation may be performed, in which an image is obtained from the fluorescence generated by irradiation with excitation light. In fluorescence observation, the body tissue may be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 can be configured to be capable of supplying narrowband light and/or excitation light compatible with such special light observation.
  • FIG. 23 is a block diagram showing an example of the functional configuration of the camera head 11102 and the CCU 11201 shown in FIG.
  • the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a driving unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • the CCU 11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • the camera head 11102 and the CCU 11201 are communicably connected to each other via a transmission cable 11400.
  • the lens unit 11401 is an optical system provided at the connecting portion with the lens barrel 11101.
  • the observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the image pickup unit 11402 is composed of an image pickup element.
  • the number of image pickup elements forming the image pickup section 11402 may be one (so-called single-plate type) or plural (so-called multi-plate type).
  • in the case of the multi-plate type, for example, image signals corresponding to each of R, G, and B are generated by the respective image pickup elements, and a color image may be obtained by combining them.
  • the image capturing unit 11402 may be configured to include a pair of image pickup elements for respectively acquiring image signals for the right eye and the left eye corresponding to 3D (three-dimensional) display.
  • the 3D display enables the operator 11131 to more accurately understand the depth of the living tissue in the operation site.
  • a plurality of lens units 11401 may be provided corresponding to each image pickup element.
  • the image pickup unit 11402 does not necessarily have to be provided in the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the drive unit 11403 is composed of an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Accordingly, the magnification and focus of the image captured by the image capturing unit 11402 can be adjusted appropriately.
  • the communication unit 11404 is configured by a communication device for transmitting/receiving various information to/from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 receives a control signal for controlling the driving of the camera head 11102 from the CCU 11201 and supplies it to the camera head control unit 11405.
  • the control signal includes information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of capture, and/or information specifying the magnification and focus of the captured image.
  • the imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately designated by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are mounted on the endoscope 11100.
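As an illustration of the AE function only (the target brightness and proportional gain are assumed values, not part of this disclosure), one automatic-exposure update step can be sketched as:

```python
def auto_exposure_step(current_exposure, mean_brightness, target=118.0, gain=0.5):
    """One proportional AE update: scale the exposure so that the mean
    image brightness moves toward the target value."""
    error = (target - mean_brightness) / target
    return current_exposure * (1.0 + gain * error)
```

For a frame whose mean brightness is half the target, this raises the exposure by 25% in a single step; the loop repeats frame by frame until the error settles near zero.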
  • the camera head control unit 11405 controls driving of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is composed of a communication device for transmitting and receiving various information to and from the camera head 11102.
  • the communication unit 11411 receives the image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling the driving of the camera head 11102 to the camera head 11102.
  • the image signal and the control signal can be transmitted by electric communication, optical communication, or the like.
  • the image processing unit 11412 performs various types of image processing on the image signal that is the RAW data transmitted from the camera head 11102.
  • the control unit 11413 performs various controls regarding imaging of a surgical site or the like by the endoscope 11100 and display of a captured image obtained by imaging the surgical site or the like. For example, the control unit 11413 generates a control signal for controlling the driving of the camera head 11102.
  • control unit 11413 causes the display device 11202 to display a captured image of the surgical site or the like based on the image signal subjected to the image processing by the image processing unit 11412.
  • the control unit 11413 may recognize various objects in the captured image using various image recognition techniques.
  • for example, by detecting the shape, color, and the like of the edges of objects included in the captured image, the control unit 11413 can recognize surgical instruments such as forceps, a specific body part, bleeding, mist during use of the energy treatment instrument 11112, and so on.
  • the control unit 11413 may superimpose various types of surgery support information on the image of the surgical site using the recognition result. By superimposing the surgery support information and presenting it to the operator 11131, the burden on the operator 11131 can be reduced and the operator 11131 can proceed with the surgery reliably.
  • the transmission cable 11400 that connects the camera head 11102 and the CCU 11201 is an electric signal cable compatible with electric signal communication, an optical fiber compatible with optical communication, or a composite cable of these.
  • in the illustrated example, wired communication is performed using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technology according to the present disclosure can be applied to the imaging unit 11402 of the camera head 11102 among the configurations described above.
  • the image pickup devices 1, 200, 300, 400, 500, and 600 described with reference to FIGS. 1 to 14 can be applied to the image pickup unit 11402.
  • by applying the technique according to the present disclosure, it is possible to reduce the occurrence of image noise and prevent deterioration of the image quality of the captured image.
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, or robot.
  • FIG. 24 is a block diagram showing a schematic configuration example of a vehicle control system which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
  • a microcomputer 12051, an audio/video output unit 12052, and an in-vehicle network I/F (interface) 12053 are shown as a functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • for example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a head lamp, a back lamp, a brake lamp, a winker, or a fog lamp.
  • for example, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020. The body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
  • the vehicle exterior information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
  • the image pickup unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the image capturing unit 12031 to capture an image of the vehicle exterior and receives the captured image.
  • based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as people, vehicles, obstacles, signs, or characters on the road surface, or may perform distance detection processing.
  • the image pickup unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of received light.
  • the image pickup unit 12031 can output the electric signal as an image or as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
  • the in-vehicle information detection unit 12040 detects in-vehicle information.
  • a driver state detection unit 12041 that detects the state of the driver is connected to the in-vehicle information detection unit 12040.
  • the driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
  • the microcomputer 12051 can calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • for example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the vehicle, follow-up traveling based on the inter-vehicle distance, vehicle speed maintenance traveling, vehicle collision warning, or vehicle lane departure warning.
  • the microcomputer 12051 can control the driving force generation device, the steering mechanism, the braking device, or the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, thereby performing cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the outside information detection unit 12030.
  • for example, the microcomputer 12051 can perform cooperative control for antiglare purposes, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
  • the audio/video output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to an occupant of the vehicle or to the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 25 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, 12105 as the imaging unit 12031.
  • the image capturing units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose of the vehicle 12100, the side mirrors, the rear bumper, the back door, and the upper part of the windshield inside the vehicle.
  • the image capturing unit 12101 provided on the front nose and the image capturing unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire images in front of the vehicle 12100.
  • the imaging units 12102 and 12103 included in the side mirrors mainly acquire images of the side of the vehicle 12100.
  • the image capturing unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
  • the front images acquired by the image capturing units 12101 and 12105 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic signal, a traffic sign, a lane, or the like.
  • FIG. 25 shows an example of the shooting range of the imaging units 12101 to 12104.
  • the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the image capturing units 12101 to 12104 may be a stereo camera including a plurality of image capturing elements or may be an image capturing element having pixels for phase difference detection.
  • based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (relative speed with respect to the vehicle 12100). In particular, the closest three-dimensional object on the traveling path of the vehicle 12100 that travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more) can be extracted as a preceding vehicle.
  • the microcomputer 12051 can set an inter-vehicle distance to be secured in front of the preceding vehicle in advance, and can perform automatic braking control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of autonomous driving, which autonomously travels without depending on the operation of the driver.
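The preceding-vehicle extraction above can be sketched as follows; the object record, its fields, and the speed threshold are illustrative assumptions, not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    distance_m: float          # current distance from the own vehicle
    relative_speed_kmh: float  # temporal change of that distance
    on_path: bool              # lies on the traveling path of the own vehicle

def extract_preceding_vehicle(objects, own_speed_kmh, min_speed_kmh=0.0):
    """Pick the closest on-path object traveling in substantially the
    same direction at or above the minimum absolute speed."""
    candidates = [
        o for o in objects
        if o.on_path and (own_speed_kmh + o.relative_speed_kmh) >= min_speed_kmh
    ]
    return min(candidates, key=lambda o: o.distance_m, default=None)
```

A follow-up controller would then regulate braking and acceleration to hold the preset inter-vehicle distance to the returned object.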
  • using the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data into categories such as two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract the objects, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles visible to the driver of the vehicle 12100 and obstacles difficult for the driver to see.
  • the microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, the microcomputer 12051 can assist the driver in avoiding the collision by outputting a warning to the driver through the audio speaker 12061 and the display unit 12062, or by performing forced deceleration or avoidance steering through the drive system control unit 12010.
  • At least one of the image capturing units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether the object is a pedestrian.
  • when the microcomputer 12051 recognizes a pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose a rectangular contour line on the recognized pedestrian for emphasis. The audio image output unit 12052 may also control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
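The rectangular emphasis overlay can be sketched as below; drawing directly into a grayscale array is an illustrative simplification of what the display control achieves.

```python
import numpy as np

def draw_emphasis_box(image, top, left, bottom, right, value=255):
    """Superimpose a rectangular contour line onto the image (in place)
    to emphasize a recognized pedestrian region."""
    image[top, left:right + 1] = value       # top edge
    image[bottom, left:right + 1] = value    # bottom edge
    image[top:bottom + 1, left] = value      # left edge
    image[top:bottom + 1, right] = value     # right edge
    return image
```

Only the four edges are written, so the pedestrian inside the rectangle remains visible.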
  • the technology according to the present disclosure can be applied to the imaging unit 12031 among the configurations described above.
  • the image pickup devices 1, 200, 300, 400, 500 and 600 described with reference to FIGS. 1 to 14 can be applied to the image pickup unit 12031.
  • the drawings in the above-described embodiments are schematic, and the dimensional ratios and the like of the respective parts do not necessarily match the actual ones. Needless to say, the dimensional relationships and ratios may also differ between drawings.
  • the present technology may have the following configurations.
  • The semiconductor device according to any one of (1) to (3), wherein the semiconductor chip has a substantially rectangular shape in top view, and the conductor is arranged in a square shape in top view so as to cover the entire outer surface of the semiconductor chip.
  • The semiconductor device according to (1), wherein the first conductor has a substantially rectangular shape in top view, the second conductor is arranged in a square shape in top view so as to cover the entire outer surface of the semiconductor chip, the semiconductor chip is mounted on the upper side of the first conductor, and the size of the first conductor is larger than the size of the square-shaped second conductor.
  • Reference Signs List: 1, 200, 300, 400, 500, 600, 800, 900 Imaging device; 110, 210, 310, 410, 510, 610, 810, 910 Substrate; 120, 220, 320, 420, 540, 640, 820, 920 Imaging element chip; 131, 132, 231, 232, 331, 332, 431, 432, 561, 562, 661 Conductor; 140, 240, 340, 830 Bonding wire; 211 Recessed portion; 351 Cutting portion; 520, 620 Frame; 530, 630 Seal glass; 521, 621 Space; 551, 651 Solder; 552, 652 Underfill; 1000 Camera; 1002 Imaging device; 10402, 12031, 12101 to 12105 Imaging unit


Abstract

The present invention reduces noise generated in a semiconductor device. The semiconductor device comprises a semiconductor chip and a conductor. The semiconductor chip included in the semiconductor device has a light-receiving surface for generating an image signal, and outputs the image signal generated on the light-receiving surface. The conductor included in the semiconductor device is arranged in a portion of the semiconductor device other than the light-receiving surface of the semiconductor chip for generating the image signal.
PCT/JP2019/043904 2018-12-03 2019-11-08 Dispositif à semi-conducteur et dispositif d'imagerie WO2020116088A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-226475 2018-12-03
JP2018226475A JP2020092114A (ja) 2018-12-03 2018-12-03 半導体装置および撮像装置

Publications (1)

Publication Number Publication Date
WO2020116088A1 true WO2020116088A1 (fr) 2020-06-11

Family

ID=70974641

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/043904 WO2020116088A1 (fr) 2018-12-03 2019-11-08 Dispositif à semi-conducteur et dispositif d'imagerie

Country Status (2)

Country Link
JP (1) JP2020092114A (fr)
WO (1) WO2020116088A1 (fr)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003309196A (ja) * 2002-04-16 2003-10-31 Sony Corp 磁気不揮発性メモリ素子の磁気シールドパッケージ
JP2004063765A (ja) * 2002-07-29 2004-02-26 Fuji Photo Film Co Ltd 固体撮像装置およびその製造方法
JP2004128143A (ja) * 2002-10-01 2004-04-22 Sharp Corp 光結合装置及びその製造方法並びに電子機器
JP2011147091A (ja) * 2010-01-18 2011-07-28 Sharp Corp カメラモジュールおよびその製造方法、電子情報機器
JP2012104796A (ja) * 2010-07-15 2012-05-31 Fujitsu Ltd 電子機器
JP2012109307A (ja) * 2010-11-15 2012-06-07 Renesas Electronics Corp 半導体装置及び半導体装置の製造方法
JP2015015662A (ja) * 2013-07-08 2015-01-22 株式会社ニコン 撮像ユニット及び撮像装置
JP2015065223A (ja) * 2013-09-24 2015-04-09 株式会社東芝 半導体装置及びその製造方法
JP2015119133A (ja) * 2013-12-20 2015-06-25 株式会社村田製作所 撮像装置
JP2016070848A (ja) * 2014-09-30 2016-05-09 株式会社東芝 磁気シールドパッケージ


Also Published As

Publication number Publication date
JP2020092114A (ja) 2020-06-11


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19894317

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19894317

Country of ref document: EP

Kind code of ref document: A1