WO2017169822A1 - Solid-state imaging element, imaging device, endoscope device, and electronic apparatus


Info

Publication number
WO2017169822A1
Authority
WO
WIPO (PCT)
Prior art keywords
image, imaging, solid, state, light
Prior art date
Application number
PCT/JP2017/010578
Other languages
English (en)
Japanese (ja)
Inventor
清輝 黒木
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2017169822A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof

Definitions

  • The present disclosure relates to a solid-state imaging element, an imaging apparatus, an endoscope apparatus, and an electronic device, and in particular to a solid-state imaging element, imaging apparatus, endoscope apparatus, and electronic device that are small and can capture a high-definition binocular image.
  • When combining two image sensors that each capture one of two parallax-generating images, in order to measure distance or to form a three-dimensional image, the conventional structure arranges two image sensor modules in parallel and bonds them together (see Non-Patent Document 1).
  • However, since the binocular solid-state imaging device of the technique described in Non-Patent Document 1 is packaged per imaging device, it is difficult to realize the structure in a small-diameter tube such as an endoscope, and there is a risk that the resolution of the image sensors is sacrificed in order to reduce the size.
  • The present disclosure has been made in view of such a situation, and realizes a high-resolution, small binocular solid-state imaging element by bonding the back surfaces of the imaging elements together and bonding prisms before packaging.
  • A solid-state imaging element according to one aspect of the present disclosure is a solid-state imaging element in which two imaging elements that capture images are bonded together at the back sides of their effective imaging surfaces, and a mirror that makes light from the imaging direction incident is bonded to each of the effective imaging surfaces of the two imaging elements.
  • Two lenses that independently condense light incident from the imaging direction onto the two imaging elements, and a color-mixing prevention plate that ensures the light condensed by each lens enters only its corresponding imaging element from the imaging direction, can further be included.
  • In each of the two imaging elements, the short side of the effective imaging surface can be arranged on the light incident side.
  • the mirror can be a prism mirror.
  • the two image sensors can be bonded together with an adhesive.
  • the two image sensors can be bonded together by plasma bonding.
  • The mirror may be a prism mirror, the two image sensors may be formed on both surfaces of a single wafer, and the prisms of the prism mirrors may be bonded to both surfaces of the wafer.
  • Both of the two image sensors can have the same spectral sensitivity.
  • the two image sensors can have different spectral sensitivities.
  • the reading order of the two image sensors can be point-symmetric with each other.
  • When the reading order of the two image sensors is the same, a conversion unit can further be included that temporarily stores the output signals of the two image sensors and converts the image of one of the output signals so that it has the same image orientation as the image of the other output signal.
  • the two imaging elements can each include the conversion unit.
  • the conversion unit may be provided in a logic chip sandwiched between the two image sensors.
  • An endoscope apparatus according to one aspect of the present disclosure is an endoscope apparatus in which two imaging elements that capture images are bonded together at the back sides of their effective imaging surfaces, and a mirror that makes light from the imaging direction incident is bonded to each of the effective imaging surfaces of the two imaging elements.
  • An electronic device according to one aspect of the present disclosure is an electronic device in which two imaging elements that capture images are bonded together at the back sides of their effective imaging surfaces, and a mirror that makes light from the imaging direction incident is bonded to each of the effective imaging surfaces of the two imaging elements.
  • An imaging apparatus according to one aspect of the present disclosure is an imaging apparatus in which two imaging elements that capture images are bonded together at the back sides of their effective imaging surfaces, and a mirror that makes light from the imaging direction incident is bonded to each of the effective imaging surfaces of the two imaging elements.
  • In one aspect of the present disclosure, two imaging elements that capture images are bonded together at the back sides of their effective imaging surfaces, and a mirror that makes light from the imaging direction incident is attached to each of the effective imaging surfaces of the two imaging elements.
  • FIG. 5 is a diagram illustrating a configuration example of a memory logic circuit for performing signal processing on the image signal of FIG. 4.
  • FIG. 5 is a diagram for explaining another example of the configuration of a memory logic circuit for performing signal processing on the image signal of FIG. 4.
  • A diagram explaining another example of the image sensor module.
  • A diagram explaining still another example of the image sensor module.
  • A block diagram showing a configuration example of an imaging apparatus as an electronic device to which the present technology is applied.
  • A diagram explaining usage examples of the solid-state imaging device to which the technology of the present disclosure is applied.
  • A block diagram showing an example of the schematic configuration of an in-vivo information acquisition system.
  • A diagram showing an example of the schematic configuration of an endoscopic surgery system.
  • A block diagram showing an example of the functional configuration of a camera head and a CCU.
  • A block diagram showing an example of the schematic configuration of a vehicle control system.
  • An explanatory diagram showing an example of the installation positions of a vehicle exterior information detection unit and an imaging unit.
  • A conventional endoscope apparatus 1 using a binocular solid-state image sensor 2 inserted, into the endoscope tube 5, a binocular solid-state image sensor 2 configured by bonding general single-lens image sensor modules 2a-1 and 2a-2 side by side.
  • The image sensor modules 2a-1 and 2a-2 are provided with image sensors 3-1 and 3-2, each composed of a CMOS (Complementary Metal Oxide Semiconductor) sensor or the like, and flexible wirings 4-1 and 4-2.
  • Since the binocular solid-state imaging device 2 in FIG. 1 is packaged with the respective imaging device modules 2a-1 and 2a-2, the resolution of the image sensors tends to be sacrificed in order to realize the structure in a small-diameter tube such as an endoscope.
  • In contrast, the binocular solid-state imaging device of the present disclosure is configured to capture a high-resolution image even in a small-diameter tube such as an endoscope.
  • the binocular solid-state imaging device of the present disclosure can be applied to, for example, the imaging device module 22 of the endoscope apparatus 11 as shown in FIG.
  • The lower left part of FIG. 2 is a front view of the endoscope apparatus 11 of the present disclosure, seen from the imaging direction, in an example in which the two imaging regions Z11 and Z12 of the binocular configuration are arranged in the vertical direction in the figure.
  • The lower right part of FIG. 2 is a side sectional view of the endoscope apparatus 11 when the imaging regions Z11 and Z12 are arranged in the vertical direction.
  • The upper right part of FIG. 2 is a side sectional view of the endoscope apparatus 11 when the imaging regions Z11 and Z12 are arranged in the horizontal direction.
  • FIG. 2 shows an example in which the binocular solid-state imaging device of the present disclosure is used as the imaging device module 22 provided in the endoscope tube 21 of the endoscope apparatus 11, but it may also be provided elsewhere than in the endoscope tube 21.
  • The endoscope apparatus 11 includes a color-mixing prevention plate 31, lenses 32-1 and 32-2, prism mirrors 33-1 and 33-2, imaging elements 34-1 and 34-2, a flexible wiring reinforcing resin 35, and flexible wirings 36-1 and 36-2.
  • the color mixing prevention plate 31, the prism mirrors 33-1 and 33-2, and the image pickup devices 34-1 and 34-2 constitute the image pickup device module 22.
  • The imaging element module 22 may further include any or all of the lenses 32-1 and 32-2, the flexible wiring reinforcing resin 35, and the flexible wirings 36-1 and 36-2.
  • The imaging elements 34-1 and 34-2 are formed by bonding the substrates constituting them with an adhesive while aligned with high accuracy; each captures an image based on the incident light collected by the lens 32-1 or 32-2, converts the captured image into an image signal, and outputs the image signal to a subsequent apparatus via the flexible wiring 36-1 or 36-2.
  • Alternatively, the imaging devices 34-1 and 34-2 may be plasma-bonded at the wafer stage during manufacturing, without using an adhesive.
  • In the case of plasma bonding, since there is no adhesive of a different material between the imaging devices 34-1 and 34-2, no difference in linear expansion coefficient arises, and distortion due to thermal effects does not occur.
  • As a result, the image sensor module 22 can be configured as a binocular solid-state image sensor suitable for a wide range of applications.
  • The color-mixing prevention plate 31 is a light-shielding plate that partitions the endoscope tube at an intermediate position between the bonded imaging elements 34-1 and 34-2 and between the lenses 32-1 and 32-2. That is, the color-mixing prevention plate 31 ensures that light incident through the optical path L1 enters the imaging region Z11 of the imaging device 34-1 and that light incident through the optical path L2 enters the imaging region Z12 of the imaging device 34-2. In other words, the color-mixing prevention plate 31 prevents light incident through the optical path L1 from entering the imaging region Z12 of the imaging device 34-2, and light incident through the optical path L2 from entering the imaging region Z11 of the imaging device 34-1, thereby suppressing the occurrence of color mixing.
  • The prism mirrors 33-1 and 33-2 reflect the light of the optical paths L1 and L2 incident from the imaging direction onto the imaging regions Z11 and Z12 of the imaging devices 34-1 and 34-2, respectively.
  • The prism mirrors 33-1 and 33-2 are configured with the minimum size that can cover the imaging regions Z11 and Z12. For example, when the imaging elements 34-1 and 34-2 are rectangles with an aspect ratio of 16:9 or 4:3, a smaller image sensor module 22 can be configured if the prism mirrors 33-1 and 33-2 are just large enough to cover the short sides of the imaging elements 34-1 and 34-2, respectively.
  • the optical paths L1 and L2 of the incident light are reflected by the prism mirrors 33-1 and 33-2 and are incident on the respective image pickup devices 34-1 and 34-2.
  • As long as the optical paths L1 and L2 of the incident light are guided to the imaging devices 34-1 and 34-2, other configurations may be used; for example, a simple mirror may be provided instead.
  • The connection portions between the terminals of the imaging devices 34-1 and 34-2 and the flexible wirings 36-1 and 36-2 are fixed by the flexible wiring reinforcing resin 35, which reinforces the connection strength.
  • In this example, the flexible wirings 36-1 and 36-2 are connected to the output terminal portions of the image sensor module 22, and the image signals output from the image sensors 34-1 and 34-2 are output through them.
  • However, the method of outputting image signals from the image sensors 34-1 and 34-2 is not limited to the flexible wirings 36-1 and 36-2; a structure in which lead wires are joined to the image sensors 34-1 and 34-2 may also be used.
  • flexible wiring and lead wires can be pulled out along the endoscope tube 21 without being bent in the endoscope tube 21.
  • Since the flexible wirings 36-1 and 36-2 are not bent, assembly is easy, and because no bending occurs, the stress applied to the connection portions is reduced and the connection reliability is improved.
  • The flexible wiring reinforcing resin 35 not only improves the connection reliability of the connection portions between the flexible wirings 36-1 and 36-2 and the terminal portions of the imaging devices 34-1 and 34-2, but can also protect the end faces of the imaging devices 34-1 and 34-2 and the end faces of the prism mirrors 33-1 and 33-2.
  • Since the imaging devices 34-1 and 34-2 are made of silicon chips and are covered by the prism mirrors 33-1 and 33-2, chipping or cracking of the chips of the imaging devices 34-1 and 34-2 can be suppressed.
  • In FIG. 3, the upper left part of the drawing is a front view, seen from the imaging direction, of the imaging element module 22, which is the binocular solid-state imaging element of the present disclosure, inserted into the endoscope tube 21.
  • the upper right part in the figure is a side cross-sectional view when the imaging element module 22 is inserted into the endoscope tube 21.
  • The upper part of FIG. 3 illustrates the diameter of the endoscope tube 21 as viewed from the front (from the imaging direction) when the imaging element module 22 is inserted into the endoscope tube 21; the lenses 32-1 and 32-2 are therefore omitted.
  • The lower left part of the figure is a front view, seen from the imaging direction, of the conventional binocular solid-state imaging device 2 inserted into the endoscope tube 5, and the lower right part of the figure is a side sectional view of the conventional binocular solid-state imaging device 2 inserted into the endoscope tube 5.
  • As shown, the image sensor modules 2a-1 and 2a-2 in the conventional binocular solid-state image sensor 2 have imaging regions Z21 and Z22 in the image sensors 3-1 and 3-2, respectively, and packages 2b-1 and 2b-2 are provided so as to surround a wide area around them.
  • the diameter D2 of the endoscope tube 5 is set according to the outer size of the packages 2b-1 and 2b-2 viewed from the imaging direction. Therefore, the diameter D2 of the endoscope tube 5 cannot be made smaller than the packages 2b-1 and 2b-2.
  • In contrast, the imaging elements 34-1 and 34-2 have a structure in which their back portions are bonded to each other; no structure corresponding to the packages 2b-1 and 2b-2 exists, and only the prism mirrors 33-1 and 33-2, sized to the imaging regions Z11 and Z12 of the imaging devices 34-1 and 34-2, are provided. Therefore, the diameter D1 of the endoscope tube 21 can be set to the minimum size that can accommodate the imaging regions Z11 and Z12 of the imaging elements 34-1 and 34-2.
  • Consequently, when the imaging regions Z21 and Z22 of the imaging devices 3-1 and 3-2 and the imaging regions Z11 and Z12 of the imaging devices 34-1 and 34-2 are the same size, the diameter D1 of the endoscope tube 21 using the imaging device module 22 can be made sufficiently smaller than the diameter D2 of the endoscope tube 5 using the binocular solid-state imaging device 2.
  • Furthermore, when the imaging regions Z11 and Z12, which are the effective imaging surfaces of the imaging devices 34-1 and 34-2, are non-square rectangles, arranging the short side on the light incident side allows the prism mirrors 33-1 and 33-2 to be made even smaller. For this reason, the diameter D1 of the endoscope tube 21 into which the imaging element module 22 is inserted can be further reduced, enabling further miniaturization (a worked example follows below).
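  • As a hedged worked example of this geometry (the dimensions are illustrative assumptions, not values from the publication): for a 4:3 effective imaging surface of 4.0 mm x 3.0 mm, a prism mirror sized to the short side only needs a height h of 3.0 mm, so the two prisms stacked across the back-bonded sensors occupy

```latex
2h = 2 \times 3.0\,\text{mm} = 6.0\,\text{mm}
\;<\;
2\,s_{\text{long}} = 2 \times 4.0\,\text{mm} = 8.0\,\text{mm}
```

  • Orienting the short side toward the incident light thus directly reduces the cross-section, and hence the minimum tube diameter D1, by about 2 mm in this example.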
  • In the imaging element module 22, which is the binocular solid-state imaging element of the present disclosure, when imaging an image P1 depicted as "7", the lenses 32-1 and 32-2 are arranged in the direction perpendicular to the image P1 as viewed from above, and the connection terminals 34a-1 and 34a-2 are arranged rearward of the imaging devices 34-1 and 34-2 with respect to the incident direction.
  • the image P1 incident through the optical paths L1 and L2 is captured as follows.
  • That is, the image P1 incident through the optical path L1 passes through the lens 32-1 and the prism mirror 33-1 and, as shown in the upper right part of the figure, is captured on the imaging device 34-1 as an image P11 that is horizontally inverted with respect to the image P1.
  • Similarly, the image P1 incident through the optical path L2 passes through the lens 32-2 and the prism mirror 33-2 and, as shown in the lower right part of the figure, is captured on the imaging device 34-2 as an upside-down image P12.
  • In this way, the image P11 captured by the imaging device 34-1 and the image P12 captured by the imaging device 34-2 are both captured as inverted images.
  • Therefore, a memory logic circuit 51 for processing the image signals of the images P11 and P12 is provided at the subsequent stage of the imaging devices 34-1 and 34-2, where the signals are processed and output.
  • More specifically, the image sensor 34-1 sequentially reads out pixel signals leftward from the upper-right pixel indicated by the star in the image P11 in the figure; each time reading of one row is completed, it repeats the process of sequentially reading out pixel signals leftward from the pixel in the rightmost column one row below.
  • Similarly, the image sensor 34-2 sequentially reads out pixel signals leftward from the upper-right pixel indicated by the star in the image P12 in the figure; each time reading of one row is completed, it repeats the process of sequentially reading out pixel signals leftward from the pixel in the rightmost column one row below.
  • As a result, the image sensor 34-1 outputs the image P11, in which the left and right of the "7" of the image P1 are inverted, as the image signal S1, and the image sensor 34-2 outputs the image P12, in which the "7" of the image P1 is inverted top to bottom, as the image signal S2.
  • The memory logic circuit 51 stores, as the image signals M1 and M2 as they are, the pixel signal S1 corresponding to the image P11 in which the left and right of the "7" of the image P1 are inverted, and the pixel signal S2 corresponding to the image P12 in which the top and bottom of the "7" of the image P1 are inverted.
  • Next, the memory logic circuit 51 inverts one of the stored image signals M1 and M2; for example, it inverts the image signal M2 top to bottom and stores the result as the image signal M3.
  • the image signal M3 is an image signal obtained by inverting the left and right of “7” of the image P1.
  • the memory logic circuit 51 combines the image signals M1 and M3, generates the parallax information D, and outputs it.
  • In this way, parallax information is output as the signal processing result from the images P11 and P12 captured by the image sensor module 22, as sketched below.
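  • As a minimal sketch of this store-invert-combine flow (the function names, array orientations, and the naive block-matching step are illustrative assumptions, not the circuit's actual implementation):

```python
import numpy as np

def memory_logic(s1: np.ndarray, s2: np.ndarray,
                 max_disp: int = 16, win: int = 5) -> np.ndarray:
    """Sketch of the FIG. 5 flow: store S1/S2 as M1/M2, invert M2 into M3,
    then combine M1 and M3 into parallax information D (here a naive
    sum-of-absolute-differences block matcher, purely for illustration)."""
    m1 = s1.astype(np.int32)     # store image signal M1 as-is
    m2 = s2.astype(np.int32)     # store image signal M2 as-is
    m3 = np.flipud(m2)           # invert M2 top/bottom -> M3, same orientation as M1

    h, w = m1.shape
    half = win // 2
    d = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            ref = m1[y - half:y + half + 1, x - half:x + half + 1]
            best_sad, best_d = None, 0
            for disp in range(0, min(max_disp, x - half) + 1):
                cand = m3[y - half:y + half + 1,
                          x - disp - half:x - disp + half + 1]
                sad = int(np.abs(ref - cand).sum())
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, disp
            d[y, x] = best_d
    return d                     # parallax information D
```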
  • In the above, an example in which parallax information is output as the signal processing result has been described; however, other signal processing may be used as long as it uses the parallax images; for example, a three-dimensional image signal may be generated and output.
  • Also, in the above, an example was described in which the image signal M3 is generated by inverting the image signal M2 in the second step of the processing; conversely, the image signal M1 may be inverted top to bottom to form an image signal M4, and the image signals M2 and M4 may be combined to generate the parallax information D.
  • Furthermore, in the above, the two imaging devices 34-1 and 34-2 have the same spectral sensitivity, and both process image signals obtained by receiving light of the same visible-light wavelength band.
  • However, the two imaging devices may have different spectral sensitivities, and the wavelength distribution of the light received by one of the imaging devices 34 may be outside the visible-light region; for example, one may be configured to receive infrared light so that a temperature distribution can be captured, and signal processing may be performed using a visible-light image and an infrared-light image (a sketch of one such combination follows below).
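  • A hedged sketch of what such combined processing could look like (the red-overlay fusion rule and all names here are assumptions made for illustration; the publication does not specify the processing):

```python
import numpy as np

def fuse_visible_ir(visible_rgb: np.ndarray, ir: np.ndarray,
                    alpha: float = 0.4) -> np.ndarray:
    """Blend a normalized infrared frame over an 8-bit visible RGB frame as a
    red overlay, e.g. to visualize a temperature distribution (illustrative)."""
    ir_norm = (ir.astype(np.float64) - ir.min()) / max(float(np.ptp(ir)), 1e-9)
    overlay = np.zeros(visible_rgb.shape, dtype=np.float64)
    overlay[..., 0] = ir_norm                 # IR intensity drives the red channel
    fused = (1.0 - alpha) * (visible_rgb / 255.0) + alpha * overlay
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)
```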
  • In the above, an example was described in which pixel signals are read out sequentially leftward from the upper-right pixels indicated by the stars in FIG. 5 in each of the images P11 and P12, so that the horizontally inverted image P11 and the vertically inverted image P12 are supplied as the image signals S1 and S2; however, the inversion processing described above may be omitted by matching the readout start position and readout order of the pixels to the orientation of the image.
  • That is, as shown in FIG. 7, the image sensor 34-1 sequentially reads the image P11 leftward from the upper-right pixel indicated by the star, and each time reading of one row is completed, repeats the process of sequentially reading out pixel signals leftward from the pixel in the rightmost column one row below.
  • In contrast, the image sensor 34-2 sequentially reads the image P12 rightward from the lower-left pixel indicated by the star, and each time reading of one row is completed, repeats the process of sequentially reading out pixel signals rightward from the pixel in the leftmost column one row above.
  • Since this pixel readout order reads out pixel signals at positions that are point-symmetric between the imaging devices 34-1 and 34-2, the image P12 captured by the imaging device 34-2 is read out with its pixel positions inverted top to bottom; at the time the reading is completed, the image P12 read out from the image sensor 34-2 can therefore be handled, on the read-out pixel signal S2, as an image of the image P1 in the same horizontally inverted orientation as the image P11.
  • Therefore, the memory logic circuit 51 can generate the parallax information as it is, without an inversion step; see the sketch below.
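  • The effect of the point-symmetric readout can be checked with a small sketch (the orientation conventions are illustrative assumptions; the point is only that the two readout streams end up in the same orientation without a separate inversion pass):

```python
import numpy as np

scene = np.arange(12).reshape(3, 4)   # stand-in for the scene "7"
p11 = scene[:, ::-1]                  # sensor 34-1 sees a left/right-mirrored image
p12 = scene[::-1, :]                  # sensor 34-2 sees a top/bottom-inverted image

# Sensor 34-1: start at the starred top-right pixel, read leftwards, rows downward.
s1 = p11[:, ::-1].reshape(-1)
# Sensor 34-2: point-symmetric order; start at the starred bottom-left pixel,
# read rightwards, rows upward.
s2 = p12[::-1, :].reshape(-1)

# Both streams now describe the scene in the same orientation, so the memory
# logic circuit can combine them directly, with no inversion step.
assert np.array_equal(s1, s2)
```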
  • The memory logic circuit 51 may acquire the image signals output from the image sensors 34-1 and 34-2 via the terminals 34a-1 and 34a-2 and the flexible wirings 36-1 and 36-2, perform the signal processing described with reference to FIG. 5, and output the signal processing result to the subsequent stage.
  • Alternatively, as shown in FIG. 8, the memory logic circuit 51 may be provided on the substrate (chip) on which the imaging elements 34-1 and 34-2 are bonded, between the imaging region and the terminal 34a; that is, in the example of FIG. 8, the memory logic circuit 51 is laid out on the signal processing chip on the image sensor 34-1. In this case, through electrodes are provided in the substrate to which the imaging devices 34-1 and 34-2 are bonded, the memory logic circuit 51 is provided on either surface, and the memory logic circuit 51 processes the image signals captured by the imaging devices 34-1 and 34-2 and outputs the result through the flexible wiring 36. As a result, the area required to lay out the memory logic circuit 51 can be reduced, so the image sensor module 22 can be further miniaturized.
  • <Other configuration position 2 of the memory logic circuit> As shown by the image sensor module 22 in FIG. 9, the back surfaces of the image sensors 34-1 and 34-2 may be bonded to both surfaces of the memory logic circuit 51, with the image sensors 34-1 and 34-2 each connected to it by through electrodes, and the flexible wiring 71 connected only to the terminal 34a-1 of the image sensor 34-1.
  • Further, as shown in FIG. 10, the imaging elements 34-1 and 34-2 may be formed on a single silicon wafer 101 by double-sided patterning.
  • This makes it possible to configure the image sensor module 22 as a binocular solid-state image sensor that has a small number of parts and is easy to assemble.
  • The binocular solid-state imaging element of the present disclosure is not limited to endoscope apparatuses, and may be incorporated into, for example, a web camera of a personal computer, a smartphone camera, or an automotive safety device.
  • By applying it to such configurations, it becomes possible to create three-dimensional images on a personal computer and to measure the distance to an object.
  • When installed as an automotive safety device, for example, it can be mounted in a very small space such as the back side of a rear-view mirror, so situations in which it cannot be installed for lack of space can be avoided.
  • As described above, by bonding the backs of the two imaging elements together and providing prism mirrors just small enough to make light enter the imaging regions Z11 and Z12 of the imaging elements, with the color-mixing prevention plate interposed between the two optical paths, a small and high-definition binocular solid-state imaging element can be realized.
  • Further, by bonding the back surfaces of the two imaging elements by plasma bonding, or by forming the two imaging elements on both surfaces of a single wafer by patterning, no difference in linear expansion arises between the two imaging elements; the influence of temperature in the manufacturing process and of heat generated by driving the imaging elements can therefore be reduced, which improves the accuracy of combining the two images captured by the two imaging elements.
  • Furthermore, from the parallax information between the two imaging elements, the distance to the subject can be measured and the image can be three-dimensionalized, so an image having depth can be captured (the standard relation behind this is given below).
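  • The publication does not spell out the conversion from parallax to distance, but the standard stereo-triangulation relation behind such measurement, for a rectified pair with focal length f, baseline b between the two lenses 32-1 and 32-2, and disparity d between corresponding pixels, is:

```latex
Z = \frac{f\,b}{d}
```

  • A larger disparity d thus corresponds to a nearer subject, so a per-pixel disparity map such as the parallax information D yields a per-pixel depth estimate.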
  • Further, the light-receiving sensitivity of one imaging element 34 may be given a short-wavelength spectral sensitivity and that of the other imaging element 34 a long-wavelength spectral sensitivity; in that case, a surface-layer image and a deep-part image of the imaged subject can be acquired, and a highly functional composite image can be obtained.
  • The imaging element module 22 described above can be applied to various electronic devices, for example imaging apparatuses such as digital still cameras and digital video cameras, mobile phones having an imaging function, and other devices having an imaging function.
  • FIG. 11 is a block diagram illustrating a configuration example of an imaging apparatus as an electronic apparatus to which the present technology is applied.
  • The imaging apparatus 201 illustrated in FIG. 11 includes an optical system 202, a shutter device 203, a solid-state imaging device 204, a drive circuit 205, a signal processing circuit 206, a monitor 207, and a memory 208, and is capable of capturing still images and moving images.
  • the optical system 202 includes one or more lenses, guides light (incident light) from a subject to the solid-state image sensor 204, and forms an image on the light receiving surface of the solid-state image sensor 204.
  • The shutter device 203 is disposed between the optical system 202 and the solid-state imaging device 204, and controls the light irradiation period and the light-shielding period for the solid-state imaging device 204 according to the control of the drive circuit 205.
  • the solid-state image sensor 204 is configured by a package including the above-described solid-state image sensor.
  • the solid-state imaging device 204 accumulates signal charges for a certain period in accordance with light imaged on the light receiving surface via the optical system 202 and the shutter device 203.
  • the signal charge accumulated in the solid-state image sensor 204 is transferred according to a drive signal (timing signal) supplied from the drive circuit 205.
  • the drive circuit 205 outputs a drive signal for controlling the transfer operation of the solid-state image sensor 204 and the shutter operation of the shutter device 203 to drive the solid-state image sensor 204 and the shutter device 203.
  • the signal processing circuit 206 performs various types of signal processing on the signal charges output from the solid-state imaging device 204.
  • An image (image data) obtained by the signal processing by the signal processing circuit 206 is supplied to the monitor 207 and displayed, or supplied to the memory 208 and stored (recorded).
  • FIG. 12 is a diagram illustrating usage examples of the above-described solid-state imaging device.
  • The imaging device described above can be used in various cases of sensing light such as visible light, infrared light, ultraviolet light, and X-rays, for example:
  • Devices that capture images for viewing, such as digital cameras and mobile devices with camera functions
  • Devices used for traffic, such as in-vehicle sensors that image the rear, surroundings, and interior of a vehicle, surveillance cameras that monitor traveling vehicles and roads, and ranging sensors that measure the distance between vehicles
  • Devices used in home appliances such as TVs, refrigerators, and air conditioners, to image a user's gesture and operate the appliance according to that gesture
  • Devices used for medical and health care, such as endoscopes and devices that perform angiography by receiving infrared light
  • Devices used for security, such as surveillance cameras for crime prevention and cameras for personal authentication
  • Devices used for beauty care, such as skin measuring instruments that image the skin and microscopes that image the scalp
  • Devices used for sports, such as action cameras and wearable cameras for sports applications
  • Devices used for agriculture, such as cameras for monitoring the condition of fields and crops
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be applied to an in-vivo information acquisition system for a patient that uses a capsule endoscope.
  • FIG. 13 is a block diagram illustrating an example of a schematic configuration of a patient in-vivo information acquisition system using a capsule endoscope to which the technique according to the present disclosure (present technique) can be applied.
  • the in-vivo information acquisition system 10001 includes a capsule endoscope 10100 and an external control device 10200.
  • the capsule endoscope 10100 is swallowed by the patient at the time of examination.
  • The capsule endoscope 10100 has an imaging function and a wireless communication function; while moving inside organs such as the stomach and intestine by peristaltic motion or the like until it is naturally excreted from the patient, it sequentially captures images of the inside of the organs (hereinafter also referred to as in-vivo images) at predetermined intervals, and sequentially transmits information about the in-vivo images wirelessly to the external control device 10200 outside the body.
  • The external control device 10200 comprehensively controls the operation of the in-vivo information acquisition system 10001. Further, the external control device 10200 receives the information about the in-vivo images transmitted from the capsule endoscope 10100 and, based on the received information, generates image data for displaying the in-vivo images on a display device (not shown).
  • In this way, in-vivo images of the inside of the patient's body can be obtained at any time from when the capsule endoscope 10100 is swallowed until it is excreted.
  • the capsule endoscope 10100 includes a capsule-type casing 10101.
  • In the casing 10101, a light source unit 10111, an imaging unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power supply unit 10116, and a control unit 10117 are housed.
  • The light source unit 10111 is composed of a light source such as an LED (Light Emitting Diode), for example, and irradiates the imaging field of view of the imaging unit 10112 with light.
  • The imaging unit 10112 includes an imaging element and an optical system composed of a plurality of lenses provided in front of the imaging element. Reflected light (hereinafter referred to as observation light) of the light irradiated onto the body tissue to be observed is condensed by the optical system and enters the imaging element. In the imaging element of the imaging unit 10112, the incident observation light is photoelectrically converted, and an image signal corresponding to the observation light is generated. The image signal generated by the imaging unit 10112 is provided to the image processing unit 10113.
  • the image processing unit 10113 is configured by a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), and performs various signal processing on the image signal generated by the imaging unit 10112.
  • the image processing unit 10113 provides the radio communication unit 10114 with the image signal subjected to signal processing as RAW data.
  • the wireless communication unit 10114 performs predetermined processing such as modulation processing on the image signal that has been subjected to signal processing by the image processing unit 10113, and transmits the image signal to the external control apparatus 10200 via the antenna 10114A.
  • the wireless communication unit 10114 receives a control signal related to drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A.
  • the wireless communication unit 10114 provides a control signal received from the external control device 10200 to the control unit 10117.
  • the power feeding unit 10115 includes a power receiving antenna coil, a power regeneration circuit that regenerates power from a current generated in the antenna coil, a booster circuit, and the like. In the power feeding unit 10115, electric power is generated using a so-called non-contact charging principle.
  • the power supply unit 10116 is composed of a secondary battery, and stores the electric power generated by the power supply unit 10115.
  • In FIG. 13, in order to avoid complicating the drawing, arrows and the like indicating the supply destinations of the power from the power supply unit 10116 are omitted; however, the power stored in the power supply unit 10116 is supplied to the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117, and can be used for driving them.
  • The control unit 10117 includes a processor such as a CPU, and appropriately controls driving of the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power feeding unit 10115 in accordance with control signals transmitted from the external control device 10200.
  • the external control device 10200 is configured by a processor such as a CPU or GPU, or a microcomputer or a control board in which a processor and a storage element such as a memory are mounted.
  • the external control device 10200 controls the operation of the capsule endoscope 10100 by transmitting a control signal to the control unit 10117 of the capsule endoscope 10100 via the antenna 10200A.
  • In the capsule endoscope 10100, for example, the light irradiation conditions for the observation target in the light source unit 10111 can be changed by a control signal from the external control device 10200.
  • Further, imaging conditions (for example, the frame rate or the exposure value in the imaging unit 10112) can be changed by a control signal from the external control device 10200.
  • Further, the contents of the processing in the image processing unit 10113 and the conditions under which the wireless communication unit 10114 transmits image signals (for example, the transmission interval and the number of images transmitted) may be changed by a control signal from the external control device 10200.
  • the external control device 10200 performs various image processing on the image signal transmitted from the capsule endoscope 10100, and generates image data for displaying the captured in-vivo image on the display device.
  • As the image processing, various signal processing can be performed, for example development processing (demosaic processing), image quality enhancement processing (band enhancement processing, super-resolution processing, NR (Noise Reduction) processing, and/or camera-shake correction processing), and/or enlargement processing (electronic zoom processing); a sketch of the demosaic step follows below.
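  • As a hedged sketch of the first of these steps (bilinear demosaicing of an 8-bit RGGB Bayer mosaic; the kernels are the textbook bilinear ones, and this only illustrates the idea of development processing, not the external control device's actual pipeline):

```python
import numpy as np

def demosaic_bilinear(raw: np.ndarray) -> np.ndarray:
    """Bilinear demosaic of an 8-bit RGGB Bayer mosaic (illustrative only)."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=np.float64)
    # Scatter the known samples into their color planes (RGGB tiling).
    rgb[0::2, 0::2, 0] = raw[0::2, 0::2]   # R
    rgb[0::2, 1::2, 1] = raw[0::2, 1::2]   # G
    rgb[1::2, 0::2, 1] = raw[1::2, 0::2]   # G
    rgb[1::2, 1::2, 2] = raw[1::2, 1::2]   # B
    # Fill the missing samples of each plane by neighbor averaging.
    kernel_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    kernel_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    for c, k in ((0, kernel_rb), (1, kernel_g), (2, kernel_rb)):
        plane = rgb[..., c]
        pad = np.pad(plane, 1, mode="reflect")
        out = np.zeros_like(plane)
        for dy in range(3):
            for dx in range(3):
                out += k[dy, dx] * pad[dy:dy + h, dx:dx + w]
        rgb[..., c] = out
    return np.clip(rgb, 0, 255).astype(np.uint8)
```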
  • the external control device 10200 controls driving of the display device to display an in-vivo image captured based on the generated image data.
  • the external control device 10200 may cause the generated image data to be recorded on a recording device (not shown) or may be printed out on a printing device (not shown).
  • the technology according to the present disclosure can be applied to the imaging unit 10112 among the configurations described above.
  • the imaging element module 22 in FIG. 3 can be applied to the imaging unit 10112.
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be applied to an endoscopic surgery system.
  • FIG. 14 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology (present technology) according to the present disclosure can be applied.
  • FIG. 14 illustrates a state in which an operator (doctor) 11131 is performing an operation on a patient 11132 on a patient bed 11133 using an endoscopic operation system 11000.
  • an endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as an insufflation tube 11111 and an energy treatment instrument 11112, and a support arm device 11120 that supports the endoscope 11100. And a cart 11200 on which various devices for endoscopic surgery are mounted.
  • the endoscope 11100 includes a lens barrel 11101 in which a region having a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
  • In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may also be configured as a so-called flexible scope having a flexible lens barrel.
  • An opening into which the objective lens is fitted is provided at the tip of the lens barrel 11101.
  • A light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated toward the observation target in the body cavity of the patient 11132 through the objective lens.
  • The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed on the image sensor by the optical system. Observation light is photoelectrically converted by the imaging element, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted to a camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
  • the CCU 11201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing), for example.
  • the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201.
  • the light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies irradiation light to the endoscope 11100 when photographing a surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • a user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
  • For example, the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 11100.
  • the treatment instrument control device 11205 controls the drive of the energy treatment instrument 11112 for tissue ablation, incision, blood vessel sealing, or the like.
  • the pneumoperitoneum device 11206 passes gas into the body cavity via the pneumoperitoneum tube 11111.
  • the recorder 11207 is an apparatus capable of recording various types of information related to surgery.
  • the printer 11208 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
  • the light source device 11203 that supplies the irradiation light when the surgical site is imaged to the endoscope 11100 can be configured by, for example, a white light source configured by an LED, a laser light source, or a combination thereof.
  • When a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 11203.
  • Further, the driving of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined intervals. By controlling the driving of the imaging element of the camera head 11102 in synchronization with the timing of those intensity changes so as to acquire images in a time-division manner, and then combining the images, an image with a high dynamic range, free of so-called blocked-up shadows and blown-out highlights, can be generated.
  • the light source device 11203 may be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed, in which a predetermined tissue such as a blood vessel in the mucosal surface layer is imaged with high contrast by irradiating light in a narrower band than the irradiation light used in normal observation (that is, white light), utilizing the wavelength dependence of light absorption in body tissue.
  • fluorescence observation may be performed in which an image is obtained by fluorescence generated by irradiating excitation light.
  • In fluorescence observation, it is possible to irradiate body tissue with excitation light and observe the fluorescence from the body tissue (autofluorescence observation), or to locally administer a reagent such as indocyanine green (ICG) to body tissue and irradiate the tissue with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 can be configured to be able to supply narrowband light and / or excitation light corresponding to such special light observation.
  • FIG. 15 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG.
  • the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • the CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • the camera head 11102 and the CCU 11201 are connected to each other by a transmission cable 11400 so that they can communicate with each other.
  • the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. Observation light taken from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the imaging unit 11402 includes an imaging element.
  • The imaging unit 11402 may include one imaging element (a so-called single-plate type) or a plurality of imaging elements (a so-called multi-plate type).
  • When the imaging unit 11402 is of the multi-plate type, for example, image signals corresponding to R, G, and B may be generated by the respective imaging elements, and a color image may be obtained by combining them.
  • the imaging unit 11402 may be configured to include a pair of imaging elements for acquiring right-eye and left-eye image signals corresponding to 3D (Dimensional) display. By performing the 3D display, the operator 11131 can more accurately grasp the depth of the living tissue in the surgical site.
  • the imaging unit 11402 is not necessarily provided in the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the driving unit 11403 is configured by an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Thereby, the magnification and the focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is configured by a communication device for transmitting and receiving various types of information to and from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
  • The control signal includes information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • The imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, the endoscope 11100 is equipped with so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions; a minimal AE sketch follows below.
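  • As a hedged illustration of the AE function only (the target level, gain, and names are invented for this sketch; the actual CCU algorithm is not described in the publication):

```python
def auto_exposure_step(mean_luma: float, exposure: float,
                       target: float = 118.0, gain: float = 0.05) -> float:
    """One proportional AE iteration: nudge the exposure value so that the
    mean luminance of the next frame moves toward the target level."""
    error = (target - mean_luma) / target      # relative brightness error
    return max(exposure * (1.0 + gain * error), 1e-6)

# Example: a dark frame (mean luma 60) on exposure 1.0 raises the next exposure.
print(auto_exposure_step(60.0, 1.0))           # -> about 1.0246
```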
  • the camera head control unit 11405 controls driving of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is configured by a communication device for transmitting and receiving various types of information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102.
  • the image signal and the control signal can be transmitted by electrical communication, optical communication, or the like.
  • the image processing unit 11412 performs various types of image processing on the image signal that is RAW data transmitted from the camera head 11102.
  • the control unit 11413 performs various types of control related to imaging of the surgical site by the endoscope 11100 and display of a captured image obtained by imaging of the surgical site. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
  • control unit 11413 causes the display device 11202 to display a picked-up image showing the surgical part or the like based on the image signal subjected to the image processing by the image processing unit 11412.
  • the control unit 11413 may recognize various objects in the captured image using various image recognition techniques.
  • The control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist during use of the energy treatment tool 11112, and the like by detecting the shapes, colors, and the like of the edges of objects included in the captured image.
  • When displaying the captured image on the display device 11202, the control unit 11413 may superimpose various types of surgery support information on the image of the surgical site using the recognition result. Superimposing the surgery support information and presenting it to the operator 11131 reduces the burden on the operator 11131 and allows the operator 11131 to proceed with the surgery reliably.
  • the transmission cable 11400 for connecting the camera head 11102 and the CCU 11201 is an electric signal cable corresponding to electric signal communication, an optical fiber corresponding to optical communication, or a composite cable thereof.
  • In the illustrated example, communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technology according to the present disclosure can be applied to the imaging unit 11402 of the camera head 11102 among the configurations described above.
  • For example, the imaging element module 22 in FIG. 3 can be applied to the imaging unit 11402.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 16 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
  • In FIG. 16, as the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
  • the vehicle exterior information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
  • the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
  • based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for persons, vehicles, obstacles, signs, characters on the road surface, and the like.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
  • the imaging unit 12031 can output an electrical signal as an image, or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
  • the vehicle interior information detection unit 12040 detects vehicle interior information.
  • a driver state detection unit 12041 that detects the driver's state is connected to the vehicle interior information detection unit 12040.
  • the driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the driver's degree of fatigue or concentration, or may determine whether the driver is dozing off.
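One common way to realize such a dozing-off determination is a PERCLOS-style measure: the fraction of recent frames in which the eyes are mostly closed. The sketch below assumes some upstream detector already yields a per-frame eye-openness value in [0, 1]; the class name, window length, and thresholds are all hypothetical, not taken from the patent.

```python
# Hypothetical drowsiness scoring from a driver-facing camera.
from collections import deque

class DrowsinessEstimator:
    def __init__(self, window_frames=900, closed_thresh=0.2, perclos_alarm=0.4):
        self.window = deque(maxlen=window_frames)  # e.g. 30 s at 30 fps
        self.closed_thresh = closed_thresh
        self.perclos_alarm = perclos_alarm

    def update(self, eye_openness):
        """eye_openness in [0, 1]; returns (PERCLOS, dozing-flag)."""
        self.window.append(eye_openness < self.closed_thresh)
        perclos = sum(self.window) / len(self.window)
        return perclos, perclos >= self.perclos_alarm
```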
  • the microcomputer 12051 can calculate a control target value of the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the drive system control unit 12010.
  • the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, follow-up traveling based on inter-vehicle distance, vehicle-speed-maintaining traveling, vehicle collision warning, vehicle lane departure warning, and the like.
  • the microcomputer 12051 can also perform cooperative control for the purpose of automatic driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • for example, the microcomputer 12051 can control the headlamp according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030, and perform cooperative control for anti-glare purposes, such as switching from high beam to low beam.
  • the audio image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to the vehicle occupants or to the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 17 is a diagram illustrating an example of an installation position of the imaging unit 12031.
  • the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper part of a windshield in the vehicle interior of the vehicle 12100.
  • the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirror mainly acquire an image of the side of the vehicle 12100.
  • the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
  • the forward images acquired by the imaging units 12101 and 12105 are mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 17 shows an example of the imaging ranges of the imaging units 12101 to 12104.
  • the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 viewed from above is obtained, as sketched below.
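A minimal sketch of how such an overhead image can be composed, assuming each camera's 3x3 image-to-ground homography is known from an offline calibration. The homographies themselves, the output size, and the maximum-blending rule where views overlap are assumptions for illustration, not details taken from the patent.

```python
# Compose a top-down view from the four cameras 12101-12104.
import cv2
import numpy as np

def overhead_view(frames, homographies, out_size=(800, 800)):
    """frames: list of BGR images; homographies: list of 3x3 float arrays
    mapping each image plane onto a common ground plane."""
    canvas = np.zeros((out_size[1], out_size[0], 3), dtype=np.uint8)
    for frame, H in zip(frames, homographies):
        warped = cv2.warpPerspective(frame, H, out_size)
        # Keep the brighter (already-filled) pixel where views overlap.
        canvas = np.maximum(canvas, warped)
    return canvas
```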
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object on the traveling path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
  • the microcomputer 12051 can set an inter-vehicle distance to be secured in advance in front of the preceding vehicle, and perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control can be performed for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation. An illustrative sketch of this selection and follow logic appears below.
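The sketch below illustrates the selection logic just described under stated assumptions: each ranged object carries a distance, a relative speed derived from successive frames, and a lane flag from the detection stage. The data structure, the field names, and the fixed 30 m gap are all hypothetical.

```python
# Illustrative preceding-vehicle extraction and follow decision.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class RangedObject:
    distance_m: float      # from stereo / phase-difference pixels
    rel_speed_mps: float   # d(distance)/dt; negative = gap is closing
    on_own_path: bool      # lane assignment from the detection stage

def preceding_vehicle(objs: List[RangedObject],
                      own_speed_mps: float) -> Optional[RangedObject]:
    """Nearest object on the own path whose absolute speed is >= 0 km/h
    in the vehicle's direction (own speed + relative speed >= 0)."""
    same_dir = [o for o in objs
                if o.on_own_path and (own_speed_mps + o.rel_speed_mps) >= 0.0]
    return min(same_dir, key=lambda o: o.distance_m, default=None)

def follow_command(target: Optional[RangedObject], gap_m: float = 30.0) -> str:
    if target is None:
        return "cruise"
    if target.distance_m < gap_m:
        return "brake"        # automatic brake control (follow-up stop)
    return "accelerate"       # automatic acceleration control (follow-up start)
```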
  • based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles.
  • for example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see.
  • the microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can output an alarm to the driver via the audio speaker 12061 or the display unit 12062, or perform forced deceleration or avoidance steering via the drive system control unit 12010, thereby providing driving assistance for collision avoidance.
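One common way to score such a collision risk is a time-to-collision (TTC) check. The sketch below, with made-up thresholds and placeholder action names, illustrates the warn-then-decelerate flow described above; it is not the patent's actual risk formula.

```python
# Hedged TTC-style collision-risk sketch; all numbers are illustrative.
def collision_risk(distance_m: float, rel_speed_mps: float) -> float:
    """Risk in [0, 1]; rel_speed < 0 means the gap is shrinking."""
    if rel_speed_mps >= 0:
        return 0.0                            # not closing, no risk
    ttc = distance_m / -rel_speed_mps         # seconds until contact
    return max(0.0, min(1.0, 2.0 / ttc))      # TTC <= 2 s saturates to 1.0

def assist(distance_m: float, rel_speed_mps: float, risk_setpoint: float = 0.7):
    if collision_risk(distance_m, rel_speed_mps) >= risk_setpoint:
        # Placeholder action names for the alarm and forced deceleration.
        return ["warn_driver_via_12061_12062", "forced_deceleration_via_12010"]
    return []
```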
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured images of the imaging units 12101 to 12104. Such pedestrian recognition is carried out, for example, by a procedure of extracting feature points in the captured images of the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on the series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian.
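The patent does not specify the matcher, so as a stand-in for the feature-extraction and pattern-matching procedure, the sketch below uses OpenCV's stock HOG + linear-SVM people detector on a single-channel frame. The function name and the score cutoff are assumptions.

```python
# Classical pedestrian detection as a stand-in for the two-step procedure.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def find_pedestrians(frame_gray, min_score=0.5):
    """Return (x, y, w, h) boxes for likely pedestrians in an 8-bit frame."""
    rects, weights = hog.detectMultiScale(frame_gray, winStride=(8, 8))
    return [tuple(r) for r, w in zip(rects, weights) if float(w) >= min_score]
```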
  • when the microcomputer 12051 determines that a pedestrian is present in the captured images of the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is displayed superimposed on the recognized pedestrian.
  • the audio image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
  • among the configurations described above, the technology according to the present disclosure can be applied to the imaging unit 12031.
  • the imaging element module 22 in FIG. 3 can be applied to the imaging unit 12031.
  • the present disclosure can also take the following configurations.
  • <1> A solid-state imaging device in which two imaging devices for capturing images are bonded together on the back sides of their respective effective imaging surfaces, and a mirror that causes light from the imaging direction to enter each of the effective imaging surfaces of the two imaging devices is provided.
  • <3> The solid-state imaging device according to <1> or <2>, wherein a short side of the effective imaging surface is disposed on the light incident side.
  • <4> The solid-state imaging device according to any one of <1> to <3>, wherein the mirror is a prism mirror.
  • <5> The solid-state imaging device according to any one of <1> to <4>, further comprising wiring for inputting and outputting signals to and from the imaging devices.
  • <6> The solid-state imaging device according to any one of <1> to <5>, wherein the two imaging devices are bonded together with an adhesive.
  • <7> The solid-state imaging device according to any one of <1> to <6>, wherein the two imaging devices are bonded together by plasma bonding.
  • <8> The solid-state imaging device according to any one of <1> to <7>, wherein the mirror is a prism mirror and the two imaging devices are formed on both sides of one wafer.
  • <9> The solid-state imaging device according to any one of <1> to <8>, wherein the two imaging devices both have the same spectral sensitivity.
  • <10> The solid-state imaging device according to any one of <1> to <9>, wherein the two imaging devices have different spectral sensitivities.
  • <11> The solid-state imaging device according to any one of <1> to <10>, wherein the reading orders of the two imaging devices are point-symmetric with each other.
  • <12> The solid-state imaging device according to any one of <1> to <11>, wherein the reading orders of the two imaging devices are the same, and the device further includes a conversion unit that temporarily stores the output signals of the two imaging devices and performs signal processing that converts the image of one output signal into the same image orientation as the image of the other output signal (see the sketch after this list).
  • <13> The solid-state imaging device according to <12>, wherein each of the two imaging devices includes the conversion unit.
  • <14> The solid-state imaging device according to <12>, wherein the conversion unit is provided in a logic chip sandwiched between the two imaging devices.
  • An endoscope apparatus in which two imaging devices for capturing images are bonded together on the back sides of their respective effective imaging surfaces, and a mirror that causes light from the imaging direction to enter is bonded to each of the effective imaging surfaces of the two imaging devices.
  • An electronic apparatus in which two imaging devices for capturing images are bonded together on the back sides of their respective effective imaging surfaces, and a mirror that causes light from the imaging direction to enter is bonded to each of the effective imaging surfaces of the two imaging devices.
  • An imaging apparatus in which two imaging devices for capturing images are bonded together on the back sides of their respective effective imaging surfaces, and a mirror that causes light from the imaging direction to enter is bonded to each of the effective imaging surfaces of the two imaging devices.
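For configuration <12> above: when two back-to-back sensors are read out in the same order, one image emerges rotated 180 degrees relative to the other, so the conversion unit must buffer one output and point-reflect it. A minimal NumPy sketch of that buffering-and-flip step follows; the function name is assumed, and the real conversion unit would of course operate on raw sensor signals rather than NumPy arrays.

```python
# Buffer both sensor outputs and rotate one by 180 degrees so that the
# two images share the same orientation, as in configuration <12>.
import numpy as np

def align_second_sensor(raw_a: np.ndarray, raw_b: np.ndarray):
    buffered_a = raw_a.copy()            # temporary storage of output A
    buffered_b = raw_b.copy()            # temporary storage of output B
    aligned_b = buffered_b[::-1, ::-1]   # flip rows and columns = 180-degree turn
    return buffered_a, np.ascontiguousarray(aligned_b)
```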

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Astronomy & Astrophysics (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Endoscopes (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to a solid-state imaging element, an imaging device, an endoscope device, and an electronic instrument that make it possible to obtain a binocular solid-state imaging element that is compact and has high resolution. In this solid-state imaging element, the back surfaces of two imaging elements are bonded to each other, and prism mirrors are provided that guide light incident from the imaging direction onto the effective imaging regions of each imaging element. The present invention can be applied to binocular solid-state imaging elements.
PCT/JP2017/010578 2016-03-30 2017-03-16 Élément de capture d'image à semi-conducteurs, dispositif de capture d'image, dispositif d'endoscope et instrument électronique WO2017169822A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-067332 2016-03-30
JP2016067332 2016-03-30

Publications (1)

Publication Number Publication Date
WO2017169822A1 true WO2017169822A1 (fr) 2017-10-05

Family

ID=59965365

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/010578 WO2017169822A1 (fr) 2016-03-30 2017-03-16 Élément de capture d'image à semi-conducteurs, dispositif de capture d'image, dispositif d'endoscope et instrument électronique

Country Status (1)

Country Link
WO (1) WO2017169822A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001017389A (ja) * 1999-07-07 2001-01-23 Olympus Optical Co Ltd 固体撮像装置
JP2003250071A (ja) * 2002-02-26 2003-09-05 Sony Corp イメージセンサおよび撮像装置
JP2010021283A (ja) * 2008-07-09 2010-01-28 Panasonic Corp 固体撮像装置およびその製造方法
JP2010245506A (ja) * 2009-03-19 2010-10-28 Sony Corp 半導体装置とその製造方法、及び電子機器
JP2011082215A (ja) * 2009-10-02 2011-04-21 Olympus Corp 撮像素子、撮像ユニット

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108888119A (zh) * 2018-09-10 2018-11-27 青岛海尔智能技术研发有限公司 一种自动注水装置及注水控制方法
CN115379102A (zh) * 2022-09-19 2022-11-22 昆山丘钛光电科技有限公司 相机模组及内窥镜

Similar Documents

Publication Publication Date Title
JP7449317B2 (ja) 撮像装置
WO2017163927A1 (fr) Boîtier de taille de puce, procédé de production, appareil électronique et endoscope
WO2017159361A1 (fr) Élément imageur à semi-conducteurs et dispositif électronique
WO2018139278A1 (fr) Élément de capture d'image, procédé de fabrication, et dispositif électronique
US10915009B2 (en) Compound-eye camera module and electronic device
JP7167036B2 (ja) センサチップおよび電子機器
WO2018180569A1 (fr) Dispositif de capture d'image à semi-conducteur et appareil électronique
WO2018159344A1 (fr) Élément d'imagerie à semiconducteur, dispositif électronique, et dispositif à semiconducteur
JP2019047237A (ja) 撮像装置、および電子機器、並びに撮像装置の製造方法
JP7237595B2 (ja) 撮像素子、製造方法、および電子機器
JP6869717B2 (ja) 撮像装置および撮像装置の製造方法、並びに、電子機器
WO2018135261A1 (fr) Élément d'imagerie à semi-conducteurs, dispositif électronique, et procédé de fabrication d'élément d'imagerie à semi-conducteurs
WO2017169889A1 (fr) Module de caméra, procédé de production de module de caméra, dispositif d'imagerie et dispositif électronique
US11785321B2 (en) Imaging device
JP6976751B2 (ja) 撮像装置および撮像装置の製造方法、並びに、電子機器
WO2018131510A1 (fr) Élément de capture d'image à l'état solide et dispositif électronique
WO2017169822A1 (fr) Élément de capture d'image à semi-conducteurs, dispositif de capture d'image, dispositif d'endoscope et instrument électronique
WO2019171879A1 (fr) Dispositif d'imagerie
WO2017150168A1 (fr) Élément d'imagerie et dispositif électronique
WO2020084973A1 (fr) Dispositif de traitement d'image
JP2019040892A (ja) 撮像装置、カメラモジュール、及び、電子機器
US20220269035A1 (en) Solid-state image sensor, imaging apparatus, and method of controlling solid-state image sensor
US11457201B2 (en) Imaging device and electronic apparatus
US20240145506A1 (en) Semiconductor package and electronic device
WO2019239767A1 (fr) Dispositif de capture d'image

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17774366

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17774366

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP