US20150222805A1 - Imaging element and imaging device - Google Patents


Info

Publication number: US20150222805A1
Application number: US14/405,196
Authority: US (United States)
Prior art keywords: pixels, pixel, image, focus detection, along
Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Inventor: Hironobu Murata
Original assignee: Nikon Corp
Current assignee: Nikon Corp (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Assigned to Nikon Corporation (assignor: Murata, Hironobu)
Priority claimed by US15/642,686, published as US10412294B2

Classifications

    • H04N5/23212
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/672 Focus control based on electronic image sensor signals based on the phase difference signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843 Demosaicing, e.g. interpolating colour pixel values
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/703 SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/704 Pixels specially adapted for focusing, e.g. phase difference pixel sets
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32 Means for focusing
    • G03B13/34 Power focusing
    • G03B13/36 Autofocus systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths

Definitions

  • the present invention relates to an image sensor and an image-capturing device.
  • An image-capturing device is known which performs focus detection by a split-pupil phase detection method based upon output signals from a plurality of pixels dedicated to focus detection arranged on a part of an image sensor (see Japanese Laid-Open Patent Publication No. 2011-77770).
  • The focus detection pixels in the related art are arranged at positions separated from one another, and thus the focus detection accuracy is lower than with focus detection pixels arranged at successive positions.
  • In addition, the interpolation processing executed to generate image signals in correspondence to the positions at which the focus detection pixels are arranged becomes complicated.
  • an image sensor comprises: a plurality of light receiving units arranged along a first direction and along a second direction different from the first direction in a two-dimensional array; and a light receiving unit arranged at a central position among four light receiving units set next to each other along the first direction and the second direction among the plurality of light receiving units, which includes a light shielding member arranged over part thereof.
  • the light receiving unit having the light shielding member arranged over part thereof is a light receiving unit used for purposes of focus detection.
  • an image sensor comprises: a pixel group including a plurality of pixels that receive light fluxes having passed through an image-capturing optical system, wherein: the pixel group includes first pixels, second pixels and third pixels respectively assuming first spectral sensitivity, second spectral sensitivity and third spectral sensitivity different from one another; a plurality of first pixel rows, each including a plurality of first pixels arranged one after another along a first direction, and a plurality of second pixel rows, each including a plurality of second pixels and a plurality of third pixels arranged at alternate positions along the first direction, are arranged at alternate positions along a second direction running perpendicular to the first direction; the first pixel rows are offset relative to the second pixel rows along the first direction by an extent approximately equal to a half pitch; a plurality of focus detection pixels assuming the first spectral sensitivity are arranged in place of at least some of the first pixels in at least some of the plurality of first pixel rows; and the plurality of focus
  • an image sensor comprises: a pixel group including a plurality of pixels that receive light fluxes having passed through an image-capturing optical system, wherein: the pixel group includes first pixels, second pixels and third pixels respectively assuming first spectral sensitivity, second spectral sensitivity and third spectral sensitivity different from one another; a plurality of pixel row sets, each containing a first pixel row including a plurality of first pixels arranged one after another along a second direction, a second pixel row arranged immediately rightward relative to the first pixel row and including a plurality of second pixels arranged one after another along the second direction, a third pixel row arranged immediately rightward relative to the second pixel row and including a plurality of first pixels arranged one after another along the second direction and a fourth pixel row arranged immediately rightward relative to the third pixel row and including a plurality of third pixels arranged one after another along the second direction, are set one after another along a first direction running perpendicular to the second direction
  • the first pixels each include a green color filter
  • the second pixels each include a blue color filter
  • the third pixels each include a red color filter.
  • the pixels each assume a polygonal shape, in a plan view, which includes four sides inclining by approximately 45° relative to the first direction.
  • the focus detection pixels are arranged in place of all the first pixels.
  • an image-capturing device comprises: an image sensor according to any one of the third through seventh aspects; an image generation unit that generates image signals based upon output signals provided from the image sensor; and a focus detection unit that detects a focusing condition at the image-capturing optical system based upon output signals provided from the image sensor.
  • an image-capturing device comprises: an image sensor according to the fifth aspect; an adding unit that adds together output signals from two first pixels arranged next to each other along the first direction, adds together output signals from two second pixels arranged next to each other along the second direction and adds together output signals from two third pixels arranged next to each other along the second direction so as to output Bayer array signals; and an image generation unit that generates image signals corresponding to images formed with light fluxes having passed through the image-capturing optical system based upon the Bayer array signals output by the adding unit.
  • the focus detection pixels at the image sensor include a plurality of first focus detection pixels that receive one of the pair of light fluxes and a plurality of second focus detection pixels that receive the other of the pair of light fluxes, and the first focus detection pixels and the second focus detection pixels are arranged at alternate positions;
  • the adding unit outputs multiplication result signals each obtained by adding together output signals from a first focus detection pixel and a second focus detection pixel set next to each other along the first direction and multiplying a signal resulting from addition by a predetermined factor;
  • the image generation unit generates the image signals based upon the Bayer array signals output from the adding unit and the multiplication result signals.
  • FIG. 1 illustrates an example of a structure that may be adopted in a digital camera achieved in an embodiment of the present invention
  • FIGS. 2(A) and 2(B) illustrate unit pixels in the image sensor
  • FIGS. 3(A) and 3(B) illustrate unit pixels in the image sensor
  • FIGS. 4(A) and 4(B) illustrate micro-lenses
  • FIG. 5 is a plan view presenting an example of a positional arrangement with which pixels may be arranged in the image sensor
  • FIG. 6 illustrates how focus detection is executed through the split-pupil phase detection method
  • FIGS. 7(A) and 7(B) illustrate how G image signals are generated
  • FIGS. 8(A) and 8(B) illustrate how B image signals and R image signals are generated
  • FIG. 9 illustrates image signals output in a Bayer array
  • FIGS. 10(A) and 10(B) are views presenting examples of positional arrangements with which the focus detection pixels may be arranged in Variation 1.
  • FIG. 1 illustrates an example of a structure that may be adopted in a digital camera according to an embodiment of the present invention.
  • the digital camera 1 comprises an interchangeable lens 2 and a camera body 3 .
  • the interchangeable lens 2 is mounted at the camera body 3 via a mount unit 4 .
  • the interchangeable lens 2 includes a lens control unit 5 , a main lens 9 , a zooming lens 8 , a focusing lens 7 and an aperture 6 .
  • The lens control unit 5, which includes a microcomputer, a memory and the like, executes drive control for the focusing lens 7 and the aperture 6, detects the opening condition at the aperture 6, detects the positions of the zooming lens 8 and the focusing lens 7, transmits lens information to a body control unit 14 located in the camera body 3, as will be described later, receives camera information from the body control unit 14, and the like.
  • An image sensor 12 , an image sensor drive control unit 19 , the body control unit 14 , a liquid crystal display element drive circuit 15 , a liquid crystal display element 16 , an eyepiece lens 17 , operation members 18 and the like are arranged at the camera body 3 .
  • a detachable memory card 20 is loaded in the camera body 3 .
  • The image sensor 12, which may be a CCD image sensor or a CMOS image sensor, is arranged at a predetermined image-forming plane of the interchangeable lens 2 and captures a subject image formed via the interchangeable lens 2.
  • the body control unit 14 includes a microcomputer, a memory and the like.
  • the body control unit 14 executes overall operational control of the digital camera.
  • the body control unit 14 and the lens control unit 5 are configured so as to be able to communicate with each other via an electrical contact point 13 at the mount unit 4 .
  • the image sensor drive control unit 19 generates a control signal, which is required at the image sensor 12 , in response to an instruction issued by the body control unit 14 .
  • the liquid crystal display element drive circuit 15 drives the liquid crystal display element 16 configuring a liquid crystal viewfinder (EVF: electronic viewfinder) in response to an instruction issued by the body control unit 14 .
  • the photographer views an image displayed at the liquid crystal display element 16 via the eyepiece lens 17 .
  • the memory card 20 is a storage medium where image data and the like are stored.
  • the subject image formed on the image sensor 12 via the interchangeable lens 2 undergoes photoelectric conversion at the image sensor 12 .
  • the timing with which photoelectric conversion signals are stored and read out (the frame rate) at the image sensor 12 is controlled based upon the control signal provided by the image sensor drive control unit 19 .
  • Signals output from the image sensor 12 are converted to digital data at an A/D conversion unit (not illustrated) and the digital data resulting from the conversion are provided to the body control unit 14 .
  • the body control unit 14 calculates a defocus quantity based upon output signals, which correspond to a specific focus detection area, provided from the image sensor, and transmits the defocus quantity to the lens control unit 5 .
  • Based upon the defocus quantity received from the body control unit 14, the lens control unit 5 calculates a focusing lens drive quantity and, based upon the lens drive quantity, drives the focusing lens 7 via a motor (not illustrated) or the like so as to move the focusing lens 7 to the focus match position.
  • The body control unit 14 generates, based upon signals output from the image sensor 12 in response to a photographing instruction, image data to be recorded.
  • the body control unit 14 stores the image data thus generated into the memory card 20 and also provides the image data to the liquid crystal display element drive circuit 15 so as to display a reproduced image corresponding to the image data at the liquid crystal display element 16 .
  • The operation members 18, including a shutter release button, a focus detection area setting member and the like, are also located at the camera body 3.
  • the body control unit 14 detects operation signals output from the operation members 18 and controls operations (photographing processing, focus detection area setting and the like) corresponding to the detection results.
  • Each unit pixel 20 assumes a substantially square shape of a predetermined size and includes a rectangular photoelectric conversion unit (PD) 30.
  • Each unit pixel 20 is arranged with a specific orientation achieved by rotating it so that its sides incline at an approximately 45° angle relative to the X direction and the Y direction. It is to be noted that the photoelectric conversion unit 30 itself is not rotated; instead, its aspect ratio is altered.
  • In addition, in order to ensure that the area of this photoelectric conversion unit 30 remains the same as the area of the photoelectric conversion unit 30 illustrated in FIG. 2(A), the four corners of the photoelectric conversion unit 30 are rounded off in correspondence to the shape of the unit pixel 20.
  • a transistor (not illustrated) is arranged in the unit pixel 20 .
  • Two different types of unit pixels 20, i.e., pixels used for the purpose of focus detection (hereafter referred to as focus detection pixels) and pixels other than focus detection pixels (hereafter referred to as image-capturing pixels), are arranged at the image sensor 12.
  • Half of the photoelectric conversion unit 30 at a focus detection pixel is shielded or covered with, for instance, a light shielding metal film 40 or the like and thus, only the other half of the photoelectric conversion unit 30 is left unshielded or open, as illustrated in FIGS. 3(A) and 3(B) .
  • Unit pixels 20 each having the right half of the photoelectric conversion unit 30 left unshielded (hereafter referred to as right-opening focus detection pixels), as illustrated in FIG. 3(A), and unit pixels 20 each having the left half of the photoelectric conversion unit 30 left unshielded (hereafter referred to as left-opening focus detection pixels), as illustrated in FIG. 3(B), are provided as focus detection pixels.
  • the unit pixels 20 undergo a wiring process and a flattening process of the known art, and then color filter layers are arranged at the unit pixels 20 .
  • a color filter layer, through which only the red color component of the light is transmitted, is arranged at an image-capturing pixel (R pixel) that is to receive the red color component light.
  • a color filter layer, through which only the green color component of the light is transmitted, is arranged at an image-capturing pixel (G pixel) that is to receive the green color component light.
  • a color filter layer, through which only the blue color component of the light is transmitted, is arranged at an image-capturing pixel (B pixel) that is to receive the blue color component light.
  • a color filter layer through which only the green color component of light is transmitted is arranged at a focus detection pixel. Namely, the focus detection pixels receive the green color component light alone.
  • the unit pixels 20 undergo an on-chip lens formation process so as to form micro-lenses at the unit pixels 20 .
  • the micro-lenses arranged at the unit pixels 20 in the embodiment assume a shape achieved by cutting a round spherical lens on four sides, as illustrated in FIG. 4(A) . Since micro-lenses 50 assuming such a shape can be arranged in a denser array, as illustrated in FIG. 4(B) , the lens openings can be widened.
  • FIG. 5 illustrates only part of the image sensor 12 .
  • The pixels each having the left half or the right half thereof shaded in FIG. 5 are focus detection pixels.
  • a pixel with the left half thereof shaded is a right-opening focus detection pixel, whereas a pixel with the right half thereof shaded is a left-opening focus detection pixel.
  • Although FIG. 5 illustrates the micro-lenses 50 as round members in order to simplify the illustration, the actual micro-lenses 50 assume the shape illustrated in FIG. 4(B).
  • the image sensor 12 includes first pixel rows 60 , each made up with a plurality of G pixels arranged along the X direction with a predetermined pitch, and second pixel rows 70 , each made up with a plurality of B pixels and a plurality of R pixels, arranged at alternate positions along the X direction with the predetermined pitch.
  • the plurality of first pixel rows 60 and the plurality of second pixel rows 70 are arranged at alternate positions along the Y direction with the predetermined pitch.
  • a first pixel row 60 and an adjacent second pixel row 70 are arranged with an offset relative to each other along the X direction by an extent equal to half the predetermined pitch.
  • The unit pixels 20 are each arranged so that all of their sides incline at approximately 45° relative to the X direction and the Y direction; thus, by arranging adjacent pixel rows with an offset equivalent to half the predetermined pitch, the plurality of unit pixels 20 can be arranged in a dense array.
  • the image sensor 12 includes third pixel rows 80 , each made up with a plurality of G pixels arranged along the Y direction with a predetermined pitch, fourth pixel rows 90 , each made up with a plurality of B pixels arranged along the Y direction with the predetermined pitch, fifth pixel rows 100 , each made up with a plurality of G pixels arranged along the Y direction with the predetermined pitch, and sixth pixel rows 110 , each made up with a plurality of R pixels arranged along the Y direction with the predetermined pitch.
  • a fourth pixel row 90 is arranged directly rightward relative to a third pixel row 80
  • a fifth pixel row 100 is arranged directly rightward relative to the fourth pixel row 90
  • a sixth pixel row 110 is arranged directly rightward relative to the fifth pixel row 100 .
  • the third through sixth pixel rows 80 through 110 are each arranged with an offset relative to the adjacent pixel rows along the Y direction by an extent equivalent to half the predetermined pitch.
  • a plurality of sets of pixel rows, each formed with a third pixel row 80 through a sixth pixel row 110 are arranged side-by-side along the X direction.
  • a plurality of right-opening focus detection pixels and left-opening focus detection pixels are arranged in place of some of the G pixels.
  • the plurality of right-opening focus detection pixels and left-opening focus detection pixels are arranged successively at alternate positions along the X direction within a specific focus detection area.
  • In a focus match state, a pair of images formed with the light fluxes from the different pupil positions of the split pupil described above match each other.
  • A signal waveform (i.e., a signal string a 1 , a 2 , a 3 , a 4 . . . ) is obtained through the right-opening focus detection pixels, and a signal waveform (i.e., a signal string b 1 , b 2 , b 3 , b 4 . . . ) is obtained through the left-opening focus detection pixels.
  • When the focusing condition deviates from the focus match state, the pair of images formed with light fluxes resulting from the pupil splitting do not match each other on the image sensor 12.
  • The positional relationship (the image shift direction and the image shift quantity) between the signal waveform (the signal string a 1 , a 2 , a 3 , a 4 , . . . ) obtained through the right-opening focus detection pixels and the signal waveform (the signal string b 1 , b 2 , b 3 , b 4 , . . . ) obtained through the left-opening focus detection pixels is affected by the extent of deviation relative to the focus match state (i.e., the defocus quantity).
  • the body control unit 14 calculates the focusing condition (defocus quantity) at the interchangeable lens 2 based upon the positional relationship between the signal waveform (the signal string a 1 , a 2 , a 3 , a 4 , . . . ) obtained through the right-opening focus detection pixels and the signal waveform (the signal string b 1 , b 2 , b 3 , b 4 , . . . ) obtained through the left-opening focus detection pixels, and transmits the calculation results to the lens control unit 5 where they are used as camera information. Based upon the camera information, the lens control unit 5 drives the focusing lens 7 forward/backward along the optical axis and thus, focus is adjusted so as to form a sharp image on the image sensor 12 .
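  • The image shift detection between the two signal strings can be sketched as follows. This is a minimal illustration, not taken from the patent: it assumes a simple sum-of-absolute-differences search over integer shifts, whereas a practical implementation would add sub-pixel interpolation and reliability checks. The function name and parameters are illustrative.

```python
# Hypothetical sketch: estimate the shift between the right-opening
# signal string (a1, a2, a3, a4, ...) and the left-opening signal
# string (b1, b2, b3, b4, ...) by minimizing the mean absolute
# difference over the overlapping portion at each candidate shift.
def estimate_image_shift(a, b, max_shift=4):
    """Return the integer shift k (in pixels) that best aligns b to a."""
    best_shift, best_score = 0, float("inf")
    n = len(a)
    for k in range(-max_shift, max_shift + 1):
        # Compare only the indices where both strings are defined.
        lo, hi = max(0, -k), min(n, n - k)
        score = sum(abs(a[i] - b[i + k]) for i in range(lo, hi)) / (hi - lo)
        if score < best_score:
            best_shift, best_score = k, score
    return best_shift
```

  • The sign and magnitude of the returned shift would then be mapped to a defocus quantity through the optical geometry of the split pupil, a step that depends on lens data and is omitted here.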
  • FIGS. 7(A) and 7(B) illustrate how G image signals are generated. Double-headed arrows in FIG. 7(A) each indicate a set of G pixels, the signals output from which are added together.
  • FIG. 7(B) illustrates G image signals obtained as a result of the addition operation.
  • The body control unit 14, executing the image signal generation processing, generates G image signals each by adding together the output signals from two G pixels set next to each other along the X direction in each first pixel row 60.
  • the body control unit 14 adds the output signal from a G pixel located at an mth position along the X direction and an (n+1)th position along the Y direction (hereafter notated as a G(m, n+1) pixel) and the output signal from a G(m+2, n+1) pixel in the first pixel row 60 taking up the (n+1)th position along the Y direction, so as to generate a G image signal corresponding to a G(m+1, n+1) pixel.
  • the body control unit 14 adds the output signal from a G(m+4, n+1) pixel and the output signal from a G(m+6, n+1) pixel so as to generate a G image signal corresponding to a G(m+5, n+1) pixel. In this manner it generates G image signals corresponding to a G(m+9, n+1) pixel, a G(m+13, n+1) pixel, a G(m+17, n+1) pixel and so forth in the first pixel row 60 taking up the (n+1)th position along the Y direction.
  • the body control unit 14 adds the output signal from a G(m+2, n+3) pixel and the output signal from a G(m+4, n+3) pixel in the first pixel row 60 taking up the (n+3)th position along the Y direction, so as to generate a G image signal corresponding to a G(m+3, n+3) pixel.
  • the body control unit 14 adds the output signal from a G(m+6, n+3) pixel and a G(m+8, n+3) pixel so as to generate a G image signal corresponding to a G(m+7, n+3) pixel.
  • The body control unit 14 generates G image signals in the first pixel rows 60 at the (n+5)th position, the (n+9)th position, the (n+13)th position and so forth along the Y direction in much the same way as that in which it generates G image signals in the first pixel rows 60 at the (n+1)th position along the Y direction, as described above, and generates G image signals in the first pixel rows 60 at the (n+7)th position, the (n+11)th position and the (n+15)th position along the Y direction in much the same way as that in which it generates G image signals in the first pixel rows 60 at the (n+3)th position along the Y direction.
  • the sets of G pixels, the signals output from which are added together in a given first pixel row 60 , and the sets of G pixels, the output signals from which are added together in the first pixel row 60 arranged at the position next to the given first pixel row along the Y direction are offset relative to each other along the X direction by an extent equivalent to a single G pixel.
  • G image signals corresponding to a G pixel array that includes a plurality of G pixel rows, each formed with a plurality of G pixels arranged along the X direction with a four-pixel pitch, set side-by-side along the Y direction, with each pair of successive G pixel rows next to each other along the Y direction offset relative to each other along the X direction by an extent equivalent to a two-pixel pitch, are generated as illustrated in FIG. 7(B).
  • a G(m+6, n+7) pixel and a G(m+10, n+7) pixel in FIG. 7(A) are right-opening focus detection pixels
  • a G(m+8, n+7) pixel and a G(m+12, n+7) pixel in FIG. 7(A) are left-opening focus detection pixels.
  • the body control unit 14 generates G image signals in conjunction with the focus detection pixels through a similar process, each by adding together the output signal from the right-opening focus detection pixel and the output signal from the left-opening focus detection pixel next to the right-opening focus detection pixel along the X direction.
  • the body control unit 14 generates a G image signal corresponding to a G(m+7, n+7) pixel by adding together the output signal from the G(m+6, n+7) pixel and the output signal from the G(m+8, n+7) pixel, and generates a G image signal corresponding to a G(m+11, n+7) pixel by adding together the output signal from the G(m+10, n+7) pixel and the output signal from the G(m+12, n+7) pixel. It is to be noted that the G(m+7, n+7) pixel and the G(m+11, n+7) pixel are shaded in FIG. 7(B) .
  • The G image signals corresponding to the G(m+7, n+7) pixel and the G(m+11, n+7) pixel are each generated by adding together the output signals from a right-opening focus detection pixel and the corresponding left-opening focus detection pixel, each having half of its photoelectric conversion unit shielded; for this reason, these G image signals achieve an output level half that of a signal generated by adding together the output signals from two image-capturing pixels. Accordingly, a G image signal with an output level equivalent to that of a G image signal generated by adding together the output signals from two image-capturing pixels can be obtained by multiplying the signal resulting from the addition of the output signal from a right-opening focus detection pixel and the output signal from the corresponding left-opening focus detection pixel by 2.
  • the body control unit 14 generates G image signals each by adding together the output signals from a pair of focus detection pixels arranged at successive positions along the X direction, in much the same way as adding together the output signals from two image-capturing pixels arranged next to each other, and then multiplying the signal representing the sum by 2.
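  • The G image signal generation described above can be sketched for a single first pixel row 60 as follows, assuming the row's G pixel outputs are held in a list indexed along the X direction. The function name, index convention, and the `focus_indices` parameter are illustrative, not from the patent.

```python
def g_image_signals(row, start, focus_indices=frozenset()):
    """Sum G pixel pairs (start, start+2), (start+4, start+6), ... in a row.

    `focus_indices` marks the first pixel of each pair that is a focus
    detection pair (one right-opening, one left-opening); such a pair has
    half of each photodiode shielded, so its sum is multiplied by 2 to
    match the output level of two unshielded image-capturing pixels.
    """
    signals = []
    for i in range(start, len(row) - 2, 4):
        s = row[i] + row[i + 2]
        if i in focus_indices:
            s *= 2  # compensate for the half-shielded focus detection pair
        signals.append(s)
    return signals
```

  • Under this convention, a row at the (n+1)th position would use start = m while a row at the (n+3)th position would use start = m+2, reproducing the single-G-pixel offset between the pairing patterns of adjacent rows.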
  • FIGS. 8(A) and 8(B) illustrate how B image signals and R image signals are generated. Double-headed arrows in FIG. 8(A) each indicate a set of B pixels or R pixels, the signals output from which are added together.
  • FIG. 8(B) illustrates B image signals and R image signals obtained as a result of the addition operation.
  • The body control unit 14, executing the image signal generation processing, generates B image signals each by adding together the output signals from two B pixels set next to each other along the Y direction in a fourth pixel row 90.
  • the body control unit 14 generates R image signals by adding together the output signals from two R pixels set next to each other along the Y direction in a sixth pixel row 110 .
  • the body control unit 14 adds the output signal from a B pixel located at an (m+1)th position along the X direction and an (n+2)th position along the Y direction (hereafter notated as a B(m+1, n+2) pixel) and the output signal from a B(m+1, n+4) pixel in the fourth pixel row 90 taking up the (m+1)th position along the X direction, so as to generate a B image signal corresponding to a B(m+1, n+3) pixel.
  • The body control unit 14 adds the output signal from a B(m+1, n+6) pixel and the output signal from a B(m+1, n+8) pixel so as to generate a B image signal corresponding to a B(m+1, n+7) pixel. In this manner it generates B image signals corresponding to a B(m+1, n+11) pixel, a B(m+1, n+15) pixel, a B(m+1, n+19) pixel and so forth in the fourth pixel row 90 taking up the (m+1)th position along the X direction.
  • the body control unit 14 generates B image signals in the fourth pixel rows 90 at the (m+5)th position, the (m+9)th position, the (m+13)th position, and so forth along the X direction, in much the same way as that in which it generates B image signals in the fourth pixel row 90 at the (m+1)th position along the X direction, as described earlier.
  • the body control unit 14 adds the output signal from an R pixel located at an (m+3)th position along the X direction and an nth position along the Y direction (hereafter notated as an R(m+3, n) pixel) and the output signal from an R(m+3, n+2) pixel in the sixth pixel row 110 taking up the (m+3)th position along the X direction, so as to generate an R image signal corresponding to an R(m+3, n+1) pixel.
  • the body control unit 14 adds the output signal from an R(m+3, n+4) pixel and the output signal from an R(m+3, n+6) pixel so as to generate an R image signal corresponding to an R(m+3, n+5) pixel.
  • In this manner, the body control unit 14 generates R image signals corresponding to an R(m+3, n+9) pixel, an R(m+3, n+13) pixel, an R(m+3, n+17) pixel and so forth in the sixth pixel row 110 taking up the (m+3)th position along the X direction.
  • the body control unit 14 generates R image signals in the sixth pixel rows 110 at the (m+7)th position, the (m+11)th position, the (m+15)th position, and so forth along the X direction, in much the same way as that in which it generates R image signals in the sixth pixel row 110 at the (m+3)th position along the X direction as described earlier.
  • the body control unit 14 offsets the sets of B pixels in the fourth pixel rows 90 , the output signals from which are added together, and the sets of R pixels in the sixth pixel rows 110 , the output signals from which are added together, relative to each other along the Y direction by an extent equivalent to a single B pixel or R pixel.
  • As illustrated in FIG. 8(B) , B image signals and R image signals are thus generated so as to form a pixel array that includes a plurality of B pixel rows, each formed with a plurality of B pixels arranged along the X direction with a four-pixel pitch, and a plurality of R pixel rows, each formed with a plurality of R pixels arranged along the X direction with the four-pixel pitch, set at alternate positions along the Y direction, with each pair of a B pixel row and an R pixel row formed next to each other along the Y direction offset relative to each other along the X direction by an extent equivalent to a two-pixel pitch.
  • the sets of B pixels and the sets of R pixels, the output signals from which are to be added together, are set by ensuring that the positions of the pixels corresponding to the B image signals and the R image signals generated as a result of the addition operation do not overlap the positions of the pixels corresponding to the G image signals generated as a result of adding together the output signals from the G pixels in the individual G pixel sets, as described earlier.
  • a B image signal is generated for the B(m+1, n+3) pixel by adding together the output signal from the B(m+1, n+2) pixel and the output signal from the B(m+1, n+4) pixel.
  • the body control unit 14 combines the G image signals, the B image signals and the R image signals generated as described above and, as a result, image signals in a square Bayer array, such as that illustrated in FIG. 9 , are obtained. It is to be noted that the number of pixels corresponding to the image signals thus obtained is half the number of pixels arranged at the image sensor 12 .
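The layout of the combined signals can be sketched as follows; the helper below merely reproduces the checkerboard structure of FIG. 9 (the choice of which diagonal carries B and which carries R is an assumption made for illustration).

```python
def bayer_pattern(rows, cols):
    """Color layout of the square Bayer array obtained by combining the
    G, B and R image signals: G on one checkerboard of sites, with B
    and R alternating row by row on the other checkerboard."""
    pattern = []
    for y in range(rows):
        row = []
        for x in range(cols):
            if (x + y) % 2 == 0:
                row.append('G')   # G image signals on one checkerboard
            elif y % 2 == 0:
                row.append('B')   # B image signals in even rows
            else:
                row.append('R')   # R image signals in odd rows
        pattern.append(row)
    return pattern

# A 4x4 patch of the resulting array:
#   G B G B
#   R G R G
#   G B G B
#   R G R G
```

Half of the sites carry G signals, matching the statement that the combined array holds half as many pixels as the image sensor 12.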
  • the body control unit 14 then executes color interpolation processing for the image signals in the Bayer array obtained as described above so as to generate image signals corresponding to the missing color components.
  • through this color interpolation processing, color image signals (RGB signals) are generated.
  • the body control unit 14 generates an image file for purposes of, for instance, recording, by using these color image signals, and records the image file thus generated into the memory card 20 .
  • the image sensor 12 includes a pixel group made up with a plurality of pixels that receive light fluxes having passed through the interchangeable lens 2 .
  • the pixel group includes G pixels, B pixels and R pixels respectively assuming first spectral sensitivity, second spectral sensitivity and third spectral sensitivity different from one another.
  • a plurality of first pixel rows 60 each made up with a plurality of G pixels arranged one after another along the horizontal direction
  • a plurality of second pixel rows 70 each made up with a plurality of B pixels and a plurality of R pixels arranged at alternate positions along the horizontal direction, are arranged at alternate positions along the vertical direction, with the first pixel rows 60 offset relative to the second pixel rows 70 along the horizontal direction by an extent approximately equal to a half pitch.
  • a plurality of focus detection pixels assuming the first spectral sensitivity are arranged in place of at least some G pixels, and the plurality of focus detection pixels are structured so as to receive a pair of light fluxes having passed through a pair of areas of a pupil of the interchangeable lens 2 and output a pair of image signals corresponding to a pair of images formed with the pair of light fluxes.
  • the image sensor 12 includes a pixel group made up with a plurality of pixels that receive light fluxes having passed through the interchangeable lens 2 .
  • the pixel group includes G pixels, B pixels and R pixels respectively assuming first spectral sensitivity, second spectral sensitivity and third spectral sensitivity different from one another.
  • a plurality of pixel row sets 120 each includes a third pixel row 80 made up with a plurality of G pixels arranged one after another along the vertical direction, a fourth pixel row 90 set directly rightward relative to the third pixel row 80 and made up with a plurality of B pixels arranged one after another along the vertical direction, a fifth pixel row 100 set directly rightward relative to the fourth pixel row 90 and made up with a plurality of G pixels arranged one after another along the vertical direction and a sixth pixel row 110 set directly rightward relative to the fifth pixel row 100 and made up with a plurality of R pixels arranged one after another along the vertical direction, are arranged one after another along the horizontal direction.
  • Each pixel row among the third through sixth pixel rows 80 through 110 is arranged with an offset relative to the next pixel row along the vertical direction by an extent approximately equal to a half pitch.
  • a plurality of focus detection pixels assuming the first spectral sensitivity are arranged in place of at least some G pixels, and the plurality of focus detection pixels are structured so as to receive a pair of light fluxes having passed through a pair of areas of the pupil of the interchangeable lens 2 and output a pair of image signals corresponding to a pair of images formed with the pair of light fluxes.
  • the unit pixels 20 in the image sensor 12 described in (1) or (2) above assume a square shape in a plan view, with each side thereof inclining by approximately 45° relative to the X direction and the Y direction.
  • a higher aperture ratio is assured at the unit pixels 20 , which, in turn, makes it possible to improve the focus detection accuracy.
  • the image sensor 12 can be provided as a more compact unit without lowering the aperture ratio.
  • the digital camera 1 includes the image sensor 12 described in (1) or (2) above, a body control unit 14 that generates Bayer array signals by adding together the output signals from two G pixels arranged next to each other along the horizontal direction, adding together the output signals from two B pixels set next to each other along the vertical direction and adding together the output signals from two R pixels set next to each other along the vertical direction, and a body control unit 14 that generates color image signals based upon the Bayer array signals.
  • the existing image processing engine enabling color interpolation processing on Bayer array image data can be used for the color interpolation processing executed therein.
  • the image sensor 12 in the digital camera described in (4) above includes right-opening focus detection pixels 21 that receive a light flux having passed through a first area 201 of the exit pupil 200 of the interchangeable lens 2 and left-opening focus detection pixels 22 that receive a light flux having passed through a second area 202 of the exit pupil 200 of the interchangeable lens 2 .
  • the plurality of right-opening focus detection pixels 21 and the plurality of left-opening focus detection pixels 22 are arranged at alternate positions.
  • the body control unit 14 adds together the output signal from the right-opening focus detection pixel 21 and the output signal from the left-opening focus detection pixel 22 arranged next to the right-opening focus detection pixel 21 along the horizontal direction, multiplies the signal resulting from the addition operation by a predetermined factor (two), and generates color image signals based upon signals each resulting from the multiplication operation and the Bayer array signals.
  • image signals can be generated with ease simply by using the output signals provided from the focus detection pixels.
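The recovery of a G image signal from a focus detection pixel pair, as described in (5) above, can be sketched as follows (a minimal sketch; the function name is an assumption of this example):

```python
def g_signal_from_focus_pixels(right_opening_output, left_opening_output,
                               factor=2):
    """Generate a G image signal from a right-opening / left-opening
    focus detection pixel pair: each pixel receives light from only
    half of the exit pupil, so the sum of the pair is multiplied by a
    predetermined factor (two, in the embodiment) to approximate the
    sum of two ordinary G image-capturing pixels."""
    return (right_opening_output + left_opening_output) * factor
```

For instance, with hypothetical outputs of 3 and 4, the generated G image signal is (3 + 4) × 2 = 14.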
  • focus detection pixels are arranged in place of some of the G pixels in the third pixel rows 80 and the fifth pixel rows 100 .
  • the present invention may be adopted in conjunction with third pixel rows 80 and fifth pixel rows 100 each entirely made up with focus detection pixels, as illustrated in FIGS. 10(A) and 10(B) .
  • the third pixel rows 80 may each be entirely made up with left-opening focus detection pixels and the fifth pixel rows 100 may each be made up entirely with right-opening focus detection pixels, as illustrated in FIG. 10(A) .
  • a left-opening focus detection pixel and a right-opening focus detection pixel are set at alternate positions, in reiteration along the X direction in each first pixel row 60 .
  • left-opening focus detection pixels and right-opening focus detection pixels may be arranged at alternate positions along the Y direction in the third pixel rows 80 and left-opening focus detection pixels and right-opening focus detection pixels may be arranged at alternate positions along the Y direction in the fifth pixel rows 100 as well, as illustrated in FIG. 10(B) .
  • the focus detection pixels in this configuration are arranged so that each first pixel row 60 includes a left-opening focus detection pixel and a right-opening focus detection pixel set at alternate positions, in reiteration along the X direction.
  • focus detection can be executed at any position on the photographic image plane without having to execute focus detection in a limited focus detection area.
  • the image shift quantity is calculated along the X direction via the left-opening focus detection pixels and the right-opening focus detection pixels and thus, the defocus quantity is detected along the X direction.
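The patent does not spell out how the image shift quantity is computed from the pair of image signals; a standard phase-detection approach, assumed here for illustration, searches for the relative displacement that best aligns the two images (minimum sum of absolute differences):

```python
def image_shift(left_signals, right_signals, max_shift=3):
    """Estimate the image shift quantity along the X direction as the
    displacement that minimizes the mean absolute difference between
    the pair of images formed via the left-opening and right-opening
    focus detection pixels (a common correlation method; not taken
    from the patent text)."""
    best_shift, best_sad = 0, float('inf')
    n = len(left_signals)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(left_signals[i], right_signals[i + s])
                 for i in range(n) if 0 <= i + s < n]
        sad = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if sad < best_sad:
            best_shift, best_sad = s, sad
    return best_shift
```

The defocus quantity is then obtained from the detected shift via a conversion coefficient determined by the optical system.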
  • the present invention may be adopted in conjunction with upper-opening focus detection pixels, each having the upper half of the photoelectric conversion unit 30 thereof remaining unshielded, and lower-opening focus detection pixels, each having the lower half of the photoelectric conversion unit 30 thereof remaining unshielded, so as to calculate an image shift quantity along the Y direction and detect the defocus quantity along the Y direction.
  • upper-opening focus detection pixels and lower-opening focus detection pixels are arranged at alternate positions along the Y direction in place of at least some of the G pixels in a third pixel row 80 and a fifth pixel row 100 .
  • at least two focus detection pixel rows are arranged next to each other along the X direction so that the upper-opening focus detection pixels and the lower-opening focus detection pixels are also set at alternate positions along the X direction.
  • the image sensor 12 may include upper-opening focus detection pixels and lower-opening focus detection pixels in addition to left-opening focus detection pixels and right-opening focus detection pixels.
  • the present invention is not limited to this example and it may be adopted in conjunction with focus detection pixels each equipped with two photoelectric conversion units.
  • the two photoelectric conversion units in the focus detection pixel individually receive the pair of light fluxes having passed through the pair of areas at the exit pupil 200 of the interchangeable lens 2 .
  • micro-lenses in the unit pixels 20 in the embodiment described above assume a shape achieved by cutting off a spherical lens on four sides, as illustrated in FIG. 4(A) .
  • the present invention is not limited to this example and it may be adopted in conjunction with spherical micro-lenses.
  • the unit pixels 20 in the embodiment described above take on a substantially square shape.
  • the present invention is not limited to this example and it may instead be adopted in conjunction with unit pixels assuming an octagonal shape that includes four sides inclining by 45° relative to the X direction and the Y direction.
  • the unit pixels 20 in the embodiment described above are arranged with an orientation achieved by rotating the sides thereof by approximately 45° relative to the X direction and the Y direction
  • the present invention is not limited to this example and the unit pixels 20 may each be arranged with an orientation allowing each side thereof to extend parallel to the X direction or the Y direction.
  • the adjacent unit pixels 20 should be offset relative to each other by a half pitch so as to achieve a staggered positional arrangement.
  • the image sensor 12 in the embodiment described above includes primary (RGB) color filters
  • the present invention may be adopted in conjunction with complementary (CMY) color filters.
  • the present invention is adopted in the digital camera 1 with the interchangeable lens 2 mounted at the camera body 3 .
  • the present invention is not limited to this example and it may be adopted in a digital camera having an integrated lens.

Abstract

An image sensor includes: a plurality of light receiving units arranged along a first direction and along a second direction different from the first direction in a two-dimensional array; and a light receiving unit arranged at a central position among four light receiving units set next to each other along the first direction and the second direction among the plurality of light receiving units, which includes a light shielding member arranged over part thereof.

Description

  • This application is the U.S. National Phase of PCT/JP2013/065216, filed May 31, 2013, which claims priority from Japanese Patent Application No. 2012-129207, filed Jun. 6, 2012, the entire disclosure of which is incorporated herein by reference.
  • BACKGROUND
  • The present invention relates to an image sensor and an image-capturing device.
  • An image-capturing device is known which performs focus detection by a split-pupil phase detection method based upon output signals from a plurality of pixels dedicated for focus detection arranged on a part of an image sensor (see Japanese Laid Open Patent Publication No. 2011-77770).
  • SUMMARY
  • The focus detection pixels in the related art are arranged at positions separated from one another, and thus the focus detection accuracy is lower than it would be with focus detection pixels arranged at successive positions. However, there is an issue in that, if the focus detection pixels in the related art were arranged at successive positions, the interpolation processing executed to generate image signals in correspondence to the positions at which the focus detection pixels are arranged would become complicated.
  • According to the first aspect of the present invention, an image sensor comprises: a plurality of light receiving units arranged along a first direction and along a second direction different from the first direction in a two-dimensional array; and a light receiving unit arranged at a central position among four light receiving units set next to each other along the first direction and the second direction among the plurality of light receiving units, which includes a light shielding member arranged over part thereof.
  • According to the second aspect of the present invention, in the image sensor according to the first aspect, it is preferred that the light receiving unit having the light shielding member arranged over part thereof is a light receiving unit used for purposes of focus detection.
  • According to the third aspect of the present invention, an image sensor comprises: a pixel group including a plurality of pixels that receive light fluxes having passed through an image-capturing optical system, wherein: the pixel group includes first pixels, second pixels and third pixels respectively assuming first spectral sensitivity, second spectral sensitivity and third spectral sensitivity different from one another; a plurality of first pixel rows, each including a plurality of first pixels arranged one after another along a first direction, and a plurality of second pixel rows, each including a plurality of second pixels and a plurality of third pixels arranged at alternate positions along the first direction, are arranged at alternate positions along a second direction running perpendicular to the first direction; the first pixel rows are offset relative to the second pixel rows along the first direction by an extent approximately equal to a half pitch; a plurality of focus detection pixels assuming the first spectral sensitivity are arranged in place of at least some of the first pixels in at least some of the plurality of first pixel rows; and the plurality of focus detection pixels receive a pair of light fluxes having passed through a pair of areas of a pupil of the image-capturing optical system and output a pair of image signals corresponding to a pair of images formed with the pair of light fluxes.
  • According to the fourth aspect of the present invention, an image sensor comprises: a pixel group including a plurality of pixels that receive light fluxes having passed through an image-capturing optical system, wherein: the pixel group includes first pixels, second pixels and third pixels respectively assuming first spectral sensitivity, second spectral sensitivity and third spectral sensitivity different from one another; a plurality of pixel row sets, each containing a first pixel row including a plurality of first pixels arranged one after another along a second direction, a second pixel row arranged immediately rightward relative to the first pixel row and including a plurality of second pixels arranged one after another along the second direction, a third pixel row arranged immediately rightward relative to the second pixel row and including a plurality of first pixels arranged one after another along the second direction and a fourth pixel row arranged immediately rightward relative to the third pixel row and including a plurality of third pixels arranged one after another along the second direction, are set one after another along a first direction running perpendicular to the second direction; each pixel row among the first through fourth pixel rows is arranged with an offset relative to an adjacent pixel row by an extent equal to a half pitch along the second direction; a plurality of focus detection pixels assuming the first spectral sensitivity are arranged in place of at least some of the first pixels in at least some of the plurality of first pixel rows and the plurality of third pixel rows; and the plurality of focus detection pixels receive a pair of light fluxes having passed through a pair of areas of a pupil of the image-capturing optical system and output a pair of image signals corresponding to a pair of images formed with the pair of light fluxes.
  • According to the fifth aspect of the present invention, in the image sensor according to the third or fourth aspect, it is preferred that the first pixels each include a green color filter, the second pixels each include a blue color filter and the third pixels each include a red color filter.
  • According to the sixth aspect of the present invention, in the image sensor according to any one of the third through fifth aspects, it is preferred that the pixels each assume a polygonal shape, in a plan view, which includes four sides inclining by approximately 45° relative to the first direction.
  • According to the seventh aspect of the present invention, in the image sensor according to any one of the third through sixth aspects, it is preferred that the focus detection pixels are arranged in place of all the first pixels.
  • According to the eighth aspect of the present invention, an image-capturing device comprises: an image sensor according to any one of the third through seventh aspects; an image generation unit that generates image signals based upon output signals provided from the image sensor; and a focus detection unit that detects a focusing condition at the image-capturing optical system based upon output signals provided from the image sensor.
  • According to the ninth aspect of the present invention, an image-capturing device comprises: an image sensor according to the fifth aspect; an adding unit that adds together output signals from two first pixels arranged next to each other along the first direction, adds together output signals from two second pixels arranged next to each other along the second direction and adds together output signals from two third pixels arranged next to each other along the second direction so as to output Bayer array signals; and an image generation unit that generates image signals corresponding to images formed with light fluxes having passed through the image-capturing optical system based upon the Bayer array signals output by the adding unit.
  • According to the tenth aspect of the present invention, in the image-capturing device according to the ninth aspect, it is preferred that: the focus detection pixels at the image sensor include a plurality of first focus detection pixels that receive one of the pair of light fluxes and a plurality of second focus detection pixels that receive the other of the pair of light fluxes, and the first focus detection pixels and the second focus detection pixels are arranged at alternate positions; the adding unit outputs multiplication result signals each obtained by adding together output signals from a first focus detection pixel and a second focus detection pixel set next to each other along the first direction and multiplying a signal resulting from addition by a predetermined factor; and the image generation unit generates the image signals based upon the Bayer array signals output from the adding unit and the multiplication result signals.
  • According to the present invention, better focus detection accuracy is assured without complicating the interpolation processing executed so as to generate image signals at the positions taken up by the focus detection pixels.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an example of a structure that may be adopted in a digital camera achieved in an embodiment of the present invention;
  • FIGS. 2(A) and 2(B) illustrate unit pixels in the image sensor;
  • FIGS. 3(A) and 3(B) illustrate unit pixels in the image sensor;
  • FIGS. 4(A) and 4(B) illustrate micro-lenses;
  • FIG. 5 is a plan view presenting an example of a positional arrangement with which pixels may be arranged in the image sensor;
  • FIG. 6 illustrates how focus detection is executed through the split-pupil phase detection method;
  • FIGS. 7(A) and 7(B) illustrate how G image signals are generated;
  • FIGS. 8(A) and 8(B) illustrate how B image signals and R image signals are generated;
  • FIG. 9 illustrates image signals output in a Bayer array; and
  • FIGS. 10(A) and 10(B) are views presenting examples of positional arrangements with which the focus detection pixels may be arranged in Variation 1.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • The following is a description of an embodiment of the present invention, given in reference to drawings. An X direction, a Y direction and a Z direction in the following description and the drawings respectively run along the horizontal direction, the vertical direction and the front/rear direction. FIG. 1 illustrates an example of a structure that may be adopted in a digital camera according to an embodiment of the present invention. The digital camera 1 comprises an interchangeable lens 2 and a camera body 3. The interchangeable lens 2 is mounted at the camera body 3 via a mount unit 4.
  • The interchangeable lens 2 includes a lens control unit 5, a main lens 9, a zooming lens 8, a focusing lens 7 and an aperture 6. The lens control unit 5, which includes a microcomputer, a memory and the like, executes drive control for the focusing lens 7 and the aperture 6, detects the opening condition at the aperture 6, detects the positions of the zooming lens 8 and the focusing lens 7, transmits lens information to a body control unit 14 located in the camera body 3, as will be described later, receives camera information from the body control unit 14, and the like.
  • An image sensor 12, an image sensor drive control unit 19, the body control unit 14, a liquid crystal display element drive circuit 15, a liquid crystal display element 16, an eyepiece lens 17, operation members 18 and the like are arranged at the camera body 3. A detachable memory card 20 is loaded in the camera body 3. The image sensor 12, which may be a CCD image sensor or a CMOS image sensor, is arranged at a predetermined image-forming plane of the interchangeable lens 2 and captures a subject image formed via the interchangeable lens 2.
  • The body control unit 14 includes a microcomputer, a memory and the like. The body control unit 14 executes overall operational control of the digital camera. The body control unit 14 and the lens control unit 5 are configured so as to be able to communicate with each other via an electrical contact point 13 at the mount unit 4.
  • The image sensor drive control unit 19 generates a control signal, which is required at the image sensor 12, in response to an instruction issued by the body control unit 14. The liquid crystal display element drive circuit 15 drives the liquid crystal display element 16 configuring a liquid crystal viewfinder (EVF: electronic viewfinder) in response to an instruction issued by the body control unit 14. The photographer views an image displayed at the liquid crystal display element 16 via the eyepiece lens 17. The memory card 20 is a storage medium where image data and the like are stored.
  • The subject image formed on the image sensor 12 via the interchangeable lens 2 undergoes photoelectric conversion at the image sensor 12. The timing with which photoelectric conversion signals are stored and read out (the frame rate) at the image sensor 12 is controlled based upon the control signal provided by the image sensor drive control unit 19. Signals output from the image sensor 12 are converted to digital data at an A/D conversion unit (not illustrated) and the digital data resulting from the conversion are provided to the body control unit 14.
  • The body control unit 14 calculates a defocus quantity based upon output signals, which correspond to a specific focus detection area, provided from the image sensor, and transmits the defocus quantity to the lens control unit 5. Based upon the defocus quantity received from the body control unit 14, the lens control unit 5 calculates a focusing lens drive quantity, and drives the focusing lens 7 via a motor (not illustrated) or the like based upon the lens drive quantity so as to move the focusing lens 7 to the focus match position.
  • In addition, the body control unit 14 generates, based upon signals output from the image sensor 12 in response to a photographing instruction, image data to be recorded. The body control unit 14 stores the image data thus generated into the memory card 20 and also provides the image data to the liquid crystal display element drive circuit 15 so as to display a reproduced image corresponding to the image data at the liquid crystal display element 16.
  • It is to be noted that the operation members 18, including a shutter release button, a focus detection area setting member and the like, are also located at the camera body 3. The body control unit 14 detects operation signals output from the operation members 18 and controls operations (photographing processing, focus detection area setting and the like) corresponding to the detection results.
  • (Description of the Image Sensor)
  • The following explanation will focus on the image sensor 12. First, the method adopted when forming unit pixels 20 of the image sensor 12 will be described. As FIG. 2(A) illustrates, the unit pixels 20 each assume a substantially square shape, designed to achieve a predetermined size, and they each include a rectangular photoelectric conversion unit (PD) 30. As FIG. 2(B) indicates, each unit pixel 20 is arranged with a specific orientation achieved by rotating it so that its sides incline at an approximately 45° angle relative to the X direction and the Y direction. It is to be noted that the photoelectric conversion unit 30 itself is not rotated; instead, its aspect ratio is altered. In addition, in order to ensure that the area of this photoelectric conversion unit 30 remains the same as the area of the photoelectric conversion unit 30 illustrated in FIG. 2(A), the four corners of the photoelectric conversion unit 30 are rounded off in correspondence to the shape of the unit pixel 20. A transistor (not illustrated) is arranged in the unit pixel 20.
  • Two different types of unit pixels 20, i.e., pixels used for the purpose of focus detection (hereafter referred to as focus detection pixels), and pixels other than focus detection pixels (hereafter referred to as image-capturing pixels) are arranged at the image sensor 12. Half of the photoelectric conversion unit 30 at a focus detection pixel is shielded or covered with, for instance, a light shielding metal film 40 or the like and thus, only the other half of the photoelectric conversion unit 30 is left unshielded or open, as illustrated in FIGS. 3(A) and 3(B). It is to be noted that unit pixels 20, each having the right half of its photoelectric conversion unit 30 left unshielded (hereafter referred to as right-opening focus detection pixels), as illustrated in FIG. 3(A), and unit pixels 20, each having the left half of its photoelectric conversion unit 30 left unshielded (hereafter referred to as left-opening focus detection pixels), as illustrated in FIG. 3(B), are provided as focus detection pixels.
  • Subsequently, the unit pixels 20 undergo a wiring process and a flattening process of the known art, and then color filter layers are arranged at the unit pixels 20. A color filter layer, through which only the red color component of the light is transmitted, is arranged at an image-capturing pixel (R pixel) that is to receive the red color component light. A color filter layer, through which only the green color component of the light is transmitted, is arranged at an image-capturing pixel (G pixel) that is to receive the green color component light. A color filter layer, through which only the blue color component of the light is transmitted, is arranged at an image-capturing pixel (B pixel) that is to receive the blue color component light. In addition, a color filter layer through which only the green color component of light is transmitted is arranged at a focus detection pixel. Namely, the focus detection pixels receive the green color component light alone.
  • Following the color filter layer formation, the unit pixels 20 undergo an on-chip lens formation process so as to form micro-lenses at the unit pixels 20. The micro-lenses arranged at the unit pixels 20 in the embodiment assume a shape achieved by cutting a round spherical lens on four sides, as illustrated in FIG. 4(A). Since micro-lenses 50 assuming such a shape can be arranged in a denser array, as illustrated in FIG. 4(B), the lens openings can be widened.
  • Next, in reference to FIG. 5, the positional arrangement with which the pixels are arranged at the image sensor 12 will be described. It is to be noted that FIG. 5 illustrates only part of the image sensor 12. In addition, the pixels, each having the left half or the right half thereof shaded in FIG. 5, are focus detection pixels. A pixel with the left half thereof shaded is a right-opening focus detection pixel, whereas a pixel with the right half thereof shaded is a left-opening focus detection pixel. In addition, while FIG. 5 illustrates the micro-lenses 50 as round members in order to simplify the illustration, the actual micro-lenses 50 assume the shape illustrated in FIG. 4(B).
  • The image sensor 12 includes first pixel rows 60, each made up with a plurality of G pixels arranged along the X direction with a predetermined pitch, and second pixel rows 70, each made up with a plurality of B pixels and a plurality of R pixels, arranged at alternate positions along the X direction with the predetermined pitch. The plurality of first pixel rows 60 and the plurality of second pixel rows 70 are arranged at alternate positions along the Y direction with the predetermined pitch. In addition, a first pixel row 60 and an adjacent second pixel row 70 are arranged with an offset relative to each other along the X direction by an extent equal to half the predetermined pitch. As explained earlier, the unit pixels 20 are each arranged so that its sides all incline at approximately 45° relative to the X direction and the Y direction and thus, by arranging adjacent pixel rows with an offset to an extent equivalent to half the predetermined pitch, the plurality of unit pixels 20 can be arranged in a dense array.
  • This positional arrangement adopted for the pixels may be re-phrased as follows. The image sensor 12 includes third pixel rows 80, each made up with a plurality of G pixels arranged along the Y direction with a predetermined pitch, fourth pixel rows 90, each made up with a plurality of B pixels arranged along the Y direction with the predetermined pitch, fifth pixel rows 100, each made up with a plurality of G pixels arranged along the Y direction with the predetermined pitch, and sixth pixel rows 110, each made up with a plurality of R pixels arranged along the Y direction with the predetermined pitch. A fourth pixel row 90 is arranged directly rightward relative to a third pixel row 80, a fifth pixel row 100 is arranged directly rightward relative to the fourth pixel row 90, and a sixth pixel row 110 is arranged directly rightward relative to the fifth pixel row 100. The third through sixth pixel rows 80 through 110 are each arranged with an offset relative to the adjacent pixel rows along the Y direction by an extent equivalent to half the predetermined pitch. At the image sensor 12, a plurality of sets of pixel rows, each formed with a third pixel row 80 through a sixth pixel row 110, are arranged side-by-side along the X direction.
  • In addition, in some of the plurality of first pixel rows 60, a plurality of right-opening focus detection pixels and left-opening focus detection pixels are arranged in place of some of the G pixels. In other words, at some third pixel rows 80 and fifth pixel rows 100 among the plurality of third pixel rows 80 and the plurality of fifth pixel rows 100, a plurality of right-opening focus detection pixels and left-opening focus detection pixels are arranged in place of some of the G pixels. The plurality of right-opening focus detection pixels and left-opening focus detection pixels are arranged successively at alternate positions along the X direction within a specific focus detection area.
  • (Focus Detection Processing)
  • Next, the focus detection processing executed based upon output signals provided from the image sensor 12 will be explained. As illustrated in FIG. 6, a light flux (light) A passing through a first area 201 of an exit pupil 200 of the interchangeable lens 2 enters right-opening focus detection pixels 21 and a light flux (light) B passing through a second area 202 of the exit pupil 200 enters left-opening focus detection pixels 22.
  • In a focus match state in which a sharp image is formed at the image sensor 12, a pair of images formed with the light fluxes from the different pupil positions of the split pupil described above, match each other. Namely, a signal waveform (i.e., a signal string a1, a2, a3, a4 . . . ) obtained through the plurality of right-opening focus detection pixels having received the light flux A and a signal waveform (i.e., a signal string b1, b2, b3, b4 . . . ) obtained through the plurality of left-opening focus detection pixels achieve a match in their shapes.
  • In contrast, in a non-focus match state, in which a sharp image is formed further frontward relative to the image sensor 12 or further rearward relative to the image sensor 12, the pair of images formed with light fluxes resulting from the pupil splitting do not match each other on the image sensor 12. Under these circumstances, the positional relationship (the image shift direction and the image shift quantity) between the signal waveform (the signal string a1, a2, a3, a4, . . . ) obtained through the right-opening focus detection pixels and the signal waveform (the signal string b1, b2, b3, b4, . . . ) obtained through the left-opening focus detection pixels is affected by the extent of deviation relative to the focus match state (i.e., the defocus quantity).
  • The body control unit 14 calculates the focusing condition (defocus quantity) at the interchangeable lens 2 based upon the positional relationship between the signal waveform (the signal string a1, a2, a3, a4, . . . ) obtained through the right-opening focus detection pixels and the signal waveform (the signal string b1, b2, b3, b4, . . . ) obtained through the left-opening focus detection pixels, and transmits the calculation results to the lens control unit 5 where they are used as camera information. Based upon the camera information, the lens control unit 5 drives the focusing lens 7 forward/backward along the optical axis and thus, focus is adjusted so as to form a sharp image on the image sensor 12.
  • (Image Signal Generation Processing)
  • Next, in reference to FIGS. 7(A) through 9, the image signal generation processing executed to generate color image signals based upon output signals provided from the image sensor 12 will be explained. FIGS. 7(A) and 7(B) illustrate how G image signals are generated. Double-headed arrows in FIG. 7(A) each indicate a set of G pixels, the signals output from which are added together. FIG. 7(B) illustrates G image signals obtained as a result of the addition operation. The body control unit 14, executing the image signal generation processing, generates G image signals by adding together the output signals from two G pixels set next to each other along the X direction in each first pixel row 60.
  • In more specific terms, the body control unit 14 adds the output signal from a G pixel located at an mth position along the X direction and an (n+1)th position along the Y direction (hereafter notated as a G(m, n+1) pixel) and the output signal from a G(m+2, n+1) pixel in the first pixel row 60 taking up the (n+1)th position along the Y direction, so as to generate a G image signal corresponding to a G(m+1, n+1) pixel. Likewise, the body control unit 14 adds the output signal from a G(m+4, n+1) pixel and the output signal from a G(m+6, n+1) pixel so as to generate a G image signal corresponding to a G(m+5, n+1) pixel. In this manner it generates G image signals corresponding to a G(m+9, n+1) pixel, a G(m+13, n+1) pixel, a G(m+17, n+1) pixel and so forth in the first pixel row 60 taking up the (n+1)th position along the Y direction.
  • In addition, the body control unit 14 adds the output signal from a G(m+2, n+3) pixel and the output signal from a G(m+4, n+3) pixel in the first pixel row 60 taking up the (n+3)th position along the Y direction, so as to generate a G image signal corresponding to a G(m+3, n+3) pixel. Likewise, the body control unit 14 adds the output signal from a G(m+6, n+3) pixel and the output signal from a G(m+8, n+3) pixel so as to generate a G image signal corresponding to a G(m+7, n+3) pixel. In this manner it generates G image signals corresponding to a G(m+11, n+3) pixel, a G(m+15, n+3) pixel, a G(m+19, n+3) pixel and so forth in the first pixel row 60 taking up the (n+3)th position along the Y direction.
  • In addition, the body control unit 14 generates G image signals in the first pixel rows 60 at the (n+5)th position, the (n+9)th position, the (n+13)th position and so forth along the Y direction in much the same way as that in which it generates G image signals in the first pixel row 60 at the (n+1)th position along the Y direction, as described above, and generates G image signals in the first pixel rows 60 at the (n+7)th position, the (n+11)th position and the (n+15)th position along the Y direction in much the same way as that in which it generates G image signals in the first pixel row 60 at the (n+3)th position along the Y direction. Namely, the sets of G pixels, the output signals from which are added together in a given first pixel row 60, and the sets of G pixels, the output signals from which are added together in the first pixel row 60 arranged next to the given first pixel row along the Y direction, are offset relative to each other along the X direction by an extent equivalent to a single G pixel. As a result, G image signals are generated that correspond to a G pixel array including a plurality of G pixel rows, each formed with a plurality of G pixels arranged along the X direction with a four-pixel pitch, set side-by-side along the Y direction, with each pair of G pixel rows next to each other along the Y direction offset relative to each other along the X direction by an extent equivalent to a two-pixel pitch, as illustrated in FIG. 7(B).
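The pairwise addition pattern just described amounts to summing G pixel outputs two at a time along the X direction with a four-pixel pitch, with the starting position shifted by one G pixel between adjacent rows. A minimal sketch, under the illustrative assumption that one first pixel row 60 is held as a dictionary mapping an X index to a pixel output (the function name and data layout are not part of the embodiment):

```python
def bin_g_row(row, start):
    """Add the output signals of G pixels pairwise along the X direction.

    `row` maps an X index to a G pixel output.  Pairs begin at `start`
    (x and x + 2 are summed into an image signal placed at x + 1) and
    repeat with a four-pixel pitch, as in the pattern of FIG. 7(A).
    """
    signals = {}
    x = start
    while x in row and x + 2 in row:
        signals[x + 1] = row[x] + row[x + 2]
        x += 4
    return signals

# G pixels sit at even X indices; a row at the (n+1)th Y position pairs
# from x = 0, while the next row along Y pairs from x = 2.
row = {x: 100 for x in range(0, 12, 2)}
assert bin_g_row(row, 0) == {1: 200, 5: 200, 9: 200}
```

Calling `bin_g_row(row, 0)` for one row and `bin_g_row(row, 2)` for the adjacent row reproduces the single-G-pixel offset that yields the staggered, two-pixel-pitch arrangement of FIG. 7(B).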
  • In addition, a G(m+6, n+7) pixel and a G(m+10, n+7) pixel in FIG. 7(A) are right-opening focus detection pixels, whereas a G(m+8, n+7) pixel and a G(m+12, n+7) pixel in FIG. 7(A) are left-opening focus detection pixels. The body control unit 14 generates G image signals in conjunction with the focus detection pixels through a similar process, each by adding together the output signal from the right-opening focus detection pixel and the output signal from the left-opening focus detection pixel next to the right-opening focus detection pixel along the X direction. Namely, the body control unit 14 generates a G image signal corresponding to a G(m+7, n+7) pixel by adding together the output signal from the G(m+6, n+7) pixel and the output signal from the G(m+8, n+7) pixel, and generates a G image signal corresponding to a G(m+11, n+7) pixel by adding together the output signal from the G(m+10, n+7) pixel and the output signal from the G(m+12, n+7) pixel. It is to be noted that the G(m+7, n+7) pixel and the G(m+11, n+7) pixel are shaded in FIG. 7(B).
  • The G image signals corresponding to the G(m+7, n+7) pixel and the G(m+11, n+7) pixel are each generated by adding together the output signal from a right-opening focus detection pixel and the output signal from the corresponding left-opening focus detection pixel, each with a half-shielded area, and for this reason these G image signals achieve an output level half that of a signal generated by adding together the output signals from two image-capturing pixels. Accordingly, a G image signal with an output level equivalent to that of a G image signal generated by adding together the output signals from two image-capturing pixels can be obtained by multiplying the signal resulting from the addition of the output signal from a right-opening focus detection pixel and the output signal from the corresponding left-opening focus detection pixel, by 2. Thus, in conjunction with the focus detection pixels, the body control unit 14 generates G image signals each by adding together the output signals from a pair of focus detection pixels arranged at successive positions along the X direction, in much the same way as adding together the output signals from two image-capturing pixels arranged next to each other, and then multiplying the signal representing the sum by 2.
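The level compensation reduces to doubling the pair sum whenever the pair consists of half-shielded focus detection pixels. A sketch under the illustrative assumption that pixel outputs are plain numbers (the function name is not from the patent):

```python
def g_signal(left_out, right_out, is_focus_pair):
    """Sum a pair of adjacent outputs along the X direction.

    A focus detection pair, whose photoelectric conversion units are
    half shielded, delivers roughly half the light of an image-capturing
    pair, so its sum is multiplied by 2 to match the output level of a
    signal obtained from two unshielded image-capturing pixels."""
    total = left_out + right_out
    return 2 * total if is_focus_pair else total

# Two ordinary G pixels versus two half-shielded focus detection pixels
# viewing the same light level: the compensated signals agree.
assert g_signal(100, 100, False) == g_signal(50, 50, True)
```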
  • FIGS. 8(A) and 8(B) illustrate how B image signals and R image signals are generated. Double-headed arrows in FIG. 8(A) each indicate a set of B pixels or R pixels, the signals output from which are added together. FIG. 8(B) illustrates B image signals and R image signals obtained as a result of the addition operation. The body control unit 14, executing the image signal generation processing, generates B image signals each by adding together the output signals from two B pixels set next to each other along the Y direction in a fourth pixel row 90. In addition, the body control unit 14 generates R image signals by adding together the output signals from two R pixels set next to each other along the Y direction in a sixth pixel row 110.
  • In more specific terms, the body control unit 14 adds the output signal from a B pixel located at an (m+1)th position along the X direction and an (n+2)th position along the Y direction (hereafter notated as a B(m+1, n+2) pixel) and the output signal from a B(m+1, n+4) pixel in the fourth pixel row 90 taking up the (m+1)th position along the X direction, so as to generate a B image signal corresponding to a B(m+1, n+3) pixel. Likewise, the body control unit 14 adds the output signal from a B(m+1, n+6) pixel and the output signal from a B(m+1, n+8) pixel so as to generate a B image signal corresponding to a B(m+1, n+7) pixel. In this manner it generates B image signals corresponding to a B(m+1, n+11) pixel, a B(m+1, n+15) pixel, a B(m+1, n+19) pixel and so forth in the fourth pixel row 90 taking up the (m+1)th position along the X direction. The body control unit 14 generates B image signals in the fourth pixel rows 90 at the (m+5)th position, the (m+9)th position, the (m+13)th position, and so forth along the X direction, in much the same way as that in which it generates B image signals in the fourth pixel row 90 at the (m+1)th position along the X direction, as described earlier.
  • In addition, the body control unit 14 adds the output signal from an R pixel located at an (m+3)th position along the X direction and an nth position along the Y direction (hereafter notated as an R(m+3, n) pixel) and the output signal from an R(m+3, n+2) pixel in the sixth pixel row 110 taking up the (m+3)th position along the X direction, so as to generate an R image signal corresponding to an R(m+3, n+1) pixel. Likewise, the body control unit 14 adds the output signal from an R(m+3, n+4) pixel and the output signal from an R(m+3, n+6) pixel so as to generate an R image signal corresponding to an R(m+3, n+5) pixel. In this manner it generates R image signals corresponding to an R(m+3, n+9) pixel, an R(m+3, n+13) pixel, an R(m+3, n+17) pixel and so forth in the sixth pixel row 110 taking up the (m+3)th position along the X direction. The body control unit 14 generates R image signals in the sixth pixel rows 110 at the (m+7)th position, the (m+11)th position, the (m+15)th position, and so forth along the X direction, in much the same way as that in which it generates R image signals in the sixth pixel row 110 at the (m+3)th position along the X direction as described earlier.
  • Namely, the body control unit 14 offsets the sets of B pixels in the fourth pixel rows 90, the output signals from which are added together, and the sets of R pixels in the sixth pixel rows 110, the output signals from which are added together, relative to each other along the Y direction by an extent equivalent to a single B pixel or R pixel. As a result, B image signals and R image signals forming a pixel array that includes a plurality of B pixel rows, each formed with a plurality of B pixels arranged along the X direction with a four-pixel pitch and a plurality of R pixel rows each formed with a plurality of R pixels arranged along the X direction with the four-pixel pitch set at alternate positions along the Y direction with each pair of the B pixel row and the R pixel row formed next to each other along the Y direction offset relative to each other along the X direction by an extent equivalent to a two-pixel pitch, are generated as illustrated in FIG. 8(B).
  • Furthermore, the sets of B pixels and the sets of R pixels, the output signals from which are to be added together, are set by ensuring that the positions of the pixels corresponding to the B image signals and the R image signals generated as a result of the addition operation do not overlap the positions of the pixels corresponding to the G image signals generated as a result of adding together the output signals from the G pixels in the individual G pixel sets, as described earlier. For instance, if the output signal from the B(m+1, n) pixel and the output signal from the B(m+1, n+2) pixel were added together, a B image signal corresponding to a B(m+1, n+1) pixel would be generated, and the position of this B(m+1, n+1) pixel would overlap the position of the G image signal generated for the G(m+1, n+1) pixel by adding together the output signal from the G(m, n+1) pixel and the output signal from the G(m+2, n+1) pixel. In order to avoid such an overlap of the pixel positions of the G image signal and the B image signal, a B image signal is generated for the B(m+1, n+3) pixel by adding together the output signal from the B(m+1, n+2) pixel and the output signal from the B(m+1, n+4) pixel.
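This overlap-avoidance rule can be checked with a small sketch; the column layout and names below are illustrative assumptions. B (or R) outputs are summed pairwise along the Y direction, and starting the B pairs at n+2 rather than n keeps the generated B positions clear of the G positions:

```python
def bin_column(col, start):
    """Add outputs pairwise along the Y direction in a B (or R) pixel column.

    `col` maps a Y index to a pixel output.  Pairs begin at `start`
    (y and y + 2 are summed into a signal placed at y + 1) and repeat
    with a four-pixel pitch.
    """
    signals = {}
    y = start
    while y in col and y + 2 in col:
        signals[y + 1] = col[y] + col[y + 2]
        y += 4
    return signals

b_col = {y: 10 for y in range(0, 10, 2)}   # B pixels at even Y indices
g_positions = {1, 5, 9}                    # G signal positions in the region
b_signals = bin_column(b_col, 2)           # pairs (2, 4) and (6, 8)
assert set(b_signals) == {3, 7}            # B signals land at y = 3 and 7
assert set(b_signals).isdisjoint(g_positions)
```

Starting the same column at `0` instead would place B signals at y = 1 and 5, exactly on top of the G positions, which is the overlap the embodiment avoids.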
  • The body control unit 14 combines the G image signals, the B image signals and the R image signals generated as described above and, as a result, image signals in a square Bayer array, such as that illustrated in FIG. 9, are obtained. It is to be noted that the number of pixels corresponding to the image signals thus obtained is half the number of pixels arranged at the image sensor 12.
  • The body control unit 14 then executes color interpolation processing for the image signals in the Bayer array obtained as described above so as to generate image signals corresponding to the missing color components. A detailed explanation of the color interpolation processing executed for the Bayer array image signals, which is of the known art, will not be provided. Through this color interpolation processing, color image signals (RGB signals) are obtained. The body control unit 14 generates an image file for purposes of, for instance, recording, by using these color image signals, and records the image file thus generated into the memory card 20.
  • The following operations and advantages are achieved through the embodiment described above.
  • (1) The image sensor 12 includes a pixel group made up with a plurality of pixels that receive light fluxes having passed through the interchangeable lens 2. The pixel group includes G pixels, B pixels and R pixels respectively assuming first spectral sensitivity, second spectral sensitivity and third spectral sensitivity different from one another. At the image sensor 12, a plurality of first pixel rows 60, each made up with a plurality of G pixels arranged one after another along the horizontal direction, and a plurality of second pixel rows 70, each made up with a plurality of B pixels and a plurality of R pixels arranged at alternate positions along the horizontal direction, are arranged at alternate positions along the vertical direction, with the first pixel rows 60 offset relative to the second pixel rows 70 along the horizontal direction by an extent approximately equal to a half pitch. In at least some of the plurality of first pixel rows 60, a plurality of focus detection pixels assuming the first spectral sensitivity are arranged in place of at least some G pixels, and the plurality of focus detection pixels are structured so as to receive a pair of light fluxes having passed through a pair of areas of a pupil of the interchangeable lens 2 and output a pair of image signals corresponding to a pair of images formed with the pair of light fluxes. These structural features allow the focus detection pixels to be arranged at successive positions and thus achieve an improvement in focus detection accuracy without complicating the process of interpolation processing executed to generate, through interpolation, image signals at positions corresponding to the focus detection pixels.
  • (2) The image sensor 12 includes a pixel group made up with a plurality of pixels that receive light fluxes having passed through the interchangeable lens 2. The pixel group includes G pixels, B pixels and R pixels respectively assuming first spectral sensitivity, second spectral sensitivity and third spectral sensitivity different from one another. At the image sensor 12, a plurality of pixel row sets 120, each including a third pixel row 80 made up with a plurality of G pixels arranged one after another along the vertical direction, a fourth pixel row 90 set directly rightward relative to the third pixel row 80 and made up with a plurality of B pixels arranged one after another along the vertical direction, a fifth pixel row 100 set directly rightward relative to the fourth pixel row 90 and made up with a plurality of G pixels arranged one after another along the vertical direction and a sixth pixel row 110 set directly rightward relative to the fifth pixel row 100 and made up with a plurality of R pixels arranged one after another along the vertical direction, are arranged one after another along the horizontal direction. Each pixel row among the third through sixth pixel rows 80 through 110 is arranged with an offset relative to the next pixel row along the vertical direction by an extent approximately equal to a half pitch. In at least some of the plurality of third pixel rows 80 and the plurality of fifth pixel rows 100, a plurality of focus detection pixels assuming the first spectral sensitivity are arranged in place of at least some G pixels, and the plurality of focus detection pixels are structured so as to receive a pair of light fluxes having passed through a pair of areas of the pupil of the interchangeable lens 2 and output a pair of image signals corresponding to a pair of images formed with the pair of light fluxes.
These structural features allow the focus detection pixels to be arranged at successive positions and thus achieve an improvement in focus detection accuracy without complicating the process of interpolation processing executed to generate, through interpolation, image signals at positions corresponding to the focus detection pixels.
  • (3) The unit pixels 20 in the image sensor 12 described in (1) or (2) above assume a square shape in a plan view, with each side thereof inclining by approximately 45° relative to the X direction and the Y direction. As a result, a higher aperture ratio is assured at the unit pixels 20, which, in turn, makes it possible to improve the focus detection accuracy. In other words, the image sensor 12 can be provided as a more compact unit without lowering the aperture ratio.
  • (4) The digital camera 1 includes the image sensor 12 described in (1) or (2) above, a body control unit 14 that generates Bayer array signals by adding together the output signals from two G pixels arranged next to each other along the horizontal direction, adding together the output signals from two B pixels set next to each other along the vertical direction and adding together the output signals from two R pixels set next to each other along the vertical direction, and a body control unit 14 that generates color image signals based upon the Bayer array signals. In the digital camera 1 configured as described above, the existing image processing engine enabling color interpolation processing on Bayer array image data can be used for the color interpolation processing executed therein.
  • (5) The image sensor 12 in the digital camera described in (4) above includes right-opening focus detection pixels 21 that receive a light flux having passed through a first area 201 of the exit pupil 200 of the interchangeable lens 2 and left-opening focus detection pixels 22 that receive a light flux having passed through a second area 202 of the exit pupil 200 of the interchangeable lens 2. The plurality of right-opening focus detection pixels 21 and the plurality of left-opening focus detection pixels 22 are arranged at alternate positions. The body control unit 14 adds together the output signal from the right-opening focus detection pixel 21 and the output signal from the left-opening focus detection pixel 22 arranged next to the right-opening focus detection pixel 21 along the horizontal direction, multiplies the signal resulting from the addition operation by a predetermined factor (two), and generates color image signals based upon the signals each resulting from the multiplication operation and the Bayer array signals. Through these measures, image signals can be generated with ease simply by using the output signals provided from the focus detection pixels.
  • (Variation 1)
  • In the embodiment described above, focus detection pixels are arranged in place of some of the G pixels in the third pixel rows 80 and the fifth pixel rows 100. As an alternative, however, the present invention may be adopted in conjunction with third pixel rows 80 and fifth pixel rows 100 each entirely made up with focus detection pixels, as illustrated in FIGS. 10(A) and 10(B).
  • For instance, the third pixel rows 80 may each be entirely made up with left-opening focus detection pixels and the fifth pixel rows 100 may each be made up entirely with right-opening focus detection pixels, as illustrated in FIG. 10(A). In this configuration, a left-opening focus detection pixel and a right-opening focus detection pixel are set at alternate positions, in reiteration along the X direction in each first pixel row 60.
  • As a further alternative, left-opening focus detection pixels and right-opening focus detection pixels may be arranged at alternate positions along the Y direction in the third pixel rows 80 and left-opening focus detection pixels and right-opening focus detection pixels may be arranged at alternate positions along the Y direction in the fifth pixel rows 100 as well, as illustrated in FIG. 10(B). The focus detection pixels in this configuration are arranged so that each first pixel row 60 includes a left-opening focus detection pixel and a right-opening focus detection pixel set at alternate positions, in reiteration along the X direction.
  • In variation 1, which includes focus detection pixels arranged through the entire image sensor 12, focus detection can be executed at any position on the photographic image plane without having to execute focus detection in a limited focus detection area.
  • (Variation 2)
  • In the embodiment described above, the image shift quantity is calculated along the X direction via the left-opening focus detection pixels and the right-opening focus detection pixels and thus, the defocus quantity is detected along the X direction. However, the present invention may be adopted in conjunction with upper-opening focus detection pixels, each having the upper half of the photoelectric conversion unit 30 thereof remaining unshielded, and lower-opening focus detection pixels, each having the lower half of the photoelectric conversion unit 30 thereof remaining unshielded, so as to calculate an image shift quantity along the Y direction and detect the defocus quantity along the Y direction.
  • In this variation, upper-opening focus detection pixels and lower-opening focus detection pixels are arranged at alternate positions along the Y direction in place of at least some of the G pixels in a third pixel row 80 and a fifth pixel row 100. In addition, at least two focus detection pixel rows, each made up with focus detection pixels arranged one after another along the Y direction, are arranged next to each other along the X direction so that the upper-opening focus detection pixels and the lower-opening focus detection pixels are also set at alternate positions along the X direction. As a result, by adding together the output signal from the upper-opening focus detection pixel and the output signal from the lower-opening focus detection pixel present directly next to the upper-opening focus detection pixel along the X direction and then multiplying the sum by 2, as in the embodiment described earlier, a G image signal equivalent to that output from an image-capturing pixel can be generated.
  • Furthermore, the image sensor 12 may include upper-opening focus detection pixels and lower-opening focus detection pixels in addition to left-opening focus detection pixels and right-opening focus detection pixels.
  • (Variation 3)
  • At each focus detection pixel in the embodiment described above, half of the photoelectric conversion unit 30 remains unshielded or open. However, the present invention is not limited to this example and it may be adopted in conjunction with focus detection pixels each equipped with two photoelectric conversion units. The two photoelectric conversion units in the focus detection pixel individually receive the pair of light fluxes having passed through the pair of areas at the exit pupil 200 of the interchangeable lens 2.
  • (Variation 4)
  • The micro-lenses in the unit pixels 20 in the embodiment described above assume a shape achieved by cutting off a spherical lens on four sides, as illustrated in FIG. 4(A). However the present invention is not limited to this example and it may be adopted in conjunction with spherical micro-lenses.
  • (Variation 5)
  • The unit pixels 20 in the embodiment described above take on a substantially square shape. However, the present invention is not limited to this example and it may instead be adopted in conjunction with unit pixels assuming an octagonal shape that includes four sides inclining by 45° relative to the X direction and the Y direction.
  • In addition, while the unit pixels 20 in the embodiment described above are arranged with an orientation achieved by rotating the sides thereof by approximately 45° relative to the X direction and the Y direction, the present invention is not limited to this example and the unit pixels 20 may each be arranged with an orientation allowing each side thereof to extend parallel to the X direction or the Y direction. In this case, too, the adjacent unit pixels 20 should be offset relative to each other by a half pitch so as to achieve a staggered positional arrangement.
  • (Variation 6)
  • While the image sensor 12 in the embodiment described above includes primary (RGB) color filters, the present invention may be adopted in conjunction with complementary (CMY) color filters.
  • (Variation 7)
  • In the embodiment described above, the present invention is adopted in the digital camera 1 with the interchangeable lens 2 mounted at the camera body 3. However, the present invention is not limited to this example and it may be adopted in a digital camera having an integrated lens.
  • It is to be noted that the embodiment described above simply represents an example and the present invention is in no way limited to the structural particulars of the embodiment as long as the features characterizing the present invention remain intact. In addition, structures achieved in the variations may be adopted in conjunction with the embodiment in any combination.

Claims (10)

1. An image sensor, comprising:
a plurality of light receiving units arranged along a first direction and along a second direction different from the first direction in a two-dimensional array; and
a light receiving unit arranged at a central position among four light receiving units set next to each other along the first direction and the second direction among the plurality of light receiving units, which includes a light shielding member arranged over part thereof.
2. An image sensor according to claim 1, wherein:
the light receiving unit having the light shielding member arranged over part thereof is a light receiving unit used for purposes of focus detection.
3. An image sensor comprising:
a pixel group including a plurality of pixels that receive light fluxes having passed through an image-capturing optical system, wherein:
the pixel group includes first pixels, second pixels and third pixels respectively assuming first spectral sensitivity, second spectral sensitivity and third spectral sensitivity different from one another;
a plurality of first pixel rows, each including a plurality of first pixels arranged one after another along a first direction, and a plurality of second pixel rows, each including a plurality of second pixels and a plurality of third pixels arranged at alternate positions along the first direction, are arranged at alternate positions along a second direction running perpendicular to the first direction;
the first pixel rows are offset relative to the second pixel rows along the first direction by an extent approximately equal to a half pitch;
a plurality of focus detection pixels assuming the first spectral sensitivity are arranged in place of at least some of the first pixels in at least some of the plurality of first pixel rows; and
the plurality of focus detection pixels receive a pair of light fluxes having passed through a pair of areas of a pupil of the image-capturing optical system and output a pair of image signals corresponding to a pair of images formed with the pair of light fluxes.
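The row pattern recited in the claim above can be sketched as follows. This is a minimal illustrative model, not the patented implementation: 'G', 'B' and 'R' stand in for the first, second and third spectral sensitivities, 'AF' marks a focus detection pixel substituted for a first pixel, and the grid dimensions are assumed:

```python
# Sketch of the pixel-row pattern of the claim (illustrative sizes).
def build_rows(num_rows, row_len, af_rows=()):
    rows = []
    for y in range(num_rows):
        if y % 2 == 0:
            # first pixel row: first pixels only, with focus detection
            # pixels arranged in place of the first pixels in some rows
            row = ['AF' if y in af_rows else 'G' for _ in range(row_len)]
        else:
            # second pixel row: second and third pixels at alternate positions
            row = ['B' if x % 2 == 0 else 'R' for x in range(row_len)]
        rows.append(row)
    # adjacent rows are understood to be offset by approximately a half
    # pitch along the first direction (not represented in this grid)
    return rows

layout = build_rows(4, 6, af_rows=(2,))
assert layout[0] == ['G'] * 6
assert layout[1] == ['B', 'R', 'B', 'R', 'B', 'R']
assert layout[2] == ['AF'] * 6
```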
4. An image sensor comprising:
a pixel group including a plurality of pixels that receive light fluxes having passed through an image-capturing optical system, wherein:
the pixel group includes first pixels, second pixels and third pixels respectively assuming first spectral sensitivity, second spectral sensitivity and third spectral sensitivity different from one another;
a plurality of pixel row sets, each containing a first pixel row including a plurality of first pixels arranged one after another along a second direction, a second pixel row arranged immediately rightward relative to the first pixel row and including a plurality of second pixels arranged one after another along the second direction, a third pixel row arranged immediately rightward relative to the second pixel row and including a plurality of first pixels arranged one after another along the second direction and a fourth pixel row arranged immediately rightward relative to the third pixel row and including a plurality of third pixels arranged one after another along the second direction, are set one after another along a first direction running perpendicular to the second direction;
each pixel row among the first through fourth pixel rows is arranged with an offset relative to an adjacent pixel row by an extent equal to a half pitch along the second direction;
a plurality of focus detection pixels assuming the first spectral sensitivity are arranged in place of at least some of the first pixels in at least some of the plurality of first pixel rows and the plurality of third pixel rows; and
the plurality of focus detection pixels receive a pair of light fluxes having passed through a pair of areas of a pupil of the image-capturing optical system and output a pair of image signals corresponding to a pair of images formed with the pair of light fluxes.
5. An image sensor according to claim 3, wherein:
the first pixels each include a green color filter, the second pixels each include a blue color filter and the third pixels each include a red color filter.
6. An image sensor according to claim 3, wherein:
the pixels each assume a polygonal shape, in a plan view, which includes four sides inclining by approximately 45° relative to the first direction.
7. An image sensor according to claim 3, wherein:
the focus detection pixels are arranged in place of all the first pixels.
8. An image-capturing device, comprising:
an image sensor according to claim 3;
an image generation unit that generates image signals based upon output signals provided from the image sensor; and
a focus detection unit that detects a focusing condition at the image-capturing optical system based upon output signals provided from the image sensor.
9. An image-capturing device, comprising:
an image sensor according to claim 5;
an adding unit that adds together output signals from two first pixels arranged next to each other along the first direction, adds together output signals from two second pixels arranged next to each other along the second direction and adds together output signals from two third pixels arranged next to each other along the second direction so as to output Bayer array signals; and
an image generation unit that generates image signals corresponding to images formed with light fluxes having passed through the image-capturing optical system based upon the Bayer array signals output by the adding unit.
10. An image-capturing device according to claim 9, wherein:
the focus detection pixels at the image sensor include a plurality of first focus detection pixels that receive one of the pair of light fluxes and a plurality of second focus detection pixels that receive the other of the pair of light fluxes, and the first focus detection pixels and the second focus detection pixels are arranged at alternate positions;
the adding unit outputs multiplication result signals each obtained by adding together output signals from a first focus detection pixel and a second focus detection pixel set next to each other along the first direction and multiplying a signal resulting from addition by a predetermined factor; and
the image generation unit generates the image signals based upon the Bayer array signals output from the adding unit and the multiplication result signals.
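The signal handling recited in claims 9 and 10 — pairwise addition of output signals from same-color pixels to produce Bayer array signals, and, for a focus detection pixel pair, multiplication of the summed signal by a predetermined factor — might be sketched as follows. The pixel values and the factor of 2.0 are assumed examples, not values from the patent:

```python
# Sketch of the adding unit of claims 9 and 10 (illustrative values).
def add_pair(a, b):
    """Add output signals from two same-color pixels set next to each other."""
    return a + b

def af_pair_signal(first_af, second_af, factor):
    """Add the outputs of a first/second focus detection pixel pair, then
    multiply the resulting sum by a predetermined factor (claim 10)."""
    return (first_af + second_af) * factor

# Two adjacent first (green) pixels summed into one Bayer 'G' sample:
g_bayer = add_pair(100, 110)

# Each focus detection pixel receives only one of the pair of light
# fluxes, so the factor compensates the summed signal toward the level
# of a normal pixel pair; 2.0 is an assumed value.
g_from_af = af_pair_signal(52, 48, 2.0)

assert g_bayer == 210
assert g_from_af == 200.0
```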
US14/405,196 2012-06-06 2013-05-31 Imaging element and imaging device Abandoned US20150222805A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/642,686 US10412294B2 (en) 2012-06-06 2017-07-06 Image sensor and image-capturing device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2012-129207 2012-06-06
JP2012129207A JP5966636B2 (en) 2012-06-06 2012-06-06 Imaging device and imaging apparatus
PCT/JP2013/065216 WO2013183561A1 (en) 2012-06-06 2013-05-31 Imaging element and imaging device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/065216 A-371-Of-International WO2013183561A1 (en) 2012-06-06 2013-05-31 Imaging element and imaging device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/642,686 Continuation US10412294B2 (en) 2012-06-06 2017-07-06 Image sensor and image-capturing device

Publications (1)

Publication Number Publication Date
US20150222805A1 2015-08-06

Family

ID=49711953

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/405,196 Abandoned US20150222805A1 (en) 2012-06-06 2013-05-31 Imaging element and imaging device
US15/642,686 Active US10412294B2 (en) 2012-06-06 2017-07-06 Image sensor and image-capturing device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/642,686 Active US10412294B2 (en) 2012-06-06 2017-07-06 Image sensor and image-capturing device

Country Status (5)

Country Link
US (2) US20150222805A1 (en)
JP (1) JP5966636B2 (en)
CN (1) CN104508531A (en)
IN (1) IN2015DN00052A (en)
WO (1) WO2013183561A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170118398A1 * 2014-06-24 2017-04-27 Sony Corporation Image sensor, calculation method, and electronic device
US10212332B2 * 2014-06-24 2019-02-19 Sony Corporation Image sensor, calculation method, and electronic device for autofocus

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6507527B2 (en) * 2014-08-29 2019-05-08 株式会社ニコン Imaging device and imaging device
WO2017057496A1 (en) 2015-09-30 2017-04-06 日産化学工業株式会社 Liquid crystal display element
JP6315032B2 (en) * 2016-07-07 2018-04-25 株式会社ニコン Imaging device and imaging apparatus
JP6655218B2 (en) * 2017-03-30 2020-02-26 富士フイルム株式会社 Imaging device and image processing method
JP2019114728A (en) * 2017-12-26 2019-07-11 ソニーセミコンダクタソリューションズ株式会社 Solid state imaging apparatus, distance measurement device, and manufacturing method
JP2019057948A (en) * 2018-12-28 2019-04-11 株式会社ニコン Imaging device
JPWO2022130888A1 (en) 2020-12-16 2022-06-23

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100091161A1 (en) * 2007-06-16 2010-04-15 Nikon Corporation Solid-state image sensor and imaging apparatus equipped with solid-state image sensor
US20100188532A1 * 2008-11-27 2010-07-29 Nikon Corporation Image sensor and image-capturing device
US20100214453A1 (en) * 2009-02-17 2010-08-26 Nikon Corporation Backside illumination image sensor, manufacturing method thereof and image-capturing device
US20110234861A1 (en) * 2010-03-23 2011-09-29 Fujifilm Corporation Imaging device
US20110273602A1 * 2009-03-02 2011-11-10 Canon Kabushiki Kaisha Optical Device and Signal Processor

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0677450A (en) 1992-06-25 1994-03-18 Sony Corp Solid-state image pickup element
JP2002289828A (en) 2001-03-28 2002-10-04 Fuji Film Microdevices Co Ltd Color imaging device
WO2004112380A1 (en) * 2003-06-17 2004-12-23 Matsushita Electric Industrial Co., Ltd. Information generating apparatus, image pickup apparatus and image pickup method
JP5060216B2 (en) * 2007-08-30 2012-10-31 キヤノン株式会社 Imaging device
JP5211590B2 (en) * 2007-09-10 2013-06-12 株式会社ニコン Image sensor and focus detection apparatus
JP5219865B2 (en) 2008-02-13 2013-06-26 キヤノン株式会社 Imaging apparatus and focus control method
JP5504874B2 (en) * 2009-02-23 2014-05-28 株式会社ニコン Imaging device and imaging apparatus
JP2010210903A (en) * 2009-03-10 2010-09-24 Nikon Corp Image capturing apparatus
US8350940B2 (en) * 2009-06-08 2013-01-08 Aptina Imaging Corporation Image sensors and color filter arrays for charge summing and interlaced readout modes
US8724928B2 (en) * 2009-08-31 2014-05-13 Intellectual Ventures Fund 83 Llc Using captured high and low resolution images
JP2011054911A (en) * 2009-09-04 2011-03-17 Sony Corp Solid-state imaging device and method of manufacturing the same, and electronic apparatus
JP5045801B2 (en) * 2009-09-09 2012-10-10 株式会社ニコン Focus detection device, photographing lens unit, imaging device, and camera system
JP2011077770A (en) 2009-09-30 2011-04-14 Fujifilm Corp Controller of solid-state imaging device and operation control method thereof
JP5232118B2 (en) 2009-09-30 2013-07-10 富士フイルム株式会社 Imaging device and electronic camera
JP5454223B2 (en) 2010-02-25 2014-03-26 株式会社ニコン camera
JP5434761B2 (en) 2010-04-08 2014-03-05 株式会社ニコン Imaging device and imaging apparatus
CN102870402B (en) 2010-04-30 2016-08-03 富士胶片株式会社 Imaging device and formation method
JP5453173B2 (en) 2010-05-28 2014-03-26 富士フイルム株式会社 Imaging device, solid-state imaging device having phase difference detection pixel, and driving control method of imaging device
JP5935237B2 (en) * 2011-03-24 2016-06-15 ソニー株式会社 Solid-state imaging device and electronic apparatus
JP5791349B2 (en) * 2011-04-21 2015-10-07 キヤノン株式会社 Imaging apparatus and control method thereof
JP5814626B2 (en) * 2011-05-27 2015-11-17 キヤノン株式会社 Photoelectric conversion device and method of manufacturing photoelectric conversion device



Also Published As

Publication number Publication date
US10412294B2 (en) 2019-09-10
JP2013254076A (en) 2013-12-19
CN104508531A (en) 2015-04-08
IN2015DN00052A (en) 2015-05-22
US20170302846A1 (en) 2017-10-19
WO2013183561A1 (en) 2013-12-12
JP5966636B2 (en) 2016-08-10

Similar Documents

Publication Publication Date Title
US10412294B2 (en) Image sensor and image-capturing device
US10560669B2 (en) Image sensor and image-capturing device
JP7001080B2 (en) Imaging device
US8730373B2 (en) Image forming apparatus
US11747714B2 (en) Image sensor and image-capturing device that selects pixel signal for focal position
US9215447B2 (en) Imaging device and imaging apparatus including a pixel with light receiving region on one side of a center of the pixel
JP2012211942A (en) Solid-state image sensor and image pickup apparatus
JP6315032B2 (en) Imaging device and imaging apparatus
JP6031835B2 (en) Imaging device
JP6566072B2 (en) Imaging device and imaging apparatus
JP2014215526A (en) Imaging element and camera

Legal Events

Date Code Title Description
AS Assignment

Owner name: NIKON CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MURATA, HIRONOBU;REEL/FRAME:035501/0733

Effective date: 20150311

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION