WO2014097792A1 - Imaging device, signal processing method, and signal processing program - Google Patents

Imaging device, signal processing method, and signal processing program Download PDF

Info

Publication number
WO2014097792A1
Authority
WO
WIPO (PCT)
Prior art keywords
phase difference
block
difference detection
gain value
correction gain
Prior art date
Application number
PCT/JP2013/080992
Other languages
English (en)
Japanese (ja)
Inventor
一文 菅原
Original Assignee
富士フイルム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社
Publication of WO2014097792A1

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/34Systems for automatic generation of focusing signals using different areas in a pupil plane
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/672Focus control based on electronic image sensor signals based on the phase difference signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/703SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/704Pixels specially adapted for focusing, e.g. phase difference pixel sets

Definitions

  • the present invention relates to an imaging apparatus, a signal processing method, and a signal processing program.
  • In recent years, with the increase in the resolution of solid-state imaging devices such as CCD (Charge Coupled Device) image sensors and CMOS (Complementary Metal Oxide Semiconductor) image sensors, demand for information devices having an imaging function, such as digital still cameras, digital video cameras, mobile phones, and PDAs (Personal Digital Assistants), has increased rapidly.
  • Since the phase difference AF (Auto Focus) method can detect the in-focus position at a higher speed and with higher accuracy than the contrast AF method, it is widely used in various imaging apparatuses.
  • In such an imaging apparatus, pairs of phase difference detection pixels whose light shielding film openings are decentered in opposite directions are discretely distributed over the entire light receiving surface of the solid-state imaging device.
  • A phase difference detection pixel has a smaller light shielding film opening than the other normal pixels (imaging pixels), so its output signal is too weak to be used as part of the captured image signal as it is. Therefore, the output signal of the phase difference detection pixel needs to be corrected.
  • Patent Documents 1 to 4 disclose imaging apparatuses that use an interpolation correction process, which generates the output signal of a phase difference detection pixel from the output signals of surrounding normal pixels, and a gain correction process, which corrects the output signal of a phase difference detection pixel by amplifying it with a gain.
  • Patent Documents 1 to 4 do not particularly mention a method for generating a correction gain value.
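  • The two correction approaches can be summarized as follows. The sketch below is a minimal Python illustration written for this description, not code from the patent documents; the function names and the use of a simple mean for interpolation are assumptions.

```python
import numpy as np

def gain_correction(pd_signal, gain):
    # Gain correction: amplify the phase difference detection pixel's own
    # output signal by a correction gain value.
    return pd_signal * gain

def interpolation_correction(neighbor_signals):
    # Interpolation correction: replace the output with an estimate computed
    # from surrounding normal (imaging) pixels of the same color.
    return float(np.mean(neighbor_signals))
```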
  • the present invention has been made in view of the above circumstances, and an object thereof is to provide an imaging apparatus and a signal processing method capable of improving the correction accuracy of an output signal of a phase difference detection pixel.
  • The imaging device of the present invention is an imaging device having a solid-state imaging device that images a subject through an imaging optical system including a focus lens. The solid-state imaging device includes a plurality of imaging pixels arranged in a two-dimensional array on a light receiving surface and a plurality of types of phase difference detection pixels having different structures. The imaging device comprises: a gain correction processing unit that corrects the output signal value of a phase difference detection pixel included in the captured image signal output from the solid-state imaging device by multiplying the output signal value by a correction gain value corresponding to that phase difference detection pixel; and a correction gain value generation unit that divides the area of the light receiving surface in which the phase difference detection pixels are arranged into a plurality of blocks, each including the plurality of types of phase difference detection pixels and the imaging pixels, and generates, for each group of phase difference detection pixels of the same type belonging to each block, a correction gain value corresponding to that group using a captured image signal obtained by imaging a subject with the solid-state imaging device.
  • According to this configuration, a correction gain value corresponding to each group of phase difference detection pixels of the same type belonging to each block is generated.
  • If a gain value were generated for each individual phase difference detection pixel, the gain value would have low reliability whenever the imaging pixels around that phase difference detection pixel are overexposed or underexposed.
  • When one gain value is generated per group, the signals of all phase difference detection pixels of the group and the signals of the imaging pixels around each of them can be used together, so the effect of a few overexposed or underexposed imaging pixels on the reliability of the gain value is reduced.
  • The signal processing method of the present invention is a signal processing method for processing a captured image signal output from a solid-state imaging device that images a subject through an imaging optical system including a focus lens, the solid-state imaging device including a plurality of imaging pixels arranged in a two-dimensional array on a light receiving surface and a plurality of types of phase difference detection pixels having different structures. The method comprises: a gain correction processing step of correcting the output signal value of a phase difference detection pixel included in the captured image signal output from the solid-state imaging device by multiplying the output signal value by a correction gain value corresponding to that phase difference detection pixel; and a correction gain value generation step of dividing the area of the light receiving surface in which the phase difference detection pixels are arranged into a plurality of blocks, each including the plurality of types of phase difference detection pixels and the imaging pixels, and generating, for each group of phase difference detection pixels of the same type belonging to each block, a correction gain value corresponding to that group using a captured image signal obtained by imaging a subject with the solid-state imaging device.
  • The signal processing program of the present invention is a signal processing program for processing a captured image signal output from a solid-state imaging device that images a subject through an imaging optical system including a focus lens, the solid-state imaging device including a plurality of imaging pixels arranged in a two-dimensional array on a light receiving surface and a plurality of types of phase difference detection pixels having different structures. The program causes a computer to execute: a gain correction processing step of correcting the output signal value of a phase difference detection pixel included in the captured image signal output from the solid-state imaging device by multiplying the output signal value by a correction gain value corresponding to that phase difference detection pixel; and a correction gain value generation step of dividing the area of the light receiving surface in which the phase difference detection pixels are arranged into a plurality of blocks, each including the plurality of types of phase difference detection pixels and the imaging pixels, and generating, for each group of phase difference detection pixels of the same type belonging to each block, a correction gain value corresponding to that group using a captured image signal obtained by imaging a subject with the solid-state imaging device.
  • According to the present invention, it is possible to provide an imaging apparatus, a signal processing method, and a signal processing program capable of improving the correction accuracy of the output signal of a phase difference detection pixel.
  • FIG. 1 is a diagram showing the schematic configuration of a digital camera as an example of an imaging device for describing one embodiment of the present invention.
  • FIG. 2 is a schematic plan view showing the schematic configuration of the solid-state imaging device 3 mounted on the digital camera shown in FIG. 1.
  • FIG. 3 is a functional block diagram of the digital signal processing unit 17 of the digital camera shown in FIG. 1.
  • FIG. 4 is a schematic plan view showing the light receiving surface of the solid-state imaging device 3 shown in FIG. 1.
  • FIG. 1 is a diagram showing a schematic configuration of a digital camera as an example of an imaging apparatus for explaining an embodiment of the present invention.
  • the imaging system of the digital camera shown in FIG. 1 includes an imaging optical system having a photographic lens 1 including a focus lens and an aperture 2, and a solid-state imaging device 3 such as a CCD image sensor or a CMOS image sensor.
  • This digital camera has a mount mechanism (not shown), and an imaging optical system can be attached and detached by this mount mechanism.
  • the mount mechanism may be omitted, and the imaging optical system may be fixed to the digital camera.
  • the solid-state imaging device 3 has a configuration in which a plurality of imaging pixels and two types of phase difference detection pixels that respectively receive a pair of light beams that have passed through different portions of the pupil region of the imaging optical system are two-dimensionally arranged.
  • The solid-state imaging device 3 receives the subject image formed by the photographing lens 1 and outputs a captured image signal, and also outputs a pair of image signals corresponding to the pair of light beams.
  • the system control unit 11 that controls the entire electric control system of the digital camera controls the flash light emitting unit 12 and the light receiving unit 13. Further, the system control unit 11 controls the lens driving unit 8 to adjust the position of the focus lens included in the photographing lens 1. Further, the system control unit 11 adjusts the exposure amount by controlling the aperture amount of the aperture 2 via the aperture drive unit 9.
  • system control unit 11 drives the solid-state imaging device 3 via the solid-state imaging device driving unit 10 and outputs a subject image captured through the photographing lens 1 as a captured image signal.
  • An instruction signal from the user is input to the system control unit 11 through the operation unit 14.
  • The electric control system of the digital camera further includes an analog signal processing unit 6, connected to the output of the solid-state imaging device 3, that performs analog signal processing such as correlated double sampling, and an A/D conversion circuit 7 that converts the RGB color signals output from the analog signal processing unit 6 into digital signals.
  • the analog signal processing unit 6 and the A / D conversion circuit 7 are controlled by the system control unit 11.
  • The electric control system of the digital camera also includes a main memory 16, a memory control unit 15 connected to the main memory 16, a digital signal processing unit 17 that performs various kinds of image processing on the captured image signal obtained by imaging with the solid-state imaging device 3, a focus detection unit 19 that calculates the defocus amount of the photographing lens 1 based on the phase difference between the pair of image signals output from the two types of phase difference detection pixels of the solid-state imaging device 3, and an external memory control unit 20 to which a detachable recording medium 21 is connected.
  • The memory control unit 15, digital signal processing unit 17, compression/decompression processing unit 18, focus detection unit 19, external memory control unit 20, and display control unit 22 are mutually connected by a control bus 24 and a data bus 25, and are controlled by commands from the system control unit 11.
  • FIG. 2 is a schematic plan view showing a schematic configuration of the solid-state imaging device 3 mounted on the digital camera shown in FIG.
  • The solid-state imaging device 3 is provided with phase difference detection pixels scattered over all or part of the light receiving surface on which the pixels are two-dimensionally arranged.
  • FIG. 2 is an enlarged view of a portion of the light receiving surface where the phase difference detection pixels are provided.
  • the solid-state imaging device 3 includes a large number of pixels (each square in the drawing) arranged two-dimensionally (in the example of FIG. 2 in a square lattice shape) in the row direction X and the column direction Y orthogonal thereto.
  • a large number of pixels are arranged such that pixel rows composed of a plurality of pixels arranged in a row direction X at a constant pitch are arranged in a column direction Y at a constant pitch.
  • the large number of pixels includes an imaging pixel 30, a phase difference detection pixel 31L, and a phase difference detection pixel 31R.
  • Each pixel includes a photoelectric conversion unit that receives light and converts it into electric charges.
  • The imaging pixel 30 is a pixel that receives both of a pair of light beams that have passed through two different portions of the pupil region of the photographing lens 1 shown in FIG. 1 (for example, light that has passed through the left side and light that has passed through the right side with respect to the main axis of the photographing lens 1).
  • The phase difference detection pixel 31L is a pixel that receives one of the pair of light beams; compared with the imaging pixel 30, the opening of its photoelectric conversion unit (the region not hatched) is decentered to the left.
  • The phase difference detection pixel 31R is a pixel that receives the other of the pair of light beams; compared with the imaging pixel 30, the opening of its photoelectric conversion unit (the region not hatched) is decentered to the right.
  • The configuration of the phase difference detection pixel is not limited to that described above, and a well-known configuration can be employed.
  • For example, two pixels may be configured to have a function equivalent to that of the phase difference detection pixels 31R and 31L.
  • the openings of the photoelectric conversion units may have the same configuration, and the phase difference detection pixels 31R and 31L may be configured by decentering the microlenses provided above the photoelectric conversion units in the row direction X in opposite directions.
  • the pair of image signals having the phase difference in the row direction X is detected by the phase difference detection pixel 31R and the phase difference detection pixel 31L.
  • The eccentric direction of the light shielding film opening may instead be, for example, the column direction Y, so that a pair of image signals having a phase difference in the column direction Y is detected.
  • a color filter is mounted above the photoelectric conversion unit included in each pixel, and the array of the color filter is a Bayer array for all of the many pixels constituting the solid-state imaging device 3.
  • R is marked on a pixel on which a color filter that transmits red (R) light is mounted.
  • G is written in a pixel on which a color filter that transmits green (G) light is mounted.
  • B is marked on a pixel on which a color filter that transmits blue (B) light is mounted. Note that the color filter may have four or more colors.
  • the phase difference detection pixels 31L are arranged at intervals of three pixels in the third and ninth pixel rows from the top of FIG. 2 at the positions of pixels on which color filters that transmit green (G) light are mounted.
  • the phase difference detection pixels 31R are arranged at intervals of three pixels at the positions of the pixels on which color filters that transmit green (G) light are mounted in the fourth and tenth pixel rows from the top in FIG.
  • phase difference detection pixels 31L and the phase difference detection pixels 31R adjacent in the oblique direction form a pair, and the light receiving surface of the solid-state imaging device 3 has a plurality of pairs.
  • the pair of phase difference detection pixels may not be adjacent to each other, and may be separated by several pixels.
  • The focus detection unit 19 shown in FIG. 1 uses the signal groups read from the phase difference detection pixels 31L and the phase difference detection pixels 31R to calculate the amount and direction of deviation from the in-focus state of the photographing lens 1, that is, the defocus amount.
  • The system control unit 11 illustrated in FIG. 1 performs focus adjustment by controlling the position of the focus lens included in the photographing lens 1 based on the defocus amount calculated by the focus detection unit 19.
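  • For reference, the following is a minimal sketch of how a phase difference between the 31L and 31R signal sequences might be estimated and converted to a defocus amount; it is not taken from the patent. The SAD-based search and the linear conversion factor are assumptions, and the factor depends on the optical system.

```python
import numpy as np

def estimate_phase_difference(left_signals, right_signals, max_shift=16):
    # Find the shift (in pixels) that best aligns the two signal sequences
    # by minimizing the mean absolute difference (assumes len > max_shift).
    left = np.asarray(left_signals, dtype=float)
    right = np.asarray(right_signals, dtype=float)
    n = len(left)
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)
        cost = np.abs(left[lo:hi] - right[lo - s:hi - s]).mean()
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

def defocus_amount(phase_difference, conversion_factor):
    # Convert the detected phase difference to a defocus amount.
    # conversion_factor is an assumed optical-system-dependent constant.
    return phase_difference * conversion_factor
```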
  • The system control unit 11 causes the solid-state imaging device 3 to perform imaging, and the captured image signal (the set of output signals from each pixel) output from the solid-state imaging device 3 by this imaging is taken into the digital signal processing unit 17.
  • The digital signal processing unit 17 corrects the output signal of the phase difference detection pixel included in the captured image signal, and records the corrected captured image signal in the main memory 16. Further, the digital signal processing unit 17 performs image processing on the recorded captured image signal to generate captured image data. This image processing includes demosaic processing, gamma correction processing, white balance adjustment processing, and the like.
  • FIG. 3 is a functional block diagram of the digital signal processing unit 17 in the digital camera shown in FIG.
  • the digital signal processing unit 17 includes a gain correction processing unit 171, a correction gain value generation unit 172, and an image processing unit 173. These are functional blocks formed by a processor included in the digital signal processing unit 17 executing a program.
  • The gain correction processing unit 171 performs gain correction processing that corrects the output signal of a correction target phase difference detection pixel (hereinafter referred to as a correction target pixel) included in the captured image signal by multiplying the output signal by a correction gain value.
  • the correction gain value generation unit 172 generates the correction gain value used by the gain correction processing unit 171 using a captured image signal obtained by imaging for live view image display.
  • the image processing unit 173 performs image processing on the captured image signal including the corrected output signal of the correction target pixel to generate captured image data, and records the captured image data in the recording medium 21.
  • the image processing unit 173 may record the corrected captured image signal as RAW data on the recording medium 21 as it is.
  • The correction gain value generation unit 172 divides the AF area 31 (the area targeted for focus detection, in which the pairs of phase difference detection pixels are arranged) on the light receiving surface 30 of the solid-state imaging device 3 into a plurality of blocks 32 (36 blocks in 6 rows × 6 columns in the example of FIG. 4), each including phase difference detection pixels 31R, phase difference detection pixels 31L, and imaging pixels 30.
  • In FIG. 4, each block row composed of blocks arranged in the row direction X is labeled L1, L2, ..., L6 on the left, and each block column composed of blocks arranged in the column direction Y is labeled C1, C2, ..., C6.
  • Each block 32 includes a plurality of imaging pixels 30 and a plurality of pairs of phase difference detection pixels.
  • The number of imaging pixels included in each block 32 is substantially the same, and the number of pairs of phase difference detection pixels included in each block 32 is also substantially the same, although this is not a limitation.
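  • As a concrete illustration of the block division, the sketch below partitions an AF area into a 6 × 6 grid of pixel-coordinate ranges; it is a minimal example written for this description, and the rounding of block boundaries to whole pixels is an assumption.

```python
import numpy as np

def divide_into_blocks(af_area_height, af_area_width, rows=6, cols=6):
    # Return the pixel-coordinate bounds of each block in a rows x cols grid
    # covering the AF area: {(r, c): (y0, y1, x0, x1)}.
    row_edges = np.linspace(0, af_area_height, rows + 1, dtype=int)
    col_edges = np.linspace(0, af_area_width, cols + 1, dtype=int)
    blocks = {}
    for r in range(rows):
        for c in range(cols):
            blocks[(r, c)] = (row_edges[r], row_edges[r + 1],
                              col_edges[c], col_edges[c + 1])
    return blocks
```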
  • FIG. 5 is a flowchart for explaining the correction gain value generation operation of the correction gain value generation unit 172. Numbers 1 to 36 are assigned in advance to the 36 blocks shown in FIG. 4.
  • For the i-th block, the correction gain value generation unit 172 extracts the output signal values of the pixels in the block and determines, in step S3, whether any extracted output signal value is at a level equal to or higher than a first threshold value or at a level equal to or lower than a second threshold value.
  • The first threshold value is the pixel saturation level, and the second threshold value is a small value on the order of dark noise.
  • That is, in step S3 it is determined whether there is a saturated pixel or a blacked-out pixel in the i-th block.
  • If the determination in step S3 is NO, the correction gain value generation unit 172 integrates, in step S6, the output signal values of all the phase difference detection pixels 31R in the i-th block, and, in step S7, integrates the output signal values of the imaging pixels 30 of the same detection color (G) around those phase difference detection pixels 31R.
  • In step S10, the correction gain value generation unit 172 calculates the ratio of the two integrated values calculated in steps S6 and S7 (the integrated value of the imaging pixels multiplied by a predetermined coefficient, divided by the integrated value of the phase difference detection pixels), generates this ratio as the correction gain value corresponding to the phase difference detection pixels 31R in the i-th block, and stores it in the main memory 16.
  • The predetermined coefficient is determined in consideration of the difference between the numbers of output signal values integrated for the imaging pixels and for the phase difference detection pixels.
  • Although the correction gain value is obtained here from the ratio of the integrated values, it may instead be obtained from the ratio between the average of the output signal values of the imaging pixels 30 around each phase difference detection pixel 31R and the average of the output signal values of all the phase difference detection pixels 31R in the i-th block.
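  • In code form, the ratio could look like the minimal sketch below (not from the patent). The default choice of coefficient, the ratio of the two pixel counts, is an assumption; with that choice the integration-ratio form reduces to the ratio of averages mentioned above.

```python
def correction_gain(pd_signals, imaging_signals, coefficient=None):
    # Correction gain as the ratio of integrated (summed) signal values.
    # The coefficient compensates for the different numbers of integrated
    # values; the default below is one plausible choice, not the patent's.
    if coefficient is None:
        coefficient = len(pd_signals) / len(imaging_signals)
    return coefficient * sum(imaging_signals) / sum(pd_signals)
```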
  • If the determination in step S3 is YES, the correction gain value generation unit 172 determines in step S4 whether the number of phase difference detection pixels 31R in the i-th block, excluding saturated or blacked-out pixels, is equal to or greater than a third threshold value.
  • If the determination in step S4 is YES, the correction gain value generation unit 172 determines in step S5 whether the number of pixels, excluding saturated or blacked-out pixels, among the imaging pixels 30 that are the target of integration (or averaging) when obtaining the correction gain value in the i-th block (the imaging pixels 30 around each phase difference detection pixel 31R) is equal to or greater than a fourth threshold value.
  • The minimum number of phase difference detection pixel output signals (the number of signal values to be integrated or averaged) and the minimum number of imaging pixel output signals needed to ensure sufficient reliability of the correction gain value obtained in step S10 can be determined in advance, and these values serve as the third threshold value and the fourth threshold value, respectively.
  • If the determination in step S4 is NO, or if the determination in step S5 is NO, the correction gain value generation unit 172 does not generate a correction gain value corresponding to the phase difference detection pixels 31R in the i-th block, and the process proceeds to step S11.
  • When the determination in step S5 is YES, the correction gain value generation unit 172 integrates, in step S8, the output signal values of the phase difference detection pixels 31R in the i-th block, excluding the saturated or blacked-out pixels.
  • In step S9, the correction gain value generation unit 172 integrates the output signal values of the G-color-detecting imaging pixels 30 in the i-th block, excluding saturated or blacked-out pixels.
  • The process then proceeds to step S10, and the correction gain value generation unit 172 generates a correction gain value using the two integrated values calculated in steps S8 and S9.
  • the correction gain value generation unit 172 performs the processing flow shown in FIG. 5 in the same manner for the phase difference detection pixel 31L. That is, in the description of FIG. 5, an operation is performed in which the “phase difference detection pixel 31R” is replaced with the “phase difference detection pixel 31L”.
  • The correction gain value is obtained separately for the phase difference detection pixels 31R and the phase difference detection pixels 31L because, even when a 31R pixel and a 31L pixel are at substantially the same position on the light receiving surface, there may be a large difference in sensitivity between them.
  • In this way, the correction gain value generation unit 172 attempts to generate a correction gain value corresponding to the phase difference detection pixels 31R (31L) for each block 32; for a block in which the determination in step S4 or step S5 is NO, no correction gain value corresponding to the phase difference detection pixels 31R (31L) is generated.
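  • The per-block decision can be sketched as follows. This is a minimal illustration written for this description: the saturation level, dark-noise level, and minimum pixel counts (corresponding to the first to fourth threshold values) are assumed numbers, and the ratio-of-averages variant is used.

```python
import numpy as np

SATURATION_LEVEL = 4095   # assumed 12-bit saturation level ("first threshold")
DARK_NOISE_LEVEL = 16     # assumed dark-noise floor ("second threshold")
MIN_PD_PIXELS = 8         # assumed minimum usable pd pixels ("third threshold")
MIN_IMAGING_PIXELS = 16   # assumed minimum usable imaging pixels ("fourth threshold")

def block_correction_gain(pd_values, imaging_values):
    # Return a correction gain for one block, or None ("NG") if the remaining
    # usable pixels are too few to give a reliable value.
    pd = np.asarray(pd_values, dtype=float)
    im = np.asarray(imaging_values, dtype=float)
    usable = lambda v: v[(v < SATURATION_LEVEL) & (v > DARK_NOISE_LEVEL)]
    pd_ok, im_ok = usable(pd), usable(im)
    if len(pd_ok) < MIN_PD_PIXELS or len(im_ok) < MIN_IMAGING_PIXELS:
        return None  # "NG" block: its gain is later filled in from other blocks
    return im_ok.mean() / pd_ok.mean()
```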
  • FIGS. 6 and 7 are diagrams for explaining the result of the process described with FIG. 5. They show the 36 blocks 32 shown in FIG. 4; "OK" is written in a block 32 for which a correction gain value has been generated and stored, and "NG" is written in a block 32 for which no correction gain value has been generated and stored.
  • For an "NG" block 32, the correction gain value generation unit 172 generates the correction gain value corresponding to that block 32 using correction gain values already generated for blocks 32 around it.
  • In the example of FIG. 6, the correction gain value for the "NG" block 32 in column C2 and row L4 is generated using the correction gain values generated for the other blocks 32 in the same block column (the column direction Y), namely the blocks 32 in column C2 and rows L1, L2, L3, L5, and L6.
  • Specifically, the average of the five correction gain values generated for these five blocks 32 is calculated, and this average value is set as the correction gain value of the block 32 in column C2 and row L4.
  • The sensitivity of the phase difference detection pixels 31R (or 31L) is substantially equal at any position in the column direction Y. Therefore, for a block 32 for which no correction gain value was generated, calculating the correction gain value from the correction gain values corresponding to the other blocks 32 in the block column that contains it improves the reliability of the resulting correction gain value.
  • Here, the average of the plurality of correction gain values obtained for the block column that includes the "NG" block 32 is used as the correction gain value of that block 32.
  • Alternatively, the correction gain value of the block 32 nearest to the "NG" block 32 in that block column may be used as the correction gain value of the "NG" block 32.
  • In the example of FIG. 7, no block 32 in block column C3 is "OK". In this case, the correction gain value generation unit 172 identifies the two block columns (block columns C2 and C4) that contain "OK" blocks 32 and are closest to block column C3 in the direction in which the phase difference detection pixels 31R and 31L detect the phase difference (the row direction X). Then, the correction gain value corresponding to each block 32 in block column C3 is generated using the correction gain values generated for the identified block columns.
  • Specifically, the correction gain value generation unit 172 calculates the average value Av1 of the correction gain values of the six blocks 32 belonging to block column C2 and the average value Av2 of the correction gain values of the six blocks 32 belonging to block column C4. It is known that a graph obtained by plotting the positions of columns C1 to C6 in the row direction X on the horizontal axis and the correction gain values obtained for columns C1 to C6 on the vertical axis has linearity. For this reason, the correction gain value generation unit 172 can obtain the correction gain value for column C3 by linear interpolation using the average value Av1 and the average value Av2.
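  • The fill-in of "NG" blocks can be sketched as below; this is a minimal illustration written for this description, assuming the per-block gains are held in a 6 × 6 array with NaN marking "NG" blocks and with block columns running along the direction orthogonal to the phase difference detection direction.

```python
import numpy as np

def fill_missing_gains(gains):
    # gains: 6x6 array of per-block gains, np.nan for "NG" blocks.
    gains = np.array(gains, dtype=float)
    col_means = np.nanmean(gains, axis=0)  # per-column averages (the Av values)
    # 1) "NG" block in a column that has "OK" blocks: use that column's average.
    for c in range(gains.shape[1]):
        if not np.isnan(col_means[c]):
            gains[np.isnan(gains[:, c]), c] = col_means[c]
    # 2) Entirely "NG" column: linear interpolation between the averages of the
    #    nearest columns on either side that do have gains.
    ok_cols = np.where(~np.isnan(col_means))[0]
    for c in np.where(np.isnan(col_means))[0]:
        left, right = ok_cols[ok_cols < c], ok_cols[ok_cols > c]
        if len(left) and len(right):
            l, r = left[-1], right[0]
            w = (c - l) / (r - l)
            gains[:, c] = (1 - w) * col_means[l] + w * col_means[r]
    return gains
```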
  • the correction gain value generation unit 172 generates correction gain values corresponding to the phase difference detection pixels 31R and the phase difference detection pixels 31L for all the blocks 32, respectively.
  • The gain correction processing unit 171 reads from the main memory 16 the gain value corresponding to the block to which the correction target pixel belongs and to the type of the correction target pixel, and performs gain correction processing by multiplying the output signal of the correction target pixel by that gain value.
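  • Applying the stored gains could look like the following minimal sketch (not from the patent); the data structures used to locate phase difference pixels and to look up the per-block, per-type gains are assumptions.

```python
def correct_phase_difference_pixels(image, pd_pixel_map, block_of, gain_table):
    # image: 2D numpy array of raw pixel values.
    # pd_pixel_map: {(y, x): "31R" or "31L"} for the phase difference pixels.
    # block_of: function mapping (y, x) to a block index.
    # gain_table: {(block_index, pixel_type): correction gain}.
    corrected = image.copy()
    for (y, x), pixel_type in pd_pixel_map.items():
        gain = gain_table[(block_of(y, x), pixel_type)]
        corrected[y, x] = image[y, x] * gain
    return corrected
```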
  • As described above, according to this digital camera, even for a block 32 for which a highly reliable correction gain value cannot be generated in step S10, a correction gain value with high reliability can be generated by using the correction gain values obtained for the other blocks 32. Therefore, a drop in the correction accuracy of the output signal of the phase difference detection pixel can be prevented, and the captured image quality can be improved.
  • Furthermore, a correction gain value can be generated using a captured image signal obtained by imaging with the solid-state imaging device 3. For this reason, it is not necessary to generate correction gain values in advance and store them in the camera body for every lens device that can be mounted on the digital camera, and the number of adjustment steps of the camera can be reduced.
  • Even for a lens device from which no detailed information other than the lens ID can be acquired, simply mounting the lens device, setting the camera to the shooting mode, and issuing an AE/AF instruction is enough to generate and store the correction gain values. For this reason, the imaging quality can be improved regardless of what lens device is mounted.
  • In the above description, the correction gain values are generated and stored when the AE/AF instruction is given, but the timing for generating the correction gain values is not limited to this. For example, the timing at which the shooting mode is set, the timing at which the lens device is replaced during the shooting mode, or the like may be used.
  • The correction gain value corresponding to each block 32 generated by the correction gain value generation unit 172 may be recorded in the main memory 16 of the digital camera in association with the identification information of the lens device attached to the digital camera when the value was obtained, or may be recorded on a recording medium in the lens device in association with the identification information of the digital camera. Alternatively, a correction gain value may be generated every time an AE/AF instruction is given.
  • As a modification, steps S8 and S9 may be performed when the determination in step S3 is YES.
  • In that case, the number of output signal values integrated in steps S6 and S7, or the number of output signal values integrated in steps S8 and S9, is stored in the main memory 16 in association with the correction gain value calculated in step S10.
  • The correction gain value generation unit 172 then determines a block 32 to be an "OK" block when the stored number of integrated phase difference detection pixel output signals and the stored number of integrated imaging pixel output signals associated with its correction gain value are each equal to or greater than the corresponding threshold value (the third and fourth threshold values, respectively), determines the other blocks 32 to be "NG" blocks, and generates a new correction gain value for each "NG" block by the method described above.
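  • A minimal sketch of this re-evaluation from the stored counts follows; the threshold defaults are assumed values, not the patent's.

```python
def classify_block(pd_count, imaging_count, min_pd=8, min_imaging=16):
    # pd_count / imaging_count: numbers of output signal values that were
    # integrated when the block's gain was generated (stored with the gain).
    return "OK" if pd_count >= min_pd and imaging_count >= min_imaging else "NG"
```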
  • the pixel arrangement of the solid-state imaging device 3 including the phase difference detection pixels and the imaging pixels is not limited to that shown in FIG. 2, and other known arrangements can be adopted.
  • Although the detection color of the phase difference detection pixels is green in the above example, the detection color may be red or blue.
  • The solid-state imaging device 3 may also be a solid-state imaging device for monochrome imaging; that is, the color filters may be omitted.
  • The digital signal processing unit 17 may correct the output signal of a phase difference detection pixel not only by the gain correction process but also in combination with an interpolation correction process that interpolates the output signal using the output signals of the surrounding imaging pixels 30 that detect the same color as the phase difference detection pixel.
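  • How the two methods are combined per pixel is not detailed in this excerpt; the sketch below simply switches between them with a flag, which is an assumed policy.

```python
def correct_pixel(pd_value, gain, neighbor_values, use_gain=True):
    # Choose between the two correction methods for one phase difference pixel.
    if use_gain and gain is not None:
        return pd_value * gain                           # gain correction
    return sum(neighbor_values) / len(neighbor_values)   # interpolation correction
```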
  • the processing performed by the digital signal processing unit 17 can be provided as a program for causing a computer to execute the processing.
  • Such a program is recorded on a non-transitory recording medium from which the program can be read by a computer.
  • Such “computer-readable recording medium” includes, for example, an optical medium such as a CD-ROM (Compact Disc-ROM), a magnetic recording medium such as a memory card, and the like. Such a program can also be provided by downloading via a network.
  • FIG. 8 shows an appearance of a smartphone 200 that is an embodiment of the photographing apparatus of the present invention.
  • The smartphone 200 shown in FIG. 8 includes a flat housing 201, and a display input unit 204, in which a display panel 202 as a display unit and an operation panel 203 as an input unit are integrated, is provided on one surface of the housing 201.
  • Such a housing 201 includes a speaker 205, a microphone 206, an operation unit 207, and a camera unit 208.
  • the configuration of the housing 201 is not limited thereto, and for example, a configuration in which the display unit and the input unit are independent can be employed, or a configuration having a folding structure and a slide mechanism can be employed.
  • FIG. 9 is a block diagram showing a configuration of the smartphone 200 shown in FIG.
  • the main components of the smartphone include a wireless communication unit 210, a display input unit 204, a call unit 211, an operation unit 207, a camera unit 208, a storage unit 212, and an external input / output unit. 213, a GPS (Global Positioning System) receiving unit 214, a motion sensor unit 215, a power supply unit 216, and a main control unit 220.
  • a wireless communication function for performing mobile wireless communication via a base station device BS (not shown) and a mobile communication network NW (not shown) is provided.
  • the wireless communication unit 210 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW according to an instruction from the main control unit 220. Using this wireless communication, transmission and reception of various file data such as audio data and image data, e-mail data, and reception of Web data and streaming data are performed.
  • The display input unit 204 displays images (still images and moving images), character information, and the like under the control of the main control unit 220 to visually convey information to the user, and detects user operations on the displayed information.
  • It is a so-called touch panel, and includes the display panel 202 and the operation panel 203.
  • the display panel 202 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device.
  • the operation panel 203 is a device that is placed so that an image displayed on the display surface of the display panel 202 is visible and detects one or more coordinates operated by a user's finger or stylus.
  • a detection signal generated due to the operation is output to the main control unit 220.
  • the main control unit 220 detects an operation position (coordinates) on the display panel 202 based on the received detection signal.
  • the display panel 202 and the operation panel 203 of the smartphone 200 exemplified as an embodiment of the photographing apparatus of the present invention integrally constitute a display input unit 204.
  • The operation panel 203 is arranged so as to completely cover the display panel 202.
  • the operation panel 203 may have a function of detecting a user operation even in an area outside the display panel 202.
  • In other words, the operation panel 203 may include a detection area for the portion overlapping the display panel 202 (hereinafter referred to as a display area) and a detection area for the outer edge portion not overlapping the display panel 202 (hereinafter referred to as a non-display area).
  • That is, the operation panel 203 may have two sensitive regions: the outer edge portion and the inner portion. The width of the outer edge portion is designed as appropriate according to the size of the housing 201 and the like.
  • Examples of the position detection method employed in the operation panel 203 include a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method, any of which can be adopted.
  • The call unit 211 includes the speaker 205 and the microphone 206. It converts the user's voice input through the microphone 206 into voice data that can be processed by the main control unit 220 and outputs the voice data to the main control unit 220, and it decodes audio data received by the wireless communication unit 210 or the external input/output unit 213 and outputs the result from the speaker 205. Further, as shown in FIG. 8, for example, the speaker 205 can be mounted on the same surface as the display input unit 204, and the microphone 206 can be mounted on the side surface of the housing 201.
  • the operation unit 207 is a hardware key using a key switch or the like, and receives an instruction from the user.
  • For example, the operation unit 207 is a push-button switch mounted on the side surface of the housing 201 of the smartphone 200, which turns on when pressed with a finger or the like and turns off by the restoring force of a spring or the like when the finger is released.
  • the storage unit 212 includes a control program and control data of the main control unit 220, application software, address data that associates the name and telephone number of a communication partner, transmitted / received e-mail data, Web data downloaded by Web browsing, The downloaded content data is stored, and streaming data and the like are temporarily stored.
  • the storage unit 212 includes an internal storage unit 217 built in the smartphone and an external storage unit 218 having a removable external memory slot.
  • Each of the internal storage unit 217 and the external storage unit 218 constituting the storage unit 212 is realized using a storage medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, MicroSD (registered trademark) memory), or a RAM (Random Access Memory) or ROM (Read Only Memory).
  • The external input/output unit 213 serves as an interface with all external devices connected to the smartphone 200, and connects directly or indirectly to other external devices by communication (for example, universal serial bus (USB), IEEE 1394, etc.) or via a network (for example, the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), Infrared Data Association (IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), or ZigBee (registered trademark)).
  • Examples of external devices connected to the smartphone 200 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card or SIM (Subscriber Identity Module) / UIM (User Identity Module) card connected via a card socket, external audio/video equipment connected via an audio/video I/O (Input/Output) terminal, wirelessly connected external audio/video equipment, and other devices connected by wire or wirelessly.
  • The external input/output unit 213 can transmit data received from such external devices to each component inside the smartphone 200, and can transmit data inside the smartphone 200 to the external devices.
  • The GPS receiving unit 214 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with instructions from the main control unit 220, executes positioning calculation processing based on the received GPS signals, and detects the position of the smartphone 200 expressed in latitude, longitude, and altitude.
  • When the GPS receiving unit 214 can acquire position information from the wireless communication unit 210 or the external input/output unit 213 (for example, over a wireless LAN), it can also detect the position using that position information.
  • the motion sensor unit 215 includes, for example, a three-axis acceleration sensor, and detects the physical movement of the smartphone 200 in accordance with an instruction from the main control unit 220. By detecting the physical movement of the smartphone 200, the moving direction and acceleration of the smartphone 200 are detected. The detection result is output to the main control unit 220.
  • the power supply unit 216 supplies power stored in a battery (not shown) to each unit of the smartphone 200 in accordance with an instruction from the main control unit 220.
  • the main control unit 220 includes a microprocessor, operates according to a control program and control data stored in the storage unit 212, and controls each unit of the smartphone 200 in an integrated manner.
  • the main control unit 220 includes a mobile communication control function that controls each unit of the communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 210.
  • the application processing function is realized by the main control unit 220 operating according to the application software stored in the storage unit 212.
  • Examples of the application processing function include an infrared communication function for controlling the external input/output unit 213 to perform data communication with a counterpart device, an e-mail function for transmitting and receiving e-mails, and a web browsing function for browsing web pages.
  • the main control unit 220 has an image processing function such as displaying video on the display input unit 204 based on image data (still image or moving image data) such as received data or downloaded streaming data.
  • the image processing function is a function in which the main control unit 220 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 204.
  • the main control unit 220 executes display control for the display panel 202 and operation detection control for detecting a user operation through the operation unit 207 and the operation panel 203.
  • the main control unit 220 displays an icon for starting application software, a software key such as a scroll bar, or a window for creating an e-mail.
  • the scroll bar refers to a software key for accepting an instruction to move the display portion of a large image that does not fit in the display area of the display panel 202.
  • By executing the operation detection control, the main control unit 220 detects a user operation through the operation unit 207, and accepts, through the operation panel 203, an operation on an icon, an input of a character string in the input field of a window, or a scroll request for the displayed image through a scroll bar.
  • Further, by executing the operation detection control, the main control unit 220 determines whether the operation position on the operation panel 203 is in the portion overlapping the display panel 202 (display area) or in the outer edge portion not overlapping the display panel 202 (non-display area), and has a touch panel control function for controlling the sensitive area of the operation panel 203 and the display position of software keys.
  • the main control unit 220 can also detect a gesture operation on the operation panel 203 and execute a preset function in accordance with the detected gesture operation.
  • A gesture operation is not a conventional simple touch operation but an operation of drawing a trajectory with a finger or the like, designating a plurality of positions at the same time, or a combination of these, such as drawing a trajectory from at least one of a plurality of positions.
  • The camera unit 208 includes the configurations other than the external memory control unit 20, the recording medium 21, the display control unit 22, the display unit 23, and the operation unit 14 in the digital camera shown in FIG. 1.
  • The captured image data generated by the camera unit 208 can be recorded in the storage unit 212 or output through the external input/output unit 213 or the wireless communication unit 210.
  • In the smartphone 200 shown in FIG. 8, the camera unit 208 is mounted on the same surface as the display input unit 204, but the mounting position of the camera unit 208 is not limited thereto; it may be mounted on the back surface of the display input unit 204.
  • the camera unit 208 can be used for various functions of the smartphone 200.
  • an image acquired by the camera unit 208 can be displayed on the display panel 202, or the image of the camera unit 208 can be used as one of operation inputs of the operation panel 203.
  • When the GPS receiving unit 214 detects a position, the position can also be detected with reference to an image from the camera unit 208.
  • Furthermore, by referring to an image from the camera unit 208, the optical axis direction of the camera unit 208 of the smartphone 200 and the current usage environment can be determined, either without using the triaxial acceleration sensor or in combination with the triaxial acceleration sensor.
  • the image from the camera unit 208 can also be used in the application software.
  • Position information acquired by the GPS receiving unit 214, voice information acquired by the microphone 206 (which may be converted into text information by the main control unit or the like), posture information acquired by the motion sensor unit 215, and the like can be added to the image data of a still image or a moving image, recorded in the storage unit 212, or output through the external input/output unit 213 or the wireless communication unit 210.
  • the digital signal processing unit 17 performs the above-described signal processing, thereby enabling high-quality shooting.
  • The disclosed imaging device is an imaging device having a solid-state imaging device that images a subject through an imaging optical system including a focus lens. The solid-state imaging device includes a plurality of imaging pixels arranged in a two-dimensional array on a light receiving surface and a plurality of types of phase difference detection pixels having different structures. The imaging device comprises: a gain correction processing unit that corrects the output signal value of a phase difference detection pixel included in the captured image signal output from the solid-state imaging device by multiplying the output signal value by a correction gain value corresponding to that phase difference detection pixel; and a correction gain value generation unit that divides the area of the light receiving surface in which the phase difference detection pixels are arranged into a plurality of blocks, each including the plurality of types of phase difference detection pixels and the imaging pixels, and generates, for each group of phase difference detection pixels of the same type belonging to each block, a correction gain value corresponding to that group using a captured image signal obtained by imaging a subject with the solid-state imaging device.
  • In the disclosed imaging device, the correction gain value generation unit generates, for a first block, the correction gain value corresponding to the group using the output signal values in a predetermined range among the output signal values of the group belonging to the first block and of the imaging pixels around the group that detect the same color as the group. Here, a first block is a block in which the number of pixels whose output signal values are in the predetermined range among the phase difference detection pixels of the same type belonging to the block is equal to or greater than a first threshold value, and in which the number of pixels whose output signal values are in the predetermined range among the imaging pixels around those phase difference detection pixels is equal to or greater than a second threshold value.
  • In the disclosed imaging device, for a second block, which is a block in which the number of pixels whose output signal values are in the predetermined range among the phase difference detection pixels of the same type belonging to the block is less than the first threshold value, or in which the number of pixels whose output signal values are in the predetermined range among the imaging pixels around those phase difference detection pixels is less than the second threshold value, the correction gain value generation unit generates the correction gain value corresponding to the group of the second block using the correction gain value corresponding to the group generated for a first block around the second block.
  • In the disclosed imaging device, the correction gain value generation unit generates the correction gain value corresponding to the group of the second block using the correction gain value corresponding to the group generated for a first block located, with respect to the second block, in the direction orthogonal to the direction in which the phase difference detection pixels detect the phase difference.
  • In the disclosed imaging device, the correction gain value generation unit generates, as the correction gain value corresponding to the group of the second block, the average value of the correction gain values corresponding to the group generated for all the first blocks located, with respect to the second block, in the direction orthogonal to the direction in which the phase difference detection pixels detect the phase difference.
  • In the disclosed imaging device, the correction gain value generation unit generates, as the correction gain value corresponding to the group of the second block, the average value of the correction gain values corresponding to the group generated for the two first blocks closest to the second block in the direction orthogonal to the direction in which the phase difference detection pixels detect the phase difference.
  • In the disclosed imaging device, the plurality of blocks are arranged such that three or more block sequences, each consisting of a plurality of blocks arranged in the direction orthogonal to the direction in which the plurality of types of phase difference detection pixels detect the phase difference, are arranged in the direction in which the phase difference is detected. When there is a specific block sequence in which all the blocks are the second block, the correction gain value generation unit generates the correction gain value corresponding to the group of each second block in the specific block sequence using the correction gain values corresponding to the group generated for each first block in the block sequences closest to the specific block sequence in the direction in which the phase difference is detected.
  • the disclosed imaging apparatus includes a mount mechanism that can attach and detach the imaging optical system.
  • The disclosed signal processing method is a signal processing method for processing a captured image signal output from a solid-state imaging device that images a subject through an imaging optical system including a focus lens, the solid-state imaging device including a plurality of imaging pixels arranged in a two-dimensional array on a light receiving surface and a plurality of types of phase difference detection pixels having different structures. The method comprises: a gain correction processing step of correcting the output signal value of a phase difference detection pixel included in the captured image signal output from the solid-state imaging device by multiplying the output signal value by a correction gain value corresponding to that phase difference detection pixel; and a correction gain value generation step of dividing the area of the light receiving surface in which the phase difference detection pixels are arranged into a plurality of blocks, each including the plurality of types of phase difference detection pixels and the imaging pixels, and generating, for each group of phase difference detection pixels of the same type belonging to each block, a correction gain value corresponding to that group using a captured image signal obtained by imaging a subject with the solid-state imaging device.
  • The disclosed signal processing program is a signal processing program for processing a captured image signal output from a solid-state imaging device that images a subject through an imaging optical system including a focus lens, the solid-state imaging device including a plurality of imaging pixels arranged in a two-dimensional array on a light receiving surface and a plurality of types of phase difference detection pixels having different structures. The program causes a computer to execute: a gain correction processing step of correcting the output signal value of a phase difference detection pixel included in the captured image signal output from the solid-state imaging device by multiplying the output signal value by a correction gain value corresponding to that phase difference detection pixel; and a correction gain value generation step of dividing the area of the light receiving surface in which the phase difference detection pixels are arranged into a plurality of blocks, each including the plurality of types of phase difference detection pixels and the imaging pixels, and generating, for each group of phase difference detection pixels of the same type belonging to each block, a correction gain value corresponding to that group using a captured image signal obtained by imaging a subject with the solid-state imaging device.
  • the present invention is particularly convenient and effective when applied to a digital camera or the like.
  • 3 Solid-State Image Sensor 16 Main Memory 17 Digital Signal Processing Unit 30 Imaging Pixels 31R, 31L Phase Difference Detection Pixel 30 Light Receiving Surface 31 AF Area 32 Block

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Focusing (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

The present invention relates to: an imaging device with improved correction accuracy for the output signal of a pixel for phase difference detection; a signal processing method; and a signal processing program. A digital signal processing unit (17), which processes a captured image signal produced by a solid-state imaging element (3) containing imaging pixels (30) and phase difference detection pixels (31R, 31L), divides an AF area (31) into 36 blocks and determines the correction gain value for each phase difference detection pixel (31R, 31L) for each block (32). Among the 36 blocks (32), for the block (32) having the lowest reliability in the determined correction gain value, a correction gain value is generated using the correction gain value of a block (32), located near the block (32) with the lowest reliability, for which a correction gain value with the highest reliability was obtained.
PCT/JP2013/080992 2012-12-17 2013-11-18 Imaging device, signal processing method, and signal processing program WO2014097792A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012274684A JP2016029750A (ja) 2012-12-17 2012-12-17 Imaging device, signal processing method, and signal processing program
JP2012-274684 2012-12-17

Publications (1)

Publication Number Publication Date
WO2014097792A1 true WO2014097792A1 (fr) 2014-06-26

Family

ID=50978136

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/080992 WO2014097792A1 (fr) 2012-12-17 2013-11-18 Imaging device, signal processing method, and signal processing program

Country Status (2)

Country Link
JP (1) JP2016029750A (fr)
WO (1) WO2014097792A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106101556A (zh) * 2016-07-29 2016-11-09 广东欧珀移动通信有限公司 Image synthesis method and device for mobile terminal, and mobile terminal
US10554877B2 (en) 2016-07-29 2020-02-04 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image synthesis method and apparatus for mobile terminal, and mobile terminal

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2017170724A1 (ja) * 2016-03-31 2019-02-14 株式会社ニコン Imaging device, lens adjustment device, and electronic apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002131623A (ja) * 2000-10-24 2002-05-09 Canon Inc Imaging device and imaging system
JP2010062640A (ja) * 2008-09-01 2010-03-18 Canon Inc Imaging device, control method for imaging device, and program
JP2011244288A (ja) * 2010-05-19 2011-12-01 Fujifilm Corp Imaging device and method for correcting captured image signal
WO2012128154A1 (fr) * 2011-03-24 2012-09-27 富士フイルム株式会社 Color imaging element, imaging device, and imaging device control program

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106101556A (zh) * 2016-07-29 2016-11-09 广东欧珀移动通信有限公司 Image synthesis method and device for mobile terminal, and mobile terminal
US10554877B2 (en) 2016-07-29 2020-02-04 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image synthesis method and apparatus for mobile terminal, and mobile terminal
US10848678B2 (en) 2016-07-29 2020-11-24 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image synthesis method and apparatus for mobile terminal, and mobile terminal

Also Published As

Publication number Publication date
JP2016029750A (ja) 2016-03-03

Similar Documents

Publication Publication Date Title
JP5775977B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP5657182B2 (ja) Imaging device and signal correction method
JP5690974B2 (ja) Imaging device and focus control method
JP5542249B2 (ja) Imaging element, and imaging device and imaging method using the same
EP2903258B1 (fr) Image processing device and method, and image capturing device
JP5799178B2 (ja) Imaging device and focus control method
JP5982601B2 (ja) Imaging device and focus control method
US9743031B2 (en) Imaging device and imaging method
JP5802846B2 (ja) Imaging device
JP5872122B2 (ja) Imaging device and focus control method
JP5677625B2 (ja) Signal processing device, imaging device, and signal correction method
CN110447223B (zh) Imaging device
JP5982600B2 (ja) Imaging device and focus control method
JP5990665B2 (ja) Imaging device and focus control method
JP5768193B2 (ja) Image processing device, imaging device, image processing method, and image processing program
WO2013145821A1 (fr) Imaging element and imaging device
WO2014097792A1 (fr) Imaging device, signal processing method, and signal processing program
WO2013183381A1 (fr) Image capturing device and image capturing method
JP5798696B2 (ja) Image processing device, method, recording medium, program, and imaging device
JP5680803B2 (ja) Image processing device and method, and imaging device
WO2013145887A1 (fr) Imaging device and imaging method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13866079

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13866079

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP