WO2014141561A1 - Imaging device, signal processing method, and signal processing program - Google Patents
- Publication number
- WO2014141561A1 (PCT/JP2013/084314)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lens
- phase difference
- difference detection
- correction
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/34—Systems for automatic generation of focusing signals using different areas in a pupil plane
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4007—Interpolation-based scaling, e.g. bilinear interpolation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
- G03B17/12—Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets
- G03B17/14—Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets interchangeably
Definitions
- the present invention relates to an imaging device, a signal processing method, and a signal processing program.
- In this specification, an information device having an imaging function as described above is referred to as an imaging device.
- Some of these imaging devices have a phase difference AF (Auto Focus) function.
- Since the phase difference AF method can detect the in-focus position faster and with higher accuracy than the contrast AF method, it is widely adopted in various imaging apparatuses.
- Pairs of phase difference detection pixels whose light shielding film openings are decentered in opposite directions, provided discretely over the entire light receiving surface, are used for this purpose (see Patent Documents 1 to 4).
- Since the area of the light shielding film opening of such a phase difference detection pixel is smaller than that of the other normal pixels (imaging pixels), its output signal is insufficient to be used as a captured image signal as-is. Therefore, the output signal of the phase difference detection pixel needs to be corrected.
- Patent Documents 1 to 4 disclose imaging apparatuses that correct the output signal of a phase difference detection pixel either by an interpolation correction process, which interpolates and generates the signal from the output signals of the surrounding normal pixels, or by a gain correction process, which amplifies the output signal of the phase difference detection pixel by a gain.
- Patent Document 5 describes that in a lens-interchangeable camera, processing of interpolating and generating an output signal of a pixel for phase difference detection using an output signal of a normal pixel around it is performed.
- Patent Document 6 describes a camera in which a threshold value for determining whether a pixel of a solid-state image sensor is a defective pixel is variable using lens information acquired from a lens device.
- However, the output of the phase difference detection pixel differs depending on the combination of the imaging device incorporated in the camera and the lens mounted on the camera.
- This is because the angle of the light beam incident on the imaging device differs from lens to lens, and the amount of light incident on the phase difference detection pixel for a given light beam angle changes in a complex manner depending on the shape and position of the pixel's light shielding film.
- Patent Documents 1 to 6 give no consideration to how the output signal of the phase difference detection pixel should be corrected in an interchangeable lens camera when an interchangeable lens that does not store a correction gain value is attached.
- The present invention has been made in view of the above circumstances, and aims to provide an imaging apparatus capable of correcting the output signal of a phase difference detection pixel at high speed and with high accuracy no matter which lens is mounted.
- An imaging apparatus of the present invention is an imaging apparatus to which a lens device can be attached and detached, and includes: an imaging element that has a plurality of imaging pixels and a plurality of phase difference detection pixels arranged in a two-dimensional array on its light receiving surface and that images a subject through the lens device; a communication unit for communicating with the mounted lens device; a lens information acquisition unit that acquires lens information, which is information specific to the lens device, from the lens device via the communication unit; a gain correction processing unit that performs a gain correction process of correcting the output signal of a phase difference detection pixel in the captured image signal obtained by imaging the subject with the imaging element, by multiplying that output signal by a gain value; an interpolation correction processing unit that performs an interpolation correction process of correcting the output signal of a phase difference detection pixel in the captured image signal by replacing it with a signal generated using the output signals of surrounding imaging pixels that detect the same color as that phase difference detection pixel; a correction method selection unit that, according to the lens information acquired by the lens information acquisition unit, selects one of a first correction method that corrects the output signals of all the phase difference detection pixels in the captured image signal by the interpolation correction processing unit, a second correction method that corrects the output signals of all the phase difference detection pixels in the captured image signal by the gain correction processing unit, and a third correction method that corrects the output signal of each phase difference detection pixel in the captured image signal by either the interpolation correction processing unit or the gain correction processing unit; and an image processing unit that corrects the output signals of the phase difference detection pixels in the captured image signal by the method selected by the correction method selection unit.
- A signal processing method of the present invention is a signal processing method performed by an imaging device to which a lens device can be attached and detached, the imaging device including an imaging element that has a plurality of imaging pixels and a plurality of phase difference detection pixels arranged in a two-dimensional array on its light receiving surface and that images a subject through the lens device, and a communication unit for communicating with the mounted lens device. The method includes a lens information acquisition step of acquiring lens information, which is information specific to the lens device, from the lens device via the communication unit, and, according to the lens information acquired in the lens information acquisition step, selects how the output signals of the phase difference detection pixels in the captured image signal are corrected, including an interpolation correction in which the output signal of a phase difference detection pixel is replaced by a signal generated using the output signals of surrounding imaging pixels that detect the same color as that phase difference detection pixel.
- A signal processing program of the present invention is a program for causing a computer to execute each step of the above signal processing method.
- According to the present invention, it is possible to provide a lens-interchangeable imaging device capable of correcting the output signal of the phase difference detection pixel at high speed and with high accuracy regardless of what kind of lens is mounted.
- FIG. 1 is a diagram showing the schematic configuration of a digital camera as an example of an imaging device for describing one embodiment of the present invention.
- A functional block diagram of the digital signal processing unit 17 in the digital camera shown in FIG. 1, and a flowchart for explaining the operation of the digital camera shown in FIG. 1.
- A diagram showing the sensitivity ratio of the phase difference detection pixels 51R and 51L at each position in the row direction X (horizontal pixel position) of the solid-state imaging device 5, and a diagram for explaining how this sensitivity ratio arises.
- A diagram for explaining the incident light ray angle at an arbitrary position in the row direction X of the solid-state imaging device 5, and a diagram showing an example of data stored in the memory 60 of the lens device 100.
- A flowchart for explaining the operation of the digital signal processing unit 17 shown in FIG. 5, a diagram for explaining a smartphone as an imaging device, and an internal block diagram of the smartphone of FIG. 13.
- FIG. 1 is a view showing a schematic configuration of a digital camera as an example of an imaging apparatus for describing an embodiment of the present invention.
- the digital camera shown in FIG. 1 includes a lens apparatus 100 as an imaging optical system, and a camera body 200 provided with a mounting mechanism (not shown) on which the lens apparatus 100 is mounted.
- the lens device 100 is detachable from the camera body 200, and can be replaced with another one.
- the lens device 100 includes a photographing lens 10 including a focus lens and a zoom lens, a diaphragm 20, a lens drive unit 30, a diaphragm drive unit 40, a lens control unit 50 for overall control of the entire lens device 100, and a memory 60. And an electrical contact 70.
- The focus lens here is a lens that adjusts the focal position of the imaging optical system by moving in the optical axis direction.
- That is, the focus lens refers to the lens that adjusts the focal position within a lens unit composed of a plurality of lenses; in the case of a lens whose entire group moves, it refers to the whole lens group.
- The lens drive unit 30 adjusts the position of the focus lens included in the imaging lens 10, or the position of the zoom lens included in the imaging lens 10, in accordance with commands from the lens control unit 50.
- the diaphragm drive unit 40 adjusts the exposure amount by controlling the opening amount of the diaphragm 20 in accordance with a command from the lens control unit 50.
- the memory 60 stores lens information which is information unique to the lens apparatus 100.
- the lens information includes at least a lens ID as identification information for identifying the lens device 100.
- the electrical contact 70 is an interface for communicating between the lens apparatus 100 and the camera body 200.
- the electrical contact 70 contacts the electrical contact 9 provided on the camera body 200 in a state where the lens device 100 is mounted on the camera body 200.
- the electrical contact 9 functions as a communication unit for communicating with the attached lens apparatus 100.
- The camera body 200 includes a solid-state imaging device 5 of a CCD type, CMOS type, or the like that images a subject through the lens device 100, an analog signal processing unit 6 connected to the output of the solid-state imaging device 5 and performing analog signal processing such as correlated double sampling, and an A/D conversion circuit 7 that converts the analog signal output from the analog signal processing unit 6 into a digital signal.
- the analog signal processing unit 6 and the A / D conversion circuit 7 are controlled by the system control unit 11.
- the analog signal processing unit 6 and the A / D conversion circuit 7 may be incorporated in the solid-state imaging device 5.
- the system control unit 11 drives the solid-state imaging device 5 via the imaging device driving unit 8 and outputs a subject image captured through the imaging lens 10 as a captured image signal.
- An instruction signal from the user is input to the system control unit 11 through the operation unit 14.
- The electric control system of the digital camera further includes: a main memory 16; a memory control unit 15 connected to the main memory 16; a digital signal processing unit 17 that performs interpolation calculation, gamma correction calculation, RGB/YC conversion processing, and the like on the captured image signal output from the A/D conversion circuit 7 to generate captured image data; a compression/expansion processing unit 18 that compresses the captured image data generated by the digital signal processing unit 17 into JPEG format or expands the compressed image data; a defocus amount calculation unit 19 for calculating a defocus amount; an external memory control unit 20 to which a detachable recording medium 21 is connected; and a display control unit 22 to which a display unit 23 mounted on the back of the camera or the like is connected.
- The memory control unit 15, the digital signal processing unit 17, the compression/expansion processing unit 18, the defocus amount calculation unit 19, the external memory control unit 20, and the display control unit 22 are mutually connected by a control bus 24 and a data bus 25, and are controlled by commands from the system control unit 11.
- FIG. 2 is a partially enlarged view showing a planar configuration of the solid-state imaging device 5 mounted on the digital camera shown in FIG.
- The solid-state imaging device 5 includes a large number of pixels 51 (each square block in the figure) arranged two-dimensionally in the row direction X and the column direction Y orthogonal to it. Although not all of the pixels 51 are shown in FIG. 2, in practice several million to several tens of millions of pixels 51 are arranged two-dimensionally.
- an output signal is obtained from each of the large number of pixels 51.
- a set of this large number of output signals is referred to herein as a captured image signal.
- Each pixel 51 includes a photoelectric conversion unit such as a photodiode and a color filter formed above the photoelectric conversion unit.
- In FIG. 2, a pixel 51 including a color filter that transmits red light is marked with the letter "R", a pixel 51 including a color filter that transmits green light is marked with the letter "G", and a pixel 51 including a color filter that transmits blue light is marked with the letter "B".
- The large number of pixels 51 form an array in which a plurality of pixel rows, each consisting of a plurality of pixels 51 lined up in the row direction X, are arranged in the column direction Y. The pixel rows in the odd rows and the pixel rows in the even rows are shifted in the row direction X by approximately 1/2 of the arrangement pitch of the pixels 51 in each pixel row.
- the arrangement of the color filters included in each pixel 51 of the odd-numbered pixel rows is a Bayer arrangement as a whole. Further, the arrangement of the color filters included in each pixel 51 of the even-numbered pixel rows is also a Bayer arrangement as a whole.
- the pixel 51 in the odd-numbered row and the pixel 51 adjacent to the lower right with respect to the pixel 51 and detecting the same color light constitute a paired pixel.
- By adding the output signals of the two pixels 51 forming a pair pixel, the sensitivity of the camera can be enhanced; alternatively, by changing the exposure times of the two pixels 51 forming a pair pixel and adding their output signals, a wide dynamic range can be achieved.
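The two uses of pair-pixel addition described above can be sketched as follows. This is a minimal illustration only; the function names, the array representation of pixel outputs, and the linear exposure-ratio scaling are assumptions for the sketch, not part of the patent.

```python
import numpy as np

def add_pair_pixels(signal_a, signal_b):
    """Add the output signals of the two pixels forming a pair pixel,
    raising the effective sensitivity (illustrative names)."""
    return np.asarray(signal_a) + np.asarray(signal_b)

def combine_dual_exposure(short_exp, long_exp, exposure_ratio):
    """Combine pair-pixel outputs captured with different exposure times
    into one wide-dynamic-range value: the short exposure is scaled up
    by the assumed linear exposure ratio before addition."""
    return np.asarray(long_exp) + np.asarray(short_exp) * exposure_ratio
```

For example, with an exposure ratio of 4, a short-exposure reading of 10 and a long-exposure reading of 80 combine to 120.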
- a part of the large number of pixels 51 is a phase difference detection pixel.
- the phase difference detection pixels include a plurality of phase difference detection pixels 51R and a plurality of phase difference detection pixels 51L.
- The plurality of phase difference detection pixels 51R receive one of a pair of light fluxes that have passed through different portions of the pupil region of the photographing lens 1 (for example, the light flux passing through the right half of the pupil region) and output a signal corresponding to the amount of light received. That is, the plurality of phase difference detection pixels 51R provided in the solid-state imaging device 5 capture the image formed by one of the pair of light fluxes.
- The plurality of phase difference detection pixels 51L receive the other of the pair of light fluxes (for example, the light flux passing through the left half of the pupil region) and output a signal corresponding to the amount of light received. That is, the plurality of phase difference detection pixels 51L provided in the solid-state imaging device 5 capture the image formed by the other of the pair of light fluxes.
- The plurality of pixels 51 other than the phase difference detection pixels 51R and 51L (hereinafter referred to as imaging pixels) capture the image formed by the light flux that has passed through almost the entire pupil region of the photographing lens 1.
- a light shielding film is provided above the photoelectric conversion unit of the pixel 51, and an opening for defining a light receiving area of the photoelectric conversion unit is formed in the light shielding film.
- the center of the opening (indicated by symbol a in FIG. 2) of the imaging pixel 51 coincides with the center of the photoelectric conversion unit of the imaging pixel 51 (the center of the square block). Note that, in FIG. 2, in order to simplify the drawing, the opening a is illustrated in only a part of the imaging pixel 51.
- the center (indicated by symbol c in FIG. 2) of the phase difference detection pixel 51R is decentered to the right with respect to the center of the photoelectric conversion unit of the phase difference detection pixel 51R.
- the center (indicated by symbol b in FIG. 2) of the phase difference detection pixel 51L is decentered to the left with respect to the center of the photoelectric conversion unit of the phase difference detection pixel 51L.
- In FIG. 2, a part of the pixels 51 on which a green color filter is mounted serve as the phase difference detection pixels 51R or the phase difference detection pixels 51L.
- pixels on which color filters of other colors are mounted may be used as phase difference detection pixels.
- Pairs of a phase difference detection pixel 51R and a phase difference detection pixel 51L disposed in its vicinity (hereinafter referred to as phase difference pairs) are arranged discretely and periodically on the light receiving surface 53 on which the pixels 51 are disposed.
- Here, two adjacent pixels refer to two pixels close enough to be regarded as receiving light from substantially the same subject portion. Since the phase difference detection pixel 51R and the phase difference detection pixel 51L forming a phase difference pair are close to each other, they are treated as having the same position in the row direction X (hereinafter also referred to as the horizontal pixel position).
- The phase difference detection pixels 51R are arranged at intervals of three pixels in the row direction X in a part of the even-numbered pixel rows (in the example of FIG. 2, four pixel rows arranged every three pixel rows).
- The phase difference detection pixels 51L are arranged in the row direction X at the same cycle as the phase difference detection pixels 51R, in a part of the odd-numbered pixel rows (the pixel rows next to the pixel rows including the phase difference detection pixels 51R).
- With this configuration, the light received by the phase difference detection pixel 51L through the opening b of the light shielding film is mainly light coming from the left side of the photographing lens 1 provided above the sheet of FIG. 2 as viewed from the subject, that is, light from the direction in which the subject is viewed with the right eye.
- The light received by the phase difference detection pixel 51R through the opening c of the light shielding film is mainly light coming from the right side of the photographing lens 1 as viewed from the subject, that is, light from the direction in which the subject is viewed with the left eye.
- With all the phase difference detection pixels 51R, a captured image signal as if the subject were viewed with the left eye can be obtained, and with all the phase difference detection pixels 51L, a captured image signal as if the subject were viewed with the right eye can be obtained. Therefore, by combining the two, stereoscopic image data of the subject can be generated, and phase difference information can be generated by performing a correlation calculation between the two.
- Because the openings of their light shielding films are decentered in opposite directions, the phase difference detection pixel 51R and the phase difference detection pixel 51L can respectively receive the light fluxes passing through different portions of the pupil region of the photographing lens 1, and phase difference information can thereby be obtained.
- the structure for obtaining phase difference information is not limited to this, and a well-known one can be adopted.
- FIG. 3 is a functional block diagram of the digital signal processing unit 17 in the digital camera shown in FIG.
- the digital signal processing unit 17 includes a gain correction processing unit 171, an interpolation correction processing unit 172, a lens information acquisition unit 173, a correction method selection unit 174, and an image processing unit 175. These are functional blocks formed by the processor included in the digital signal processing unit 17 executing a program.
- the gain correction processing unit 171 performs a gain correction process of correcting an output signal of a phase difference detection pixel to be corrected (hereinafter referred to as a correction target pixel) included in a captured image signal by multiplying the output signal by a gain value.
- This gain value can be stored in advance in the memory of the camera body 200 when the lens device 100 mounted on the camera body 200 is a genuine one of the maker of the camera body 200.
- the gain value can be obtained from a captured image signal obtained by capturing a reference image in an adjustment process before shipping of the digital camera.
- the gain value for each of the phase difference detection pixels 51 for the genuine lens device 100 is stored in the main memory 16 of the camera body 200 in association with the lens ID for identifying the lens device 100.
- the gain value may be generated and stored for each of the phase difference detection pixels 51, or the light receiving surface of the solid-state imaging device 5 is divided into blocks, and one gain value is generated and stored for each block. It is also good.
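The gain correction process described above, with a per-pixel gain table, can be sketched as follows. The array layout, the coordinate list for the phase difference detection pixels, and the function names are assumptions for illustration; a per-block table would simply index gains by block instead of by pixel.

```python
import numpy as np

def gain_correct(captured, pd_positions, gain_table):
    """Multiply the output signal of each phase difference detection
    pixel (given by its (row, col) position) by its stored gain value,
    leaving the imaging pixels untouched (illustrative sketch)."""
    corrected = np.array(captured, dtype=float)
    for (row, col), gain in zip(pd_positions, gain_table):
        corrected[row, col] *= gain
    return corrected
```

For example, a phase difference detection pixel reading 50 with a stored gain of 2.0 is corrected to 100, matching the level of the surrounding imaging pixels.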
- The interpolation correction processing unit 172 corrects the output signal of the correction target pixel by replacing it with a signal generated using the output signals of the imaging pixels that detect the same color as the correction target pixel in its periphery.
- For example, the output signal value of the correction target pixel is replaced with the average value of the output signals of the surrounding imaging pixels that detect G-color light. The interpolation correction processing unit 172 may instead perform correction by replacing the output signal of the correction target pixel with a copy of the output signal of any one of the surrounding imaging pixels.
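The averaging form of the interpolation correction process can be sketched as follows. The neighbor map passed in here is an assumption for illustration; in an actual device the same-color neighbors would be determined by the color filter layout of FIG. 2.

```python
import numpy as np

def interpolation_correct(captured, pd_positions, same_color_neighbors):
    """Replace each correction target pixel's output signal with the
    average of the surrounding imaging pixels that detect the same
    color (illustrative sketch; neighbor positions are supplied)."""
    corrected = np.array(captured, dtype=float)
    for pos in pd_positions:
        neighbors = same_color_neighbors[pos]
        corrected[pos] = np.mean([captured[r][c] for r, c in neighbors])
    return corrected
```

Replacing with a copy of a single neighboring imaging pixel, as the text also allows, would amount to a one-element neighbor list.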
- the lens information acquisition unit 173 acquires lens information stored in the memory 60 of the lens device 100 from the lens device 100 mounted on the camera body 200.
- The correction method selection unit 174 selects, according to the lens information acquired by the lens information acquisition unit 173, either the first correction method, in which the interpolation correction processing unit 172 corrects the output signals of all the phase difference detection pixels in the captured image signal output from the solid-state imaging device 5, or the second correction method, in which the gain correction processing unit 171 corrects the output signals of all the phase difference detection pixels in that captured image signal.
- The image processing unit 175 corrects the output signals of the phase difference detection pixels in the captured image signal output from the solid-state imaging device 5 according to the method selected by the correction method selection unit 174, and records the corrected captured image signal in the main memory 16. Then, the image processing unit 175 performs known image processing such as demosaicing processing, γ correction processing, and white balance adjustment processing on the recorded captured image signal to generate captured image data, and records this on the recording medium 21.
- the image processing unit 175 may record the corrected captured image signal as the RAW data on the recording medium 21 as it is.
- FIG. 4 is a flowchart for explaining the operation of the digital camera shown in FIG.
- the system control unit 11 of the camera body 200 detects that the lens device 100 is attached through the electrical contact 9.
- the system control unit 11 sends a transmission request for lens information to the lens device 100 through the electrical contact 9 (step S1).
- the lens control unit 50 of the lens device 100 transmits the lens information stored in the memory 60 to the camera body 200 via the electrical contact 70.
- the system control unit 11 receives lens information sent from the lens apparatus 100 and temporarily stores the lens information in the main memory 16.
- The digital signal processing unit 17 acquires the lens information stored in the main memory 16 (step S2), and searches the main memory 16 for correction gain value data associated with the lens ID included in the lens information (step S3).
- If the correction gain value data is found, the digital signal processing unit 17 selects the second correction method, in which the gain correction processing unit 171 corrects the output signals of all the phase difference detection pixels in the captured image signal output from the solid-state imaging device 5 (step S4).
- If the correction gain value data is not found, the digital signal processing unit 17 selects the first correction method, in which the interpolation correction processing unit 172 corrects the output signals of all the phase difference detection pixels in the captured image signal output from the solid-state imaging device 5 (step S5).
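The selection made in steps S3 through S5 reduces to a lookup keyed by lens ID. The sketch below illustrates this; the dictionary representation of the camera body's stored gain tables and the field name `lens_id` are assumptions, not part of the patent.

```python
def select_correction_method(lens_info, body_gain_tables):
    """Sketch of steps S3-S5: search for a correction gain table keyed
    by the lens ID from the acquired lens information; if one exists,
    select gain correction for all phase difference detection pixels,
    otherwise select interpolation correction (illustrative names)."""
    lens_id = lens_info["lens_id"]
    if lens_id in body_gain_tables:       # step S3: search succeeded
        return "gain_correction"          # step S4: second correction method
    return "interpolation_correction"     # step S5: first correction method
```

A genuine lens whose ID is registered thus takes the gain-correction path, and any other lens falls back to interpolation.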
- Thereafter, the camera enters the imaging standby state, and when an imaging instruction is issued by pressing the shutter button included in the operation unit 14, imaging is performed by the solid-state imaging device 5 and a captured image signal is output.
- The captured image signal is converted into a digital signal after being subjected to analog signal processing, and is temporarily stored in the main memory 16.
- The digital signal processing unit 17 corrects the output signals of the phase difference detection pixels 51 in the captured image signal stored in the main memory 16 according to the method selected in step S4 or step S5, processes the corrected captured image signal to generate captured image data, records it on the recording medium 21, and ends the imaging processing.
- As described above, according to this digital camera, when the manufacturer's genuine lens device 100, whose correction gain values used for gain correction processing are stored in the main memory 16, is mounted, the output signals of all the phase difference detection pixels 51 are corrected by gain correction processing.
- When any other lens device is mounted, the output signals of all the phase difference detection pixels 51 are corrected by interpolation correction processing. For this reason, the output signals of the phase difference detection pixels 51 can be corrected without problems for any lens device that can be attached to the camera body 200.
- Further, with this digital camera, it is not necessary to store in advance correction gain values corresponding to lens devices 100 manufactured by other companies in the main memory 16 of the camera body 200. For this reason, the time for data generation can be reduced, the memory capacity can be reduced, and the manufacturing cost of the digital camera can be reduced.
- In an interchangeable lens camera, it is also conceivable that the correction gain values corresponding to the lens device 100 are stored, in association with the lens ID, in the memory 60 of the lens device 100 instead of in the camera body 200.
- In this case, in step S3 of FIG. 4, the system control unit 11 determines whether the correction gain value corresponding to the lens ID included in the acquired lens information is stored in either the main memory 16 of the camera body 200 or the memory 60 of the lens device 100; if it is stored, the process of step S4 is performed, and if it is not stored, the process of step S5 is performed.
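This variant extends the step-S3 lookup to both memories. A minimal sketch, assuming the lens information is a dictionary that may itself carry a gain table under an illustrative `gain_table` key:

```python
def select_with_lens_memory(lens_info, body_gain_tables):
    """Variant of the FIG. 4 selection: the correction gain value may be
    stored either in the camera body's memory (keyed by lens ID) or in
    the lens device's own memory; field names are assumptions."""
    lens_id = lens_info["lens_id"]
    if lens_id in body_gain_tables or "gain_table" in lens_info:
        return "gain_correction"          # step S4
    return "interpolation_correction"     # step S5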
- FIG. 5 is a functional block diagram showing a modification of the digital signal processing unit 17 in the digital camera shown in FIG.
- the digital signal processing unit 17 shown in FIG. 5 is the same as that shown in FIG. 3 except that a correction gain value generation unit 176 is added.
- The correction gain value generation unit 176 generates a correction gain value for each of the phase difference detection pixels 51 using the light beam angle information and design information of the solid-state imaging device 5 (such as the chip size, the number of pixels, the shape of the light shielding film opening of the phase difference detection pixels, and the shape of the photoelectric conversion region in the silicon substrate).
- the correction gain value generation unit 176 records the generated correction gain value in the main memory 16 in association with the lens ID included in the lens information.
- FIG. 6 is a schematic plan view showing the entire configuration of the solid-state imaging device 5 mounted on the digital camera shown in FIG.
- the solid-state imaging device 5 has a light receiving surface 53 on which all the pixels 51 are disposed.
- Nine AF areas (phase difference detection areas) 52 are set on the light receiving surface 53.
- Each AF area 52 is an area including a plurality of phase difference pairs aligned in the row direction X. Only imaging pixels 51 are disposed in the portion of the light receiving surface 53 excluding the AF areas 52.
- The three AF areas 52 in the middle in the row direction X are areas having a width in the row direction X centered on a straight line that passes through the intersection of the light receiving surface 53 with the optical axis of the imaging lens 1 in plan view and extends in the column direction Y.
- the position in the row direction X of the intersection of the light receiving surface 53 with the optical axis of the imaging lens 1 is referred to as an optical axis position.
- the defocus amount calculation unit 19 shown in FIG. 1 uses the output signal groups read out from the phase difference detection pixels 51L and the phase difference detection pixels 51R in one AF area 52, selected from among the nine AF areas 52 by a user operation or the like, to calculate a phase difference amount, which is the relative positional deviation between the two images formed by the pair of light beams. Then, based on this phase difference amount, it determines the focus adjustment state of the photographing lens 1, here the amount of deviation from the in-focus state and its direction, that is, the defocus amount.
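The calculation above can be sketched in code. The following is a minimal illustration, not the patent's actual algorithm: the 51L and 51R signal groups are aligned by a sum-of-absolute-differences (SAD) search, and the resulting shift is converted into a defocus amount using an assumed linear conversion (`pixel_pitch_um` and `conversion_k` are hypothetical parameters).

```python
def phase_difference(left, right, max_shift):
    """Find the shift (in pixels) that best aligns the 51L and 51R signal
    sequences, by minimizing the mean sum of absolute differences."""
    best_shift, best_sad = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        sad, count = 0.0, 0
        for i in range(n):
            j = i + s
            if 0 <= j < n:
                sad += abs(left[i] - right[j])
                count += 1
        sad /= count  # normalize by the overlap length
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift

def defocus_amount(shift_px, pixel_pitch_um, conversion_k):
    """Convert the image shift into a defocus amount; the sign gives the
    direction of deviation from the in-focus state."""
    return conversion_k * shift_px * pixel_pitch_um
```

For example, a 51R signal group that is a copy of the 51L group displaced by two pixels yields a phase difference of 2, which is then scaled into a defocus amount.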
- the system control unit 11 shown in FIG. 1 moves the focus lens included in the imaging lens 1 to the in-focus position based on the defocus amount calculated by the defocus amount calculation unit 19, thereby controlling the focus state of the imaging lens 1.
- the apertures of the phase difference detection pixel 51R and the phase difference detection pixel 51L are decentered in opposite directions. Therefore, even if the positions of the apertures in the eccentric direction (the shift direction of the pair of images; the row direction X in FIG. 2) are substantially the same, a sensitivity difference occurs between the phase difference detection pixels 51R and the phase difference detection pixels 51L.
- FIG. 7 is a diagram showing the sensitivity ratio of the phase difference detection pixels 51R and 51L forming a phase difference pair at an arbitrary position in the row direction X (hereinafter also referred to as a horizontal pixel position) in the solid-state imaging device 5.
- the straight line indicated by reference numeral 51R indicates the sensitivity ratio of the phase difference detection pixel 51R
- the straight line indicated by reference numeral 51L indicates the sensitivity ratio of the phase difference detection pixel 51L.
- the sensitivity ratio of an arbitrary phase difference detection pixel means, where A and B are the output signals of that phase difference detection pixel and of an imaging pixel adjacent to it that detects light of the same color, the value represented by A/B or B/A. FIG. 7 shows the case where the sensitivity ratio is A/B.
- the range of horizontal pixel positions of the three AF areas 52 at the left end in FIG. 6 is indicated by reference numeral 52L. Further, in FIG. 6, the range of horizontal pixel positions of the three AF areas 52 at the central portion is indicated by reference numeral 52C. Further, the range of horizontal pixel positions of the three AF areas 52 at the right end in FIG. 6 is indicated by reference numeral 52R.
- the horizontal pixel position at the left end of the range 52L is indicated by x1
- the horizontal pixel position at the right end of the range 52L is indicated by x2
- the horizontal pixel position at the right end of the range 52C is indicated by x3
- the horizontal pixel position at the right end of the range 52R is indicated by x4.
- phase difference detection pixels 51R and 51L are periodically arranged in the column direction Y as well. However, since the phase difference detection pixels 51R and the phase difference detection pixels 51L do not have eccentric openings in the column direction Y, the sensitivity ratio is as shown in FIG. 7 at any position in the column direction Y.
- the output signal of the phase difference detection pixel 51R and the output signal of the phase difference detection pixel 51L each have a different level at each horizontal pixel position depending on the subject, so the sensitivity distribution of the phase difference detection pixels cannot be determined from these output signals alone.
- however, the sensitivity distribution of a phase difference detection pixel can be known by obtaining the sensitivity ratio, which is the ratio of the output signal of the phase difference detection pixel to that of the imaging pixel adjacent to it.
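As a toy illustration of this idea (hypothetical numbers, assuming the subject is locally uniform so the adjacent same-color imaging pixel sees the same light level):

```python
def sensitivity_ratios(pd_row, imaging_row):
    """Sensitivity ratio A/B at each position: output A of a phase
    difference detection pixel over output B of the adjacent same-color
    imaging pixel. The division cancels the subject-dependent signal
    level, leaving only the pixel's relative sensitivity."""
    return [a / b for a, b in zip(pd_row, imaging_row)]
```

Doubling the scene brightness doubles both A and B, so the ratio profile is unchanged; this is why the ratio, rather than the raw output, reveals the sensitivity distribution.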
- the opening c is decentered to the right in FIG. 2. Therefore, as shown in FIG. 8, half of the light passing through the left side of the photographing lens 10 enters the opening c of the phase difference detection pixel 51R at the left end of the light receiving surface 53, and the light that has passed through the right side does not enter. On the other hand, half of the light passing through the right side of the photographing lens 10 enters the opening c of the phase difference detection pixel 51R at the right end of the light receiving surface 53, and all the light passing through the left side of the photographing lens 10 enters. Further, only the light passing through the left side of the imaging lens 10 enters the opening c of the phase difference detection pixel 51R at the center of the light receiving surface 53, and the light passing through the right side of the imaging lens 10 does not enter.
- for the phase difference detection pixel 51L, the characteristic of the sensitivity ratio is opposite to that of the phase difference detection pixel 51R.
- the sensitivity ratio of the phase difference detection pixel 51L decreases from the left end toward the right end of the light receiving surface 53, while the sensitivity ratio of the phase difference detection pixel 51R increases from the left end toward the right end of the light receiving surface 53. Near the optical axis position, the sensitivity ratio of the phase difference detection pixel 51L and the sensitivity ratio of the phase difference detection pixel 51R are substantially the same.
- the solid-state imaging device 5 mounted with the phase difference detection pixel 51R and the phase difference detection pixel 51L has the characteristic of the sensitivity ratio as shown in FIG.
- the sensitivity ratio of each of the phase difference pairs at any horizontal pixel position shown in FIG. 7 is uniquely determined by the angle of the light beam incident on that horizontal pixel position (hereinafter referred to as the incident light beam angle).
- the incident ray angle will be described below.
- FIG. 9 is a view of the photographing lens 10 and the solid-state imaging device 5 as viewed in the column direction Y which is a direction orthogonal to the optical axis of the photographing lens 10 and the row direction X.
- the light incident on any horizontal pixel position of the solid-state imaging device 5 can be divided into a principal ray passing through the center of the photographing lens 10, an upper ray passing through the upper end of the photographing lens 10 in FIG. 9, and a lower ray passing through the lower end of the photographing lens 10 in FIG. 9.
- the upper ray refers to a ray that passes through one end (upper end) in the row direction X of the photographing lens 10 and reaches the above-mentioned arbitrary horizontal pixel position.
- the lower ray refers to a ray that passes through the other end (lower end) in the row direction X of the photographing lens 10 and reaches the above-mentioned arbitrary horizontal pixel position.
- the angle between the optical axis K of the photographing lens 10 and the upper ray is referred to as the upper ray angle
- the angle between the optical axis K of the photographing lens 10 and the lower ray is referred to as the lower ray angle
- the sensitivity ratios of the phase difference detection pixel 51R and the phase difference detection pixel 51L each have a linear characteristic as shown in FIG. 7. Therefore, if the sensitivity ratios of the phase difference detection pixels 51R and the phase difference detection pixels 51L at at least two positions in the row direction X of the solid-state imaging device 5 are known, the sensitivity ratios of the phase difference detection pixels 51R and 51L at all positions in the row direction X can be determined by linear interpolation.
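Because the characteristic is a straight line, the whole profile can be recovered from two measured points. A minimal sketch (positions and ratio values are hypothetical):

```python
def interpolate_ratio(x, x1, r1, x2, r2):
    """Given sensitivity ratios r1 and r2 measured at horizontal pixel
    positions x1 and x2, return the linearly interpolated (or
    extrapolated) ratio at position x, exploiting the linear
    characteristic of the sensitivity ratio."""
    slope = (r2 - r1) / (x2 - x1)
    return r1 + slope * (x - x1)
```

Two measurements thus suffice to obtain a gain for every phase difference detection pixel in a row.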
- the sensitivity ratio of each of the phase difference pairs at any horizontal pixel position is determined by the incident ray angle at that horizontal pixel position.
- the incident light beam angle at an arbitrary horizontal pixel position differs depending on the type of the lens device 100 and the optical condition set in the lens device 100.
- information on the incident light beam angles at at least two arbitrary positions in the row direction X of the solid-state imaging device 5 when the lens device 100 is attached to the camera body 200 is obtained for each optical condition of the lens device 100.
- in the main memory 16 of the camera body 200, a table is stored in which the sensitivity ratio of each of the phase difference pairs at an arbitrary horizontal pixel position is recorded for each different incident ray angle at that horizontal pixel position. If the combination of the lens device and the imaging device is different, the sensitivity ratio of the phase difference pair is also different, so it is desirable to store the data of the sensitivity ratio against the incident light beam angle in the device equipped with the imaging device. Since the information on the ray angle is determined by the lens, it is desirable to store it in the lens device.
- the information of the incident light beam angle stored in the memory 60 and the data of the table stored in the main memory 16 can be obtained by measurement in an adjustment process before shipping of the lens device 100 and the camera body 200.
- the incident light beam angles at each of the horizontal pixel positions x1, x2, x3 and x4 shown in FIG. 7 are measured for each of all the optical conditions settable in the lens device 100.
- a table as shown in FIG. 10 is created from the measurement results and stored in the memory 60 of the lens apparatus 100.
- the sensitivity ratios of arbitrary phase difference detection pixels 51R and phase difference detection pixels 51L having the same horizontal pixel position are measured for all possible combinations of the upper ray angle and the lower ray angle, and a table as shown in FIG. 11 is created from the measurement results and stored in the main memory 16 of the camera body 200.
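The two tables might be organized as follows; the keys, positions, and numbers here are made-up placeholders purely to show the chained lookup (optical condition → ray angles in the lens-side FIG. 10 style table, then ray angles → sensitivity ratios in the body-side FIG. 11 style table):

```python
# Lens-side table (stored in memory 60 of the lens device): for each
# optical condition, the (upper, lower) ray angles at measured positions.
lens_ray_angle_table = {
    # optical condition -> {horizontal pixel position: (upper, lower) degrees}
    ("F2.8", "50mm"): {0: (-8.0, 4.0), 1000: (0.0, 0.0), 2000: (-4.0, 8.0)},
}

# Body-side table (stored in main memory 16 of the camera body): for each
# (upper, lower) ray angle pair, the sensitivity ratios of the 51R/51L pair.
body_sensitivity_table = {
    (-8.0, 4.0): {"51R": 0.4, "51L": 0.9},
    (0.0, 0.0): {"51R": 0.65, "51L": 0.65},
    (-4.0, 8.0): {"51R": 0.9, "51L": 0.4},
}

def sensitivity_at(condition, position):
    """Chain the two lookups: condition -> ray angles -> sensitivity ratios."""
    angles = lens_ray_angle_table[condition][position]
    return body_sensitivity_table[angles]
```

Splitting the data this way matches the rationale above: the ray angles are a property of the lens, while the sensitivity response to a given ray angle is a property of the imaging device.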
- sensitivity ratios of the phase difference detection pixel 51R are indicated by R1, R2 and R3
- sensitivity ratios of the phase difference detection pixel 51L are indicated by L1, L2 and L3.
- the correction gain value generation unit 176 generates a correction gain value for each of the phase difference detection pixels 51 using the light beam angle information stored in the memory 60 of the lens device 100 and the table stored in the main memory 16. Instead of the sensitivity ratio, a correction gain value for making the sensitivity ratio "1" may be stored in the table shown in FIG. 11.
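Since gain correction multiplies the output signal by the gain, the gain that makes the sensitivity ratio "1" is simply the reciprocal of the ratio. A sketch under that assumption (the function names and the two-point interpolation are illustrative):

```python
def correction_gain(sensitivity_ratio):
    """Gain value that makes the corrected sensitivity ratio equal to 1."""
    return 1.0 / sensitivity_ratio

def gains_for_positions(positions, x1, r1, x2, r2):
    """Per-position correction gains: linearly interpolate the ratios
    measured at positions x1 and x2, then invert each interpolated ratio."""
    def ratio_at(x):
        return r1 + (r2 - r1) * (x - x1) / (x2 - x1)
    return {x: 1.0 / ratio_at(x) for x in positions}
```

Multiplying each phase difference detection pixel's output by its gain then brings its level up to that of the neighboring imaging pixels.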
- FIG. 12 is a flowchart for explaining the operation of the digital signal processing unit 17 shown in FIG.
- the flowchart shown in FIG. 12 is obtained by adding steps S10 and S11 to the flowchart shown in FIG. 4. In FIG. 12, the same processes as in FIG. 4 are denoted by the same step numbers, and their description is omitted.
- when the determination in step S3 is negative, it is determined whether or not ray angle information is included in the lens information (step S10). If ray angle information is not included, the correction method selection unit 174 performs the process of step S5.
- if ray angle information is included, the correction gain value generation unit 176 generates a correction gain value corresponding to each phase difference detection pixel 51 for each imaging condition using the light beam angle information and the table shown in FIG. 11, and stores the generated correction gain value group in the main memory 16 in association with the lens ID included in the lens information (step S11).
- the correction gain value group may be stored in the memory 60 of the lens apparatus 100 in association with the lens ID.
- after step S11, the correction method selection unit 174 performs the process of step S4.
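The branching in steps S3, S10, S11, S4, and S5 can be sketched as follows (the cache and the function names are illustrative, not the patent's implementation):

```python
_gain_cache = {}  # correction gain value groups keyed by lens ID (step S11 result)

def generate_and_store_gains(lens_info):
    """Placeholder for step S11: real code would build the gain group from
    the ray angle information and the FIG. 11 table, per imaging condition."""
    _gain_cache[lens_info["lens_id"]] = "generated gain group"

def select_correction_method(lens_info):
    if lens_info["lens_id"] in _gain_cache:      # S3: gains already stored
        return "gain correction"                 # S4 (second correction method)
    if "ray_angles" in lens_info:                # S10: can gains be generated?
        generate_and_store_gains(lens_info)      # S11: generate and cache
        return "gain correction"                 # S4
    return "interpolation correction"            # S5 (first correction method)
```

Because step S11 caches the result under the lens ID, a lens for which gains were generated even once takes the fast S3 → S4 path thereafter.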
- with the digital camera of this modification, even if the correction gain value corresponding to the lens ID is not stored in the camera body 200 or the lens apparatus 100, the correction gain value can be generated and stored from the light beam angle information as long as that information is included in the lens information. For this reason, it is not necessary to store in advance the correction gain value for each imaging condition in the camera body 200, and the manufacturing cost of the digital camera can be reduced.
- since the digital camera of this modification stores the generated correction gain value in association with the lens ID, generation of the correction gain value can be omitted thereafter for any lens apparatus 100 for which the correction gain value has been generated even once, making it possible to shorten the photographing time.
- when a lens apparatus 100 for which the correction gain value is stored in neither the camera body 200 nor the lens apparatus 100, and which does not store light beam angle information, is mounted, the output signal of the phase difference detection pixel 51 is corrected by the interpolation correction process. For this reason, high imaging quality can be obtained with any lens apparatus 100.
- in step S4 in FIG. 4 and FIG. 12, a second correction method for correcting the output signals of all the phase difference detection pixels 51 by the gain correction processing is selected. Instead, a third correction method may be selected in which the output signal of each phase difference detection pixel in the captured image signal is corrected by either the interpolation correction processing unit 172 or the gain correction processing unit 171.
- as the third correction method, a method as described in JP-A-2012-4729 can be adopted, in which an edge of a subject image is detected and the image data is corrected by switching between the interpolation correction processing and the gain correction processing according to the amount of edge.
- the third correction method is not limited to this, and a method of using both the interpolation correction process and the gain correction process may be adopted.
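A toy sketch of such edge-dependent switching (the edge measure, the threshold, and the choice of which process handles edges are assumptions for illustration; JP-A-2012-4729 describes the actual criterion):

```python
def correct_pd_output(pd_output, gain, neighbors, edge_threshold=16):
    """Correct one phase difference detection pixel's output. The edge
    amount is estimated from surrounding same-color imaging pixels: near
    an edge, interpolation from neighbors would smear detail, so gain
    correction is applied; in flat regions, interpolation is used."""
    edge_amount = max(neighbors) - min(neighbors)
    if edge_amount > edge_threshold:
        return pd_output * gain                   # gain correction process
    return sum(neighbors) / len(neighbors)        # interpolation correction
```

This per-pixel decision is what distinguishes the third correction method from the first and second methods, which each apply a single process to all phase difference detection pixels.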
- the correction method is selected according to the lens information stored in the lens apparatus 100.
- gain correction need not be performed, and interpolation correction may always be performed. Also in this case, any lens device can be accommodated, and the operation of generating the correction gain value becomes unnecessary.
- the camera body 200 can acquire lens information from the mounted lens apparatus 100.
- communication may not be possible between the lens apparatus 100 and the camera body 200.
- the camera body 200 may be fitted with a genuine mount adapter manufactured by a manufacturer of the camera body 200 conforming to the specifications of lenses manufactured by another company, and a lens manufactured by another company may be mounted to the genuine mount adapter.
- when detecting that the genuine mount adapter is attached to the electric contact 9, the system control unit 11 of the camera body 200 enables the lensless release mode, which allows photographing without a lens device.
- this lensless release mode may be enabled or disabled manually, and information such as the focal length of the lens can be input manually.
- the lensless release mode is effective when lens information cannot be acquired from the lens apparatus 100 mounted on the camera body 200. Therefore, when it is determined that the lens information cannot be acquired, the system control unit 11 does not perform the gain correction process but always performs the interpolation correction process. By doing so, the output signal of the phase difference detection pixel can be corrected even when a lens device that cannot communicate is attached.
- FIG. 13 shows an appearance of a smartphone 300 which is an embodiment of the photographing device of the present invention.
- the smartphone 300 shown in FIG. 13 has a flat casing 201, and a display input unit 204, in which a display panel 202 as a display unit and an operation panel 203 as an input unit are integrated, is provided on one surface of the casing 201.
- the casing 201 also includes a speaker 205, a microphone 206, an operation unit 207, and a camera unit 208.
- the configuration of the housing 201 is not limited to this. For example, a configuration in which the display unit and the input unit are independent may be employed, or a configuration having a folding structure or a slide mechanism may be employed.
- FIG. 14 is a block diagram showing a configuration of smartphone 300 shown in FIG.
- as shown in FIG. 14, the smartphone has, as its main components, a wireless communication unit 210, a display input unit 204, a call unit 211, an operation unit 207, a camera unit 208, a storage unit 212, an external input/output unit 213, a GPS reception unit 214, a motion sensor unit 215, a power supply unit 216, and a main control unit 220. In addition, the smartphone 300 has, as a main function, a wireless communication function of performing mobile wireless communication via a base station apparatus BS (not shown) and a mobile communication network NW (not shown).
- the wireless communication unit 210 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW in accordance with an instruction from the main control unit 220. This wireless communication is used to transmit and receive various file data such as audio data and image data, e-mail data and the like, and to receive Web data and streaming data.
- the display input unit 204 displays images (still images and moving images), character information, and the like to visually communicate information to the user, and detects a user operation on the displayed information.
- the display input unit 204 is a so-called touch panel and includes a display panel 202 and an operation panel 203.
- the display panel 202 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device.
- the operation panel 203 is a device that is placed such that an image displayed on the display surface of the display panel 202 is visible, and that detects one or more coordinates operated by a user's finger or a stylus. When this device is operated by a user's finger or a stylus, a detection signal generated by the operation is output to the main control unit 220.
- the main control unit 220 detects an operation position (coordinates) on the display panel 202 based on the received detection signal.
- as described above, the display panel 202 and the operation panel 203 of the smartphone 300, exemplified as one embodiment of the imaging device of the present invention, are integrated to constitute the display input unit 204, with the operation panel 203 arranged so as to completely cover the display panel 202.
- the operation panel 203 may have a function of detecting a user operation also in the area outside the display panel 202.
- in other words, the operation panel 203 may have a detection area for the superimposed portion overlapping the display panel 202 (hereinafter referred to as a display area) and a detection area for the outer edge portion not overlapping the display panel 202 (hereinafter referred to as a non-display area).
- the size of the display area may be completely the same as the size of the display panel 202, but it is not necessary to make the two coincide with each other.
- alternatively, the operation panel 203 may have two sensitive areas: an outer edge portion and an inner portion other than the outer edge portion. Furthermore, the width of the outer edge portion is appropriately designed in accordance with the size of the housing 201 and the like. As the position detection method adopted by the operation panel 203, a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, an electrostatic capacitance method, and the like can be mentioned, and any of these methods may be adopted.
- the call unit 211 includes the speaker 205 and the microphone 206; it converts the user's voice input through the microphone 206 into sound data that can be processed by the main control unit 220 and outputs the sound data to the main control unit 220, and it decodes voice data received by the external input/output unit 213 and outputs the decoded voice from the speaker 205.
- the speaker 205 can be mounted on the same surface as the surface provided with the display input unit 204, and the microphone 206 can be mounted on the side surface of the housing 201.
- the operation unit 207 is a hardware key using a key switch or the like, and receives an instruction from the user.
- for example, the operation unit 207 is a push-button switch that is mounted on the side surface of the housing 201 of the smartphone 300, is turned on when pressed with a finger or the like, and is turned off by the restoring force of a spring or the like when the finger is released.
- the storage unit 212 includes control programs and control data of the main control unit 220, application software, address data corresponding to names of communicating parties, telephone numbers, etc., data of transmitted and received e-mails, web data downloaded by web browsing, It stores downloaded content data and temporarily stores streaming data and the like.
- the storage unit 212 is configured by an internal storage unit 217 embedded in the smartphone and an external storage unit 218 having an external memory slot that can be attached and detached.
- the internal storage unit 217 and the external storage unit 218 that constitute the storage unit 212 are each realized by using a storage medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, MicroSD (registered trademark) memory), a RAM (Random Access Memory), or a ROM (Read Only Memory).
- the external input/output unit 213 serves as an interface to all external devices connected to the smartphone 300, for connecting directly or indirectly to other external devices by communication (for example, universal serial bus (USB) or IEEE 1394) or via a network (for example, the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), Infrared Data Association (IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), or ZigBee (registered trademark)).
- examples of external devices connected to the smartphone 300 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card connected via a card socket, a SIM (Subscriber Identity Module) card / UIM (User Identity Module) card, external audio/video equipment connected via an audio/video I/O (Input/Output) terminal, and wirelessly connected external audio/video equipment.
- the external input / output unit 213 transmits the data received from such an external device to each component in the smartphone 300, and transmits the data in the smartphone 300 to the external device.
- the GPS reception unit 214 receives GPS signals transmitted from GPS satellites ST1 to STn according to an instruction from the main control unit 220, executes positioning calculation processing based on the plurality of received GPS signals, and detects the position of the smartphone 300 consisting of latitude, longitude, and altitude.
- when the GPS reception unit 214 can acquire position information from the wireless communication unit 210 or the external input/output unit 213 (for example, a wireless LAN), it can also detect the position using that position information.
- the motion sensor unit 215 includes, for example, a three-axis acceleration sensor, and detects the physical movement of the smartphone 300 according to an instruction from the main control unit 220. By detecting the physical movement of the smartphone 300, the moving direction and acceleration of the smartphone 300 are detected, and the detection result is output to the main control unit 220.
- the power supply unit 216 supplies power stored in a battery (not shown) to each unit of the smartphone 300 according to an instruction of the main control unit 220.
- the main control unit 220 includes a microprocessor, operates in accordance with control programs and control data stored in the storage unit 212, and centrally controls the respective units of the smartphone 300. Further, the main control unit 220 has a mobile communication control function of controlling each unit of the communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 210.
- the application processing function is realized by the main control unit 220 operating according to the application software stored in the storage unit 212.
- the application processing functions include, for example, an infrared communication function of controlling the external input/output unit 213 to perform data communication with a counterpart device, an electronic mail function of transmitting and receiving electronic mail, and a web browsing function of browsing web pages.
- the main control unit 220 also has an image processing function of displaying a video on the display input unit 204 based on image data (still image or moving image data) such as received data or downloaded streaming data.
- the image processing function is a function in which the main control unit 220 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 204.
- the main control unit 220 executes display control for the display panel 202 and operation detection control for detecting a user operation through the operation unit 207 and the operation panel 203.
- by executing the display control, the main control unit 220 displays icons for starting application software, software keys such as a scroll bar, or a window for creating an e-mail.
- the scroll bar is a software key for receiving an instruction to move the displayed portion of an image, for example for a large image that cannot fit in the display area of the display panel 202.
- by executing the operation detection control, the main control unit 220 detects a user operation through the operation unit 207, receives an operation on an icon or an input of a character string in the input field of the window through the operation panel 203, or accepts a request to scroll the displayed image through the scroll bar.
- the main control unit 220 also has a touch panel control function of determining whether the operation position on the operation panel 203 is in the superimposed portion (display area) overlapping the display panel 202 or in the outer edge portion (non-display area) not overlapping the display panel 202, and of controlling the sensitive area of the operation panel 203 and the display position of the software keys.
- the main control unit 220 can also detect a gesture operation on the operation panel 203, and can execute a preset function according to the detected gesture operation.
- a gesture operation means not a conventional simple touch operation but an operation of drawing a locus with a finger or the like, designating a plurality of positions simultaneously, or, by combining these, drawing a locus from at least one of a plurality of positions.
- the camera unit 208 includes the components of the digital camera shown in FIG. 1 other than the external memory control unit 20, the recording medium 21, the display control unit 22, the display unit 23, and the operation unit 14.
- the captured image data generated by the camera unit 208 can be recorded in the storage unit 212, or can be output through the external input/output unit 213 or the wireless communication unit 210.
- in the smartphone 300 shown in FIG. 13, the camera unit 208 is mounted on the same surface as the display input unit 204; however, the mounting position of the camera unit 208 is not limited to this, and the camera unit 208 may be mounted on the back surface of the display input unit 204.
- the camera unit 208 can also be used for various functions of the smartphone 300.
- the image acquired by the camera unit 208 can be displayed on the display panel 202, or the image of the camera unit 208 can be used as one operation input of the operation panel 203.
- when the GPS reception unit 214 detects the position, the position can also be detected with reference to the image from the camera unit 208.
- furthermore, with reference to the image from the camera unit 208, the optical axis direction of the camera unit 208 of the smartphone 300 can be determined, and the current usage environment can also be determined, either without using the three-axis acceleration sensor or in combination with the three-axis acceleration sensor.
- the image from the camera unit 208 can also be used in application software.
- for example, position information acquired by the GPS reception unit 214, audio information acquired by the microphone 206 (which may be converted into text information), posture information acquired by the motion sensor unit 215, and the like may be added to image data of a still image or a moving image, recorded in the storage unit 212, or output through the external input/output unit 213 or the wireless communication unit 210.
- by using the solid-state imaging device 5 as the imaging device of the camera unit 208, high-precision phase difference AF and high-quality imaging can be performed.
- a program for causing a computer to execute the functions of the functional blocks of the digital signal processing unit 17 can be distributed via a network such as the Internet; by installing this program in a camera-equipped smartphone or the like, the same functions as those of the digital camera shown in FIG. 1 can be realized. The program can also be provided in the form of a non-transitory computer-readable medium storing the program.
- the disclosed imaging device is an imaging device to which a lens device can be attached and detached, and includes: an imaging element that has a plurality of imaging pixels and a plurality of phase difference detection pixels arranged in a two-dimensional array on a light receiving surface and that images an object through the lens device; a communication unit for communicating with the mounted lens device; a lens information acquisition unit that acquires lens information, which is information specific to the lens device, from the lens device via the communication unit; a gain correction processing unit that performs gain correction processing of correcting the output signal of a phase difference detection pixel in a captured image signal, obtained by imaging an object with the imaging element, by multiplying the output signal by a gain value; an interpolation correction processing unit that performs interpolation correction processing of correcting the output signal of a phase difference detection pixel in the captured image signal by replacing it with a signal generated using the output signals of imaging pixels around that phase difference detection pixel that detect the same color as it; a correction method selection unit that selects, according to the lens information acquired by the lens information acquisition unit, either a first correction method of correcting the output signals of all the phase difference detection pixels in the captured image signal by the interpolation correction processing unit or a second correction method of correcting the output signals of all the phase difference detection pixels in the captured image signal by the gain correction processing unit; and a correction processing unit that corrects the output signal of the phase difference detection pixel in the captured image signal by the selected correction method.
- In the disclosed imaging device, the correction method selection unit selects the first correction method when the gain value corresponding to identification information of the lens device included in the lens information is stored in neither the imaging device nor the lens device.
- In the disclosed imaging device, the correction method selection unit selects the first correction method when the lens information does not include information necessary for generating the gain value, which is determined by the combination of the attached lens device and the imaging element.
- In the disclosed imaging device, the information necessary to obtain the gain value is ray angle information of the lens device.
- The disclosed imaging device further includes a correction gain value generation unit that, when the lens information includes information on the ray angle, generates the gain value using the information on the ray angle and information on the sensitivity of the imaging element for each ray angle.
- The disclosed signal processing method is a signal processing method performed by an imaging device to which a lens device can be detachably attached, the imaging device including an imaging element that includes a plurality of imaging pixels and a plurality of phase difference detection pixels arranged in a two-dimensional array on a light receiving surface and that images a subject through the lens device, and a communication unit for communicating with the attached lens device. The method includes: a lens information acquisition step of acquiring lens information, which is information specific to the lens device, from the lens device via the communication unit; a correction method selection step of selecting, in accordance with the lens information acquired in the lens information acquisition step, one of a first correction method of correcting the output signals of all the phase difference detection pixels in the captured image signal by interpolation correction processing of replacing each output signal with a signal generated using output signals of the imaging pixels that are located around the phase difference detection pixel and that detect the same color as the phase difference detection pixel, a second correction method of correcting those output signals by gain correction processing of multiplying each output signal by a gain value, and a third correction method of correcting the output signal of each phase difference detection pixel by either the interpolation correction processing or the gain correction processing; and an image processing step of correcting the output signals of the phase difference detection pixels in the captured image signal by the method selected in the correction method selection step.
- The disclosed signal processing program is a program for causing a computer to execute each step of the above signal processing method.
- The present invention is particularly useful and effective when applied to digital cameras and the like.
- 100 Lens device 200 Camera body 5 Solid-state imaging element 9 Electrical contact 10 Photographing lens 11 System control unit 17 Digital signal processing unit 50 Lens control unit 51R, 51L Phase difference detection pixels 60 Memory 70 Electrical contact 171 Gain correction processing unit 172 Interpolation correction processing unit 173 Lens information acquisition unit 174 Correction method selection unit 175 Image processing unit 176 Correction gain value generation unit
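The selection rules summarized above (fall back to the first, interpolation-only method when neither a stored gain value nor ray angle information is available) can be sketched as follows. This is a minimal illustration, not the patented implementation; the names `LensInfo`, `STORED_GAINS`, and `select_correction_method` are assumptions made for this sketch.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LensInfo:
    lens_id: str                 # identification information of the lens device
    ray_angle: Optional[float]   # ray angle info; None if the lens does not report it

# Gain values stored in the camera body for known lenses (hypothetical table).
STORED_GAINS = {"LENS_A": 1.12}

def select_correction_method(info: LensInfo) -> str:
    """Choose how to correct phase difference detection pixel outputs,
    following the selection rules in the disclosure."""
    # A stored gain value for this lens ID makes gain correction possible
    # (second method, or the mixed per-pixel third method).
    if info.lens_id in STORED_GAINS:
        return "gain"
    # Ray angle information lets a gain value be generated from it together
    # with the sensor's per-angle sensitivity data.
    if info.ray_angle is not None:
        return "gain"
    # Otherwise interpolate all phase difference pixels (first method).
    return "interpolation"
```

Note that the check order mirrors the claims: only when both sources of a gain value are unavailable does the device fall back to interpolation for every phase difference detection pixel.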
Abstract
Description
This application is based on a Japanese patent application filed on March 13, 2013 (Japanese Patent Application No. 2013-50238), the contents of which are incorporated herein.
200 Camera body
5 Solid-state imaging element
9 Electrical contact
10 Photographing lens
11 System control unit
17 Digital signal processing unit
50 Lens control unit
51R, 51L Phase difference detection pixels
60 Memory
70 Electrical contact
171 Gain correction processing unit
172 Interpolation correction processing unit
173 Lens information acquisition unit
174 Correction method selection unit
175 Image processing unit
176 Correction gain value generation unit
Claims (7)
- An imaging device to which a lens device can be detachably attached, the imaging device comprising:
an imaging element that includes a plurality of imaging pixels and a plurality of phase difference detection pixels arranged in a two-dimensional array on a light receiving surface, and that images a subject through the lens device;
a communication unit for communicating with the attached lens device;
a lens information acquisition unit that acquires lens information, which is information specific to the lens device, from the lens device via the communication unit;
a gain correction processing unit that performs gain correction processing of correcting an output signal of a phase difference detection pixel in a captured image signal, obtained by imaging a subject with the imaging element, by multiplying the output signal by a gain value;
an interpolation correction processing unit that performs interpolation correction processing of correcting the output signal of a phase difference detection pixel in the captured image signal by replacing it with a signal generated using output signals of the imaging pixels that are located around the phase difference detection pixel and that detect the same color as the phase difference detection pixel;
a correction method selection unit that selects, in accordance with the lens information acquired by the lens information acquisition unit, one of a first correction method of correcting the output signals of all the phase difference detection pixels in the captured image signal by the interpolation correction processing unit, a second correction method of correcting the output signals of all the phase difference detection pixels in the captured image signal by the gain correction processing unit, and a third correction method of correcting the output signal of each phase difference detection pixel in the captured image signal by either the interpolation correction processing unit or the gain correction processing unit; and
an image processing unit that corrects the output signals of the phase difference detection pixels in the captured image signal by the method selected by the correction method selection unit. - The imaging device according to claim 1, wherein
the correction method selection unit selects the first correction method when the gain value corresponding to identification information of the lens device included in the lens information is stored in neither the imaging device nor the lens device. - The imaging device according to claim 1, wherein
the correction method selection unit selects the first correction method when the lens information does not include information necessary for generating the gain value, which is determined by the combination of the attached lens device and the imaging element. - The imaging device according to claim 3, wherein
the information necessary to obtain the gain value is ray angle information of the lens device. - The imaging device according to claim 4, further comprising
a correction gain value generation unit that, when the lens information includes information on the ray angle, generates the gain value using the information on the ray angle and information on the sensitivity of the imaging element for each ray angle. - A signal processing method performed by an imaging device to which a lens device can be detachably attached, wherein
the imaging device includes an imaging element that includes a plurality of imaging pixels and a plurality of phase difference detection pixels arranged in a two-dimensional array on a light receiving surface and that images a subject through the lens device, and a communication unit for communicating with the attached lens device, the method comprising:
a lens information acquisition step of acquiring lens information, which is information specific to the lens device, from the lens device via the communication unit;
a correction method selection step of selecting, in accordance with the lens information acquired in the lens information acquisition step, one of a first correction method of correcting the output signals of all the phase difference detection pixels in the captured image signal by interpolation correction processing of replacing each output signal with a signal generated using output signals of the imaging pixels that are located around the phase difference detection pixel and that detect the same color as the phase difference detection pixel, a second correction method of correcting those output signals by gain correction processing of multiplying each output signal by a gain value, and a third correction method of correcting the output signal of each phase difference detection pixel in the captured image signal by either the interpolation correction processing or the gain correction processing; and
an image processing step of correcting the output signals of the phase difference detection pixels in the captured image signal by the method selected in the correction method selection step. - A program for causing a computer to execute each step of the signal processing method according to claim 6.
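The two correction operations recited in the claims — gain correction (multiplying a phase difference detection pixel's output by a gain value) and interpolation correction (replacing it with a signal generated from surrounding same-color imaging pixels) — can be sketched as follows. This is a minimal single-channel illustration under assumed conditions (same-color neighbors taken two pixels away, as in a Bayer-like layout, averaged for the replacement signal); the function names and pixel layout are assumptions, not the patented implementation.

```python
import numpy as np

def gain_correct(raw: np.ndarray, pd_mask: np.ndarray, gain: float) -> np.ndarray:
    """Second method: multiply each phase difference detection pixel's output
    by a gain value to compensate for its reduced sensitivity.
    pd_mask is a boolean array marking phase difference pixels."""
    out = raw.astype(float).copy()
    out[pd_mask] *= gain
    return out

def interpolation_correct(raw: np.ndarray, pd_mask: np.ndarray) -> np.ndarray:
    """First method: replace each phase difference detection pixel with a
    signal generated from surrounding imaging pixels of the same color
    (here, the mean of the same-color neighbors two pixels away)."""
    out = raw.astype(float).copy()
    h, w = raw.shape
    for y, x in zip(*np.nonzero(pd_mask)):
        # Same-color neighbors in a Bayer-like layout sit two pixels away;
        # skip neighbors that are themselves phase difference pixels.
        neighbors = [
            raw[ny, nx]
            for ny, nx in ((y - 2, x), (y + 2, x), (y, x - 2), (y, x + 2))
            if 0 <= ny < h and 0 <= nx < w and not pd_mask[ny, nx]
        ]
        if neighbors:
            out[y, x] = float(np.mean(neighbors))
    return out
```

In the third correction method of the claims, the device would apply one of these two functions per pixel rather than uniformly across the frame.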
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015505240A JP5866478B2 (ja) | 2013-03-13 | 2013-12-20 | 撮像装置、信号処理方法、信号処理プログラム |
DE112013006817.6T DE112013006817B4 (de) | 2013-03-13 | 2013-12-20 | Bildaufnahmevorrichtung, Signalverarbeitungsverfahren und Signalverarbeitungsprogramm |
CN201380074658.0A CN105008976B (zh) | 2013-03-13 | 2013-12-20 | 摄像装置、信号处理方法、信号处理程序 |
US14/853,397 US9491352B2 (en) | 2013-03-13 | 2015-09-14 | Imaging device, signal processing method, and signal processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-050238 | 2013-03-13 | ||
JP2013050238 | 2013-03-13 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/853,397 Continuation US9491352B2 (en) | 2013-03-13 | 2015-09-14 | Imaging device, signal processing method, and signal processing program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014141561A1 true WO2014141561A1 (ja) | 2014-09-18 |
Family
ID=51536246
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/084314 WO2014141561A1 (ja) | 2013-03-13 | 2013-12-20 | 撮像装置、信号処理方法、信号処理プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US9491352B2 (ja) |
JP (2) | JP5866478B2 (ja) |
CN (1) | CN105008976B (ja) |
DE (1) | DE112013006817B4 (ja) |
WO (1) | WO2014141561A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017170724A1 (ja) * | 2016-03-31 | 2017-10-05 | 株式会社ニコン | 撮像装置、レンズ調節装置、および電子機器 |
WO2019065555A1 (ja) * | 2017-09-28 | 2019-04-04 | 富士フイルム株式会社 | 撮像装置、情報取得方法及び情報取得プログラム |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104854496B (zh) * | 2012-11-22 | 2017-04-12 | 富士胶片株式会社 | 摄像装置、散焦量运算方法及摄像光学系统 |
US20170367536A1 (en) * | 2016-06-22 | 2017-12-28 | Marty Guy Wall | Stainless steel food service vessels |
US10554877B2 (en) | 2016-07-29 | 2020-02-04 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image synthesis method and apparatus for mobile terminal, and mobile terminal |
CN106101556B (zh) | 2016-07-29 | 2017-10-20 | 广东欧珀移动通信有限公司 | 移动终端的图像合成方法、装置及移动终端 |
WO2018035859A1 (en) * | 2016-08-26 | 2018-03-01 | Huawei Technologies Co., Ltd. | Imaging apparatus and imaging method |
CN110463184B (zh) * | 2017-03-30 | 2021-03-12 | 富士胶片株式会社 | 图像处理装置、图像处理方法及非易失性有形介质 |
JP6856762B2 (ja) * | 2017-09-28 | 2021-04-14 | 富士フイルム株式会社 | 撮像装置、情報取得方法及び情報取得プログラム |
US11641519B2 (en) * | 2018-07-20 | 2023-05-02 | Nikon Corporation | Focus detection device, imaging device, and interchangeable lens |
US11676255B2 (en) * | 2020-08-14 | 2023-06-13 | Optos Plc | Image correction for ophthalmic images |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009008928A (ja) * | 2007-06-28 | 2009-01-15 | Olympus Corp | 撮像装置及び画像信号処理プログラム |
JP2010039759A (ja) * | 2008-08-05 | 2010-02-18 | Canon Inc | 撮像装置 |
JP2011257565A (ja) * | 2010-06-08 | 2011-12-22 | Canon Inc | 撮像装置 |
JP2014036262A (ja) * | 2012-08-07 | 2014-02-24 | Nikon Corp | 撮像装置 |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS56271A (en) | 1979-06-15 | 1981-01-06 | Hitachi Ltd | Non-electrolytic copper plating solution |
JP2586737B2 (ja) | 1993-10-15 | 1997-03-05 | 井関農機株式会社 | 乗用型苗植機 |
KR100617781B1 (ko) * | 2004-06-29 | 2006-08-28 | 삼성전자주식회사 | 이미지 센서의 화질 개선장치 및 방법 |
JP4453648B2 (ja) * | 2005-06-13 | 2010-04-21 | ソニー株式会社 | 画像処理装置および撮像装置 |
JP2007019959A (ja) | 2005-07-08 | 2007-01-25 | Nikon Corp | 撮像装置 |
JP4770560B2 (ja) | 2006-04-11 | 2011-09-14 | 株式会社ニコン | 撮像装置、カメラおよび画像処理方法 |
JP4218723B2 (ja) * | 2006-10-19 | 2009-02-04 | ソニー株式会社 | 画像処理装置、撮像装置、画像処理方法およびプログラム |
JP4823167B2 (ja) | 2007-08-10 | 2011-11-24 | キヤノン株式会社 | 撮像装置 |
JP2010030848A (ja) | 2008-07-30 | 2010-02-12 | Ohara Inc | ガラス |
JP5371331B2 (ja) * | 2008-09-01 | 2013-12-18 | キヤノン株式会社 | 撮像装置、撮像装置の制御方法及びプログラム |
JP5228777B2 (ja) | 2008-10-09 | 2013-07-03 | 株式会社ニコン | 焦点検出装置および撮像装置 |
JP5662667B2 (ja) | 2009-10-08 | 2015-02-04 | キヤノン株式会社 | 撮像装置 |
JP5563283B2 (ja) * | 2009-12-09 | 2014-07-30 | キヤノン株式会社 | 画像処理装置 |
JP2011197080A (ja) * | 2010-03-17 | 2011-10-06 | Olympus Corp | 撮像装置及びカメラ |
WO2011136026A1 (ja) * | 2010-04-30 | 2011-11-03 | 富士フイルム株式会社 | 撮像装置及び撮像方法 |
JP5642433B2 (ja) | 2010-06-15 | 2014-12-17 | 富士フイルム株式会社 | 撮像装置及び画像処理方法 |
JP5670103B2 (ja) * | 2010-06-15 | 2015-02-18 | 山陽特殊製鋼株式会社 | 高強度オーステナイト系耐熱鋼 |
US20120019704A1 (en) * | 2010-07-26 | 2012-01-26 | Levey Charles I | Automatic digital camera photography mode selection |
US8970720B2 (en) * | 2010-07-26 | 2015-03-03 | Apple Inc. | Automatic digital camera photography mode selection |
JP2013013006A (ja) * | 2011-06-30 | 2013-01-17 | Nikon Corp | 撮像装置、撮像装置の製造方法、画像処理装置、及びプログラム、並びに記録媒体 |
JP5784395B2 (ja) * | 2011-07-13 | 2015-09-24 | オリンパス株式会社 | 撮像装置 |
JP5739282B2 (ja) | 2011-08-30 | 2015-06-24 | シャープ株式会社 | 冷蔵庫 |
US8593564B2 (en) * | 2011-09-22 | 2013-11-26 | Apple Inc. | Digital camera including refocusable imaging mode adaptor |
JP5657182B2 (ja) * | 2012-05-10 | 2015-01-21 | 富士フイルム株式会社 | 撮像装置及び信号補正方法 |
JP2016005050A (ja) * | 2014-06-16 | 2016-01-12 | キヤノン株式会社 | レンズ装置および撮像装置 |
JP6600170B2 (ja) * | 2014-07-07 | 2019-10-30 | キヤノン株式会社 | 撮像素子及びその制御方法並びに撮像装置 |
-
2013
- 2013-12-20 DE DE112013006817.6T patent/DE112013006817B4/de active Active
- 2013-12-20 CN CN201380074658.0A patent/CN105008976B/zh active Active
- 2013-12-20 JP JP2015505240A patent/JP5866478B2/ja active Active
- 2013-12-20 WO PCT/JP2013/084314 patent/WO2014141561A1/ja active Application Filing
-
2015
- 2015-09-14 US US14/853,397 patent/US9491352B2/en active Active
- 2015-12-28 JP JP2015256385A patent/JP6031587B2/ja active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009008928A (ja) * | 2007-06-28 | 2009-01-15 | Olympus Corp | 撮像装置及び画像信号処理プログラム |
JP2010039759A (ja) * | 2008-08-05 | 2010-02-18 | Canon Inc | 撮像装置 |
JP2011257565A (ja) * | 2010-06-08 | 2011-12-22 | Canon Inc | 撮像装置 |
JP2014036262A (ja) * | 2012-08-07 | 2014-02-24 | Nikon Corp | 撮像装置 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017170724A1 (ja) * | 2016-03-31 | 2017-10-05 | 株式会社ニコン | 撮像装置、レンズ調節装置、および電子機器 |
WO2019065555A1 (ja) * | 2017-09-28 | 2019-04-04 | 富士フイルム株式会社 | 撮像装置、情報取得方法及び情報取得プログラム |
US11201999B2 (en) | 2017-09-28 | 2021-12-14 | Fujifilm Corporation | Imaging device, information acquisition method, and information acquisition program |
Also Published As
Publication number | Publication date |
---|---|
CN105008976B (zh) | 2017-07-11 |
JP5866478B2 (ja) | 2016-02-17 |
DE112013006817T5 (de) | 2015-12-10 |
JPWO2014141561A1 (ja) | 2017-02-16 |
US20160014327A1 (en) | 2016-01-14 |
CN105008976A (zh) | 2015-10-28 |
US9491352B2 (en) | 2016-11-08 |
DE112013006817B4 (de) | 2022-07-14 |
JP6031587B2 (ja) | 2016-11-24 |
JP2016076998A (ja) | 2016-05-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6031587B2 (ja) | 撮像装置、信号処理方法、信号処理プログラム | |
JP5775976B2 (ja) | 撮像装置、デフォーカス量演算方法、及び撮像光学系 | |
JP5542249B2 (ja) | 撮像素子及びこれを用いた撮像装置及び撮像方法 | |
JP5697801B2 (ja) | 撮像装置及び自動焦点調節方法 | |
JP5799178B2 (ja) | 撮像装置及び合焦制御方法 | |
JP5779726B2 (ja) | 撮像装置及び露出決定方法 | |
US9485410B2 (en) | Image processing device, imaging device, image processing method and computer readable medium | |
WO2014091854A1 (ja) | 画像処理装置、撮像装置、画像処理方法、及び画像処理プログラム | |
JP6307526B2 (ja) | 撮像装置及び合焦制御方法 | |
JP5677625B2 (ja) | 信号処理装置、撮像装置、信号補正方法 | |
WO2014106935A1 (ja) | 画像処理装置、撮像装置、プログラム及び画像処理方法 | |
US20190335110A1 (en) | Imaging element and imaging apparatus | |
JP5542248B2 (ja) | 撮像素子及び撮像装置 | |
JP5749409B2 (ja) | 撮像装置、画像処理方法及びプログラム | |
JP5807125B2 (ja) | 撮像装置及びその合焦制御方法 | |
US9712752B2 (en) | Image processing device, imaging device, image processing method, and computer readable medium, capable of improving visibility of a boundary line between a first display image and a second display image for use in focus-checking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13877877 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015505240 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120130068176 Country of ref document: DE Ref document number: 112013006817 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13877877 Country of ref document: EP Kind code of ref document: A1 |