US20190215441A1 - Focus detection device and image-capturing apparatus - Google Patents
- Publication number
- US20190215441A1 (U.S. application Ser. No. 16/355,000)
- Authority
- US
- United States
- Prior art keywords
- focus detection
- pair
- signal sequences
- signals
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H04N5/23212—
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/34—Systems for automatic generation of focusing signals using different areas in a pupil plane
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/34—Systems for automatic generation of focusing signals using different areas in a pupil plane
- G02B7/346—Systems for automatic generation of focusing signals using different areas in a pupil plane using horizontal and vertical areas in the pupil plane, i.e. wide area autofocusing
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
- G02B7/365—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals by analysis of the spatial frequency components of the image
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/672—Focus control based on electronic image sensor signals based on the phase difference signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/703—SSIS architectures incorporating pixels for producing signals other than image signals
- H04N25/704—Pixels specially adapted for focusing, e.g. phase difference pixel sets
-
- H04N5/2254—
-
- H04N5/2353—
-
- H04N5/3696—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/635—Region indicators; Field of view indicators
Definitions
- the present invention relates to a focus detection device and an image-capturing apparatus.
- the focus detection pixel row is divided into a plurality of blocks.
- a focus detection error may occur because boundaries between the blocks do not always correspond to boundaries between distant subjects and near subjects present in the same photographic screen.
- a focus detection device comprises: a focus detection sensor receiving a pair of light fluxes that has passed through a pair of pupil regions of an optical system and outputting a pair of focus detection signal sequences, each focus detection signal sequence being made of a plurality of focus detection signals; a difference calculation unit obtaining a plurality of differences by sequentially calculating differences between focus detection signals corresponding to each other in the pair of focus detection signal sequences; a division unit dividing the pair of focus detection signal sequences into at least two pairs of partial signal sequences, the at least two pairs of partial signal sequences including a first pair of partial signal sequences and a second pair of partial signal sequences, based on the plurality of differences obtained by the difference calculation unit; a focus detection parameter calculation unit calculating a first focus detection parameter in accordance with a phase difference amount of the first pair of partial signal sequences and a second focus detection parameter in accordance with a phase difference amount of the second pair of partial signal sequences; and a focus adjustment parameter determination unit determining either the first focus detection parameter or the second focus detection parameter as a focus adjustment parameter used for a focus adjustment.
- the focus detection parameter calculation unit calculates a minimum value of a correlation amount of the pair of focus detection signal sequences while shifting the pair of focus detection signal sequences relative to each other by a predetermined shift amount, and then calculates another focus detection parameter based on a specific shift amount of the pair of the focus detection signal sequences, the specific shift amount providing the minimum value; and the difference calculation unit obtains the plurality of differences in the pair of focus detection signal sequences in a state where the pair of focus detection signal sequences has been shifted relative to each other by the specific shift amount.
- if the minimum value is equal to or higher than a predetermined value, the focus adjustment parameter determination unit determines either the first focus detection parameter or the second focus detection parameter as the focus adjustment parameter; and, if the minimum value is lower than the predetermined value, the focus adjustment parameter determination unit determines the another focus detection parameter as the focus adjustment parameter.
- the focus adjustment parameter determination unit determines either the first focus detection parameter or the second focus detection parameter as the focus adjustment parameter, if brightness of a subject image formed by the optical system is not lower than a predetermined brightness; and the focus adjustment parameter determination unit determines the another focus detection parameter as the focus adjustment parameter, if the brightness of the subject image is lower than the predetermined brightness.
- the division unit divides the pair of focus detection signal sequences into the first pair of partial signal sequences and the second pair of partial signal sequences, depending on whether or not each of the plurality of differences obtained by the difference calculation unit is equal to or higher than an average value of the plurality of differences.
- the division unit divides the pair of focus detection signal sequences into the first pair of partial signal sequences and the second pair of partial signal sequences, depending on whether or not a magnitude of a difference between adjacent differences in the plurality of differences obtained by the difference calculation unit is smaller than a predetermined value.
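The two division criteria above (comparison of each difference against the average, and detection of a large jump between adjacent differences) can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the function name and the run-based output format are assumptions. It splits the index range of a pair of already-aligned signal sequences into contiguous runs of "low-difference" and "high-difference" positions, using the average-value criterion:

```python
def divide_by_difference(a, b):
    """Split index range 0..n-1 into contiguous runs, where a run is 'high'
    (True) if the absolute difference at its positions is equal to or higher
    than the average absolute difference, and 'low' (False) otherwise.

    Returns a list of (start_index, end_index, is_high) tuples."""
    diffs = [abs(x - y) for x, y in zip(a, b)]
    mean = sum(diffs) / len(diffs)
    runs, start = [], 0
    for i in range(1, len(diffs) + 1):
        # close the current run when the sequence ends or the flag flips
        if i == len(diffs) or (diffs[i] >= mean) != (diffs[start] >= mean):
            runs.append((start, i - 1, diffs[start] >= mean))
            start = i
    return runs
```

Each returned run corresponds to one pair of partial signal sequences; the adjacent-difference criterion of the following paragraph could be implemented the same way by flipping the run flag whenever `abs(diffs[i] - diffs[i-1])` exceeds a predetermined value.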
- the division unit divides the pair of focus detection signal sequences into the first pair of partial signal sequences, the second pair of partial signal sequences, and a third pair of partial signal sequences, based on the plurality of differences obtained by the difference calculation unit; if the pair of focus detection signal sequences is divided by the division unit into the first pair of partial signal sequences, the second pair of partial signal sequences, and the third pair of partial signal sequences, the focus detection parameter calculation unit calculates a third focus detection parameter in accordance with a phase difference amount of the third pair of partial signal sequences, in addition to calculating the first focus detection parameter and the second focus detection parameter; and the focus adjustment parameter determination unit determines one of the first focus detection parameter, the second focus detection parameter, and the third focus detection parameter as the focus adjustment parameter.
- the focus detection parameter calculation unit calculates a first defocus amount and a second defocus amount respectively as the first focus detection parameter and the second focus detection parameter; and the focus adjustment parameter determination unit determines a nearer-side defocus amount of the first defocus amount and the second defocus amount, as the focus adjustment parameter.
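The determination logic described in the preceding bullets can be combined into one small sketch. All identifiers are illustrative, and the sign convention is an assumption not stated numerically in the text: a larger defocus value is taken to correspond to a nearer subject.

```python
def determine_focus_adjustment(c_min, threshold, d_whole, d_first, d_second):
    """Sketch of the focus adjustment parameter determination unit."""
    # If the whole pair of sequences correlates well (small minimum
    # correlation amount), a single subject is assumed and the
    # whole-sequence defocus amount is used.
    if c_min < threshold:
        return d_whole
    # Otherwise the sequences were divided; choose the nearer-side defocus
    # amount (ASSUMED here to be the larger value).
    return max(d_first, d_second)
```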
- the focus detection device further comprises: an image sensor receiving light fluxes that have passed through the optical system, via a microlens array, and outputting image-capturing signals.
- the focus detection sensor is provided independently of the image sensor or included in the image sensor; and, if the focus detection sensor is provided independently of the image sensor, the pair of light fluxes is received by the focus detection sensor after passing through the pair of pupil regions and then passing through a microlens array or an image reforming optical system.
- an image-capturing apparatus comprises: the focus detection device according to the ninth aspect; the optical system; a focus adjustment unit performing the focus adjustment based on the focus adjustment parameter determined by the focus adjustment parameter determination unit; and an image generation unit generating an image based on the image-capturing signals output by the image sensor when the optical system is focused on the light receiving surface of the image sensor by the focus adjustment.
- a focus adjustment can be performed after properly dividing a focus detection signal sequence in accordance with the distances of the subjects from the apparatus, taking the circumstances of the subjects into consideration.
- FIG. 1 is a view illustrating a configuration of an image-capturing apparatus having a focus detection device in one embodiment of the present invention.
- FIG. 2 is a view illustrating a focus detection sensor and a microlens array which covers the focus detection sensor.
- FIG. 3 is a view illustrating a relationship between a plurality of focus detection pixels and a microlens.
- FIG. 4 is a flowchart of a focus detection process performed by a controller.
- FIG. 5 is a view illustrating one example in which two subject images are included in a focus detection area.
- FIG. 6 is a flowchart of a defocus amount determination process performed by the controller.
- FIG. 7 is a graph illustrating variations of focus detection signal values of a pair of focus detection signal sequences with respect to focus detection pixel positions in the focus detection area.
- FIG. 8 is a graph illustrating a state where the pair of focus detection signal sequences are shifted relative to each other by a specific shift amount which provides the minimum value of a correlation amount of the pair of the focus detection signal sequences.
- FIG. 9 is a graph explaining a division process for the pair of focus detection signal sequences.
- FIG. 10 is a graph illustrating variations of focus detection signal values in a state where the correlation amount in the pair of partial signal sequences is the minimum value.
- FIG. 11 is a graph illustrating variations of focus detection signal values in a state where the correlation amount in the pair of partial signal sequences is the minimum value.
- FIG. 12 is a flowchart of an image-capturing process performed by the controller.
- FIG. 13 is a view illustrating one example in which three subject images are included in a focus detection area.
- FIG. 14 is a graph illustrating variations of focus detection signal values of a pair of focus detection signal sequences with respect to focus detection pixel positions in the focus detection area.
- FIG. 15 is a graph illustrating a state where the pair of focus detection signal sequences are shifted relative to each other by a specific shift amount which provides the minimum value of a correlation amount of the pair of the focus detection signal sequences.
- FIG. 16 is a graph explaining a division process for the pair of focus detection signal sequences.
- FIG. 17 is a graph illustrating variations of focus detection signal values in a state where the correlation amount in the pair of partial signal sequences is the minimum value.
- FIG. 18 is a graph illustrating variations of focus detection signal values in a state where the correlation amount in the pair of partial signal sequences is the minimum value.
- FIG. 19 is a graph illustrating variations of focus detection signal values in a state where the correlation amount in the pair of partial signal sequences is the minimum value.
- FIG. 20 is a view illustrating a configuration of an image-capturing apparatus having another focus detection device.
- FIG. 21 is a view illustrating a configuration of an image-capturing apparatus having another focus detection device.
- FIG. 22 is a graph illustrating differences between absolute values
- FIG. 1 is a view illustrating a configuration of an image-capturing apparatus 100 including a focus detection device 50 according to this embodiment.
- the image-capturing apparatus 100 includes the focus detection device 50 , a liquid crystal display element 1 , an image sensor 2 , a photographing optical system 4 , a lens drive motor 5 , a half mirror 7 , a focus adjustment device 8 , and a storage device 15 .
- the focus detection device 50 includes a focus detection sensor 6 , a microlens array 9 , and a controller 3 .
- the photographing optical system 4 is an optical system for forming subject images on a focal plane.
- the photographing optical system 4 includes a plurality of lenses and diaphragms. Among the plurality of lenses, a focus adjustment lens can be moved by the lens drive motor 5 in a direction of an optical axis 10 of the photographing optical system 4 .
- the half mirror 7 is a thin mirror, such as a pellicle mirror, and is located in an optical path along the optical axis 10 as illustrated in FIG. 1 .
- Incident light fluxes pass through the photographing optical system 4 and then some of the incident light fluxes are reflected from the half mirror 7 in a direction of an optical axis 10 a , i.e., toward the microlens array 9 , while the rest of the incident light fluxes, which are not reflected, transmit through the half mirror 7 .
- the light fluxes that have been reflected from the half mirror 7 transmit through the microlens array 9 made of a plurality of microlenses arranged in two dimensions and the light fluxes are then incident on the focus detection sensor 6 .
- the light fluxes that have transmitted through the half mirror 7 , referred to as transmitted light fluxes, are incident on the image sensor 2 .
- the microlens array 9 is arranged on the image-forming plane of the photographing optical system 4 .
- the position of the microlens array 9 is equivalent to the position of an image-capturing plane of the image sensor 2 .
- the focus detection sensor 6 has a plurality of focus detection pixels arranged therein, the focus detection pixels generating electrical focus detection signals in accordance with the light fluxes received by the pixels.
- a pair of focus detection pixel groups made of a part of focus detection pixels in a focus detection area receives a pair of light fluxes among the light fluxes incident on the focus detection sensor 6 through the microlens array 9 and performs a photoelectric conversion process, so that a pair of electrical focus detection signal sequences corresponding to the subject image is generated. Details thereof will be described later with reference to FIG. 3 .
- the focus detection area may be displayed on a screen in such a manner that the focus detection area is superimposed on a through image, when the liquid crystal display element 1 displays the through image on the screen.
- the through image is based on a plurality of image-capturing signals, described later, output by the image sensor 2 .
- a plurality of focus detection areas may be displayed on the screen of the liquid crystal display element 1 so that a user can specify one of the plurality of focus detection areas displayed on the screen while viewing the screen of the liquid crystal display element 1 .
- the controller 3 performs an exposure control of the plurality of focus detection pixels, a read-out control of the plurality of focus detection signals, and/or an amplification control of the plurality of focus detection signals that have been read out, for example, as a photoelectric conversion control of the focus detection sensor 6 .
- the pair of focus detection signal sequences generated by the focus detection sensor 6 is output to the controller 3 .
- the controller 3 detects a focus of the photographing optical system 4 by the use of the split-pupil phase detection method, on the basis of the pair of focus detection signal sequences output by the focus detection sensor 6 .
- the controller 3 detects a phase difference amount of the pair of focus detection signal sequences, as a focus detection parameter that is obtained from the focus detection.
- the controller 3 calculates a defocus amount as a focus detection parameter in accordance with the phase difference amount.
- the controller 3 determines a focus adjustment parameter on the basis of the phase difference amount or the defocus amount and then calculates, on the basis of the determined focus adjustment parameter, a lens drive amount for the focus adjustment lens of the photographing optical system 4 to send the lens drive amount to the focus adjustment device 8 .
- After receiving the lens drive amount, the focus adjustment device 8 drives the focus adjustment lens of the photographing optical system 4 by the lens drive amount, via the lens drive motor 5 . Details of the focus detection process performed by the controller 3 will be described later with reference to FIGS. 4 and 6 .
- the half mirror 7 is swung up to cover the focus detection sensor 6 and is thus brought out of the optical path.
- all of the incident light fluxes that have passed through the photographing optical system 4 are incident on the image sensor 2 to form the subject image on a light receiving surface of the image sensor 2 .
- the image sensor 2 has a plurality of image-capturing pixels arranged in two dimensions and the plurality of image-capturing pixels receives the incident light fluxes and performs the photoelectric conversion to generate the plurality of electrical image-capturing signals corresponding to the subject image formed by the photographing optical system 4 .
- the plurality of image-capturing signals generated here is output by the image sensor 2 .
- the controller 3 generates an image on the basis of the plurality of image-capturing signals output by the image sensor 2 and then causes the liquid crystal display element 1 to display the generated image as the through image.
- the controller 3 also records the generated image in the storage device 15 when executing the image-capturing process in response to an image-capturing command from the user. Details of the image-capturing process performed by the controller 3 will be described later with reference to FIG. 12 .
- FIG. 2 is a view illustrating the focus detection sensor 6 and the microlens array 9 which covers the focus detection sensor 6 .
- FIG. 2( a ) illustrates an enlarged view of the focus detection sensor 6 and the microlens array 9 in the vicinity of the optical axis 10 a illustrated in FIG. 1 .
- the focus detection sensor 6 has the plurality of focus detection pixels 60 arranged in two dimensions.
- the microlens array 9 has the plurality of microlenses 90 arranged in two dimensions (in a honeycomb-like array) with a pitch of 100 µm or less.
- Although the shape of the microlens 90 illustrated in the figure is spherical, the shape may be hexagonal to match the honeycomb-like array.
- FIG. 2( b ) is a view as seen from directly above the microlens array 9 , wherein the microlens array 9 and the focus detection sensor 6 located behind the microlens array 9 are illustrated overlapping each other.
- a plurality of focus detection pixels 60 , which are here 5 vertical by 5 horizontal pixels, corresponds to each microlens 90 .
- a part of the incident light fluxes having passed through the photographing optical system 4 illustrated in FIG. 1 is reflected from the half mirror 7 as the reflected light fluxes, and then the reflected light fluxes arrive at and transmit through the microlens array 9 and are incident on the focus detection sensor 6 .
- the light flux having transmitted through each microlens 90 is received by the plurality of focus detection pixels 60 corresponding to each microlens 90 , which is here a total of 25 pixels constituted with 5 vertical by 5 horizontal pixels, so that the light flux is converted into the electrical focus detection signal by means of the photoelectric conversion.
- the plurality of focus detection pixels 60 corresponding to each microlens 90 is not limited to the total of 25 pixels constituted with 5 vertical by 5 horizontal pixels.
- FIG. 3 is a view illustrating a relationship between the plurality of focus detection pixels 60 and microlenses 90 .
- FIGS. 3( a ) and 3( b ) are plan views of the plurality of focus detection pixels 60 and microlenses 90 .
- the plurality of focus detection pixels 60 corresponding to each microlens 90 is the total of 25 pixels constituted with 5 vertical by 5 horizontal pixels.
- a pair of focus detection pixel groups described above is specified as follows. For the total of 25 focus detection pixels 60 constituted with 5 vertical by 5 horizontal focus detection pixels corresponding to each microlens 90 exemplified in FIG. 3( a ) , three center focus detection pixels are illustrated with hatching among the five focus detection pixels included in each of the two vertical columns of focus detection pixels located on both ends in the horizontal direction.
- the pixels with hatching form a pair of focus detection pixel groups 610 a and 610 b.
- three center focus detection pixels are illustrated with hatching among five focus detection pixels included in each of a vertical column of focus detection pixels located on the left end in the figure in the horizontal direction and an adjacent vertical column of focus detection pixels.
- the pixels with hatching form one focus detection pixel group 620 a of a pair of focus detection pixel groups 620 a and 620 b .
- Three center focus detection pixels are illustrated with hatching among five focus detection pixels included in each of a vertical column of focus detection pixels located on the right end in the figure in the horizontal direction and an adjacent vertical column of focus detection pixels.
- each of the pair of focus detection pixel groups 620 a and 620 b includes a total of six focus detection pixels: two columns adjacently arranged in the horizontal direction, each column including three focus detection pixels arranged in the vertical direction.
- FIGS. 3( c ) and 3( d ) respectively are cross-sectional views taken along dashed-dotted lines S 1 and S 2 of FIGS. 3( a ) and 3( b ) illustrating the plan views of sets of the total of 25 focus detection pixels 60 constituted with 5 vertical by 5 horizontal focus detection pixels and the microlens 90 , wherein each dashed-dotted line S 1 , S 2 extends in the horizontal direction through the focus detection pixel that is located in the center of the 25 focus detection pixels 60 .
- the pair of focus detection pixel groups 610 a and 610 b receives a pair of light fluxes 11 and 12 having passed through a pair of pupil regions of the photographing optical system 4 and through the microlens 90 , and generates the pair of electrical focus detection signals by means of the photoelectric conversion.
- FIG. 3( a ) illustrates five sets of the 25 focus detection pixels 60 and the microlens 90 .
- a focus detection signal sequence including five focus detection signals generated by five focus detection pixel groups 610 a and a focus detection signal sequence including five focus detection signals generated by five focus detection pixel groups 610 b are obtained.
- the two focus detection signal sequences form a pair of focus detection signal sequences.
- the pair of focus detection pixel groups 620 a and 620 b receives a pair of light fluxes 13 and 14 having passed through the pair of pupil regions of the photographing optical system 4 and through the microlens 90 , and generates the pair of electrical focus detection signals by means of the photoelectric conversion.
- FIG. 3( b ) illustrates five sets of the 25 focus detection pixels 60 and the microlens 90 .
- a focus detection signal sequence including five focus detection signals generated by five focus detection pixel groups 620 a and a focus detection signal sequence including five focus detection signals generated by five focus detection pixel groups 620 b are obtained.
- the two focus detection signal sequences form a pair of focus detection signal sequences.
- the focus adjustment of the photographing optical system 4 can be performed on the basis of a phase difference between the pair of focus detection signal sequences obtained in this way, or on the basis of a defocus amount calculated from the phase difference. It should be noted that a distance between the pair of focus detection pixel groups 610 a and 610 b illustrated in FIG. 3( c ) is larger than a distance between the pair of focus detection pixel groups 620 a and 620 b illustrated in FIG. 3( d ) . Therefore, an aperture angle formed by the pair of light fluxes 11 and 12 illustrated in FIG. 3( c ) is larger than an aperture angle formed by the pair of light fluxes 13 and 14 illustrated in FIG. 3( d ) .
- Although the present invention can be applied to both cases, it is preferably applied to a configuration having a large aperture angle as illustrated in FIG. 3( c ) . This is because, as the aperture angle increases, it becomes easier to detect a difference between changes in focus detection signal values caused by a scene including distant-subject images and near-subject images, as described later.
- FIG. 4 is a flowchart of the focus detection process performed by the controller 3 .
- the controller 3 is a computer including a CPU and a memory, for example.
- the CPU executes a computer program stored in the memory to perform process steps constituting the focus detection process illustrated in FIG. 4 .
- the photographic screen 250 includes two subject images formed by the photographing optical system 4 : a subject image 210 of a background including trees and a subject image 220 of a person.
- a focus detection area 200 is also displayed in the photographic screen 250 .
- the subject image 210 of the background including trees located far from the image-capturing apparatus 100 and the subject image 220 of the person located near the image-capturing apparatus 100 are included also in the focus detection area 200 .
- Although typically a plurality of focus detection areas 200 is displayed on the photographic screen 250 , only the one focus detection area 200 specified by a user in step S 101 in FIG. 4 described later is illustrated in the photographic screen 250 in FIG. 5 .
- the controller 3 makes a decision as to whether or not a focus detection area 200 is specified, in step S 101 . If No, the process step in step S 101 is repeated until the decision result is Yes. If Yes, the controller 3 causes the process to proceed to step S 102 , with the specified focus detection area 200 being a target of the process.
- the operating member described above may be an automatic focus detection activation switch, for example, and the process may be started by turning on the automatic focus detection activation switch. Alternatively, the operating member may be a shutter release button and the process may be started by setting the shutter release button in a halfway-press state.
- In step S 102 , the controller 3 performs a photoelectric conversion control of the focus detection sensor 6 .
- the photoelectric conversion control of the focus detection sensor 6 includes an exposure control of the plurality of focus detection pixels 60 arranged in the focus detection sensor 6 , a read-out control of the plurality of focus detection signals, and/or an amplification control of the plurality of focus detection signals that have been read out, for example.
- In step S 103 , the controller 3 obtains a pair of focus detection signal sequences on the basis of the plurality of focus detection signals which have been read out in step S 102 .
- In step S 104 , the controller 3 performs a defocus amount determination process to determine a focus adjustment defocus amount as a defocus amount for focus adjustment. Details of the defocus amount determination process will be described later with reference to FIG. 6 .
- In step S 105 , the controller 3 makes a decision as to whether or not the photographing optical system 4 is located at a focus position, depending on whether or not the focus adjustment defocus amount determined in step S 104 is approximately zero. If Yes, the process ends. If No, the process proceeds to step S 106 . It is also possible that the controller 3 makes a decision as to the reliability of the focus adjustment defocus amount determined in step S 104 , and a scan operation is performed if it is determined that the focus adjustment defocus amount is unreliable and the focus detection is impossible. If the subject image is no longer present within the focus detection area 200 after the start of the focus adjustment lens drive, the focus adjustment lens may be driven on the basis of the focus adjustment defocus amount that was most recently detected, before the process ends.
- In step S 106, the controller 3 calculates the lens drive amount for the photographing optical system 4 on the basis of the focus adjustment defocus amount determined in step S 104.
- In step S 107, the controller 3 sends the lens drive amount calculated in step S 106 to the focus adjustment device 8 and controls the focus adjustment device 8 so that the focus adjustment device 8 drives the lens of the photographing optical system 4 via the lens drive motor 5.
- Upon completion of the process in step S 107, the process returns to step S 101.
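- The loop of steps S 101 to S 107 can be sketched in code as follows. This is a minimal illustration, not part of the disclosure: `read_signal_sequences`, `determine_defocus`, and `drive_lens` are hypothetical stand-ins for the sensor read-out (steps S 102 to S 103), the defocus amount determination process of FIG. 6 (step S 104), and the focus adjustment device 8 (steps S 106 to S 107), and the tolerance for "approximately zero" is an assumed value.

```python
# Sketch of the focus detection loop (steps S101-S107).
# All helper callables and the tolerance are assumptions, not from the patent.

FOCUS_TOLERANCE = 0.01  # threshold for "approximately zero" (assumed)

def focus_loop(read_signal_sequences, determine_defocus, drive_lens,
               max_iterations=10):
    """Repeat read-out -> defocus determination -> lens drive until the
    focus adjustment defocus amount is approximately zero (step S105)."""
    defocus = None
    for _ in range(max_iterations):
        a, b = read_signal_sequences()      # steps S102-S103: pair {a[i]}, {b[j]}
        defocus = determine_defocus(a, b)   # step S104: FIG. 6 process
        if abs(defocus) < FOCUS_TOLERANCE:  # step S105: in focus, end
            break
        drive_lens(defocus)                 # steps S106-S107: drive the lens
    return defocus
```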
- FIG. 6 is a flowchart detailing the defocus amount determination process performed by the controller 3 in step S 104 in FIG. 4 .
- The pair of focus detection signal sequences obtained in step S 103 in FIG. 4 will be denoted by {a[i]} and {b[j]}.
- An initial value of a relative phase shift amount k between the pair of focus detection signal sequences {a[i]} and {b[j]} is 0.
- As a magnitude of the phase shift amount k becomes closer to the phase difference amount of the pair of focus detection signal sequences {a[i]} and {b[j]}, a correlation amount C(k) of the pair of focus detection signal sequences {a[i]} and {b[j]}, expressed by the following equation (1), approaches its minimum value:
- C(k) = Σ |a[i+k] - b[i]| (1)
- The summation of the right-hand side of the equation (1) is repeated a number of times equal to the number of signals in the pair of focus detection signal sequences {a[i]} and {b[j]}.
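- The correlation computation of step S 201 can be sketched as a sum of absolute differences scanned over candidate shift amounts. The SAD form and the handling of the sequence ends are assumptions made for this sketch.

```python
def correlation(a, b, k):
    """Correlation amount C(k): sum of absolute differences between the
    pair of signal sequences shifted relative to each other by k.
    The windowing at the sequence ends is an assumption."""
    lo, hi = max(0, -k), min(len(b), len(a) - k)
    return sum(abs(a[i + k] - b[i]) for i in range(lo, hi))

def minimum_shift(a, b, max_shift=5):
    """Step S201: scan the shift amount k and return the specific shift
    amount X0 giving the minimum value C(k)_min, together with C(k)_min."""
    shifts = range(-max_shift, max_shift + 1)
    x0 = min(shifts, key=lambda k: correlation(a, b, k))
    return x0, correlation(a, b, x0)
```

- For instance, if {b[j]} equals {a[i]} displaced by two pixels, the scan returns X 0 = 2 with C(k)_min = 0.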
- In step S 201, the controller 3 determines the minimum value C(k)_min of the correlation amount C(k) by sequentially calculating the correlation amount C(k) while shifting phases of the pair of focus detection signal sequences {a[i]} and {b[j]} obtained in step S 103 in FIG. 4 relative to each other by a predetermined shift amount for each calculation.
- The controller 3 then obtains a specific shift amount X 0 of the pair of focus detection signal sequences {a[i]} and {b[j]} that provides the minimum value C(k)_min, and calculates a defocus amount D 0, a type of focus detection parameter, on the basis of the specific shift amount X 0.
- In step S 202, the controller 3 makes a decision as to whether or not the minimum value C(k)_min of the correlation amount C(k) determined in step S 201 is smaller than a predetermined threshold C(k)_th. If Yes, the controller 3 causes the process to proceed to step S 208.
- If the pair of focus detection signal sequences {a[i]} and {b[j]} are not identical to each other over the whole focus detection area and there are partial sections where they are not identical (as described later), the decision result is No in step S 202 and the controller 3 causes the process to proceed to step S 203.
- In step S 203, the controller 3 makes a decision as to whether or not a brightness of the whole subject image in the focus detection area 200 specified in step S 101 in FIG. 4, including the subject image 210 of the background including trees and the subject image 220 of the person, is lower than a predetermined brightness. If Yes, i.e., if the brightness of the subject image is lower than the predetermined brightness, it is likely that the amplification control was performed with a large amplification degree in step S 102 in FIG. 4. The amplification control with a large amplification degree causes noise superimposed on the focus detection signals to be amplified.
- If the decision result is Yes in step S 203, the controller 3 causes the process to proceed to step S 208. If the decision result is No in step S 203, the controller 3 causes the process to proceed to step S 204.
- A magnitude of the amplification degree of the amplification control performed in step S 102 in FIG. 4 may be used as a brightness decision index in step S 203 in FIG. 6, for example. If the amplification degree is lower than the predetermined value, the controller 3 determines that the brightness of the overall subject image is not lower than the predetermined brightness; in other words, the decision result is No in step S 203.
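- The decision of step S 203 using the amplification degree as a brightness index can be sketched as follows; the threshold value is an assumption, since the text only calls it a predetermined value.

```python
GAIN_THRESHOLD = 8.0  # "predetermined value" for the amplification degree (assumed)

def brightness_is_low(amplification_degree, threshold=GAIN_THRESHOLD):
    """Step S203: if the read-out gain of step S102 is below the
    predetermined value, the subject is judged not darker than the
    predetermined brightness (decision result No); otherwise Yes."""
    return amplification_degree >= threshold
```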
- In step S 204, the controller 3 sequentially calculates absolute values of differences between signals at corresponding positions in the pair of focus detection signal sequences {a[i]} and {b[j]} that have been shifted relative to each other by the specific shift amount X 0 obtained in step S 201.
- In step S 205, the controller 3 divides the pair of focus detection signal sequences {a[i]} and {b[j]} that have been shifted relative to each other by the specific shift amount X 0 obtained in step S 201 into two pairs of partial signal sequences: a pair of partial signal sequences corresponding to the distant subject image (the subject image 210 of the background including trees) and a pair of partial signal sequences corresponding to the near subject image (the subject image 220 of the person).
- The controller 3 performs the division process in step S 205 depending on whether or not each of the plurality of differences obtained from the sequential calculation of the absolute values in step S 204 is lower than the average value of the plurality of differences.
- In step S 206, the controller 3 calculates a phase difference amount between partial signal sequences in each of the two pairs of partial signal sequences obtained in step S 205.
- Two phase difference amounts X 1 and X 2 calculated in this way, each corresponding to a respective one of the two pairs of partial signal sequences, are a type of focus detection parameter.
- Process steps subsequent to step S 207 may therefore be performed on the basis of the two phase difference amounts X 1 and X 2.
- the controller 3 further calculates two defocus amounts D 1 and D 2 on the basis of the two phase difference amounts X 1 and X 2 in this embodiment.
- the two defocus amounts D 1 and D 2 are also a type of focus detection parameter.
- In step S 207, the controller 3 determines the nearer-side defocus amount of the two defocus amounts D 1 and D 2 calculated in step S 206 as the focus adjustment defocus amount.
- the nearer-side defocus amount is determined on the basis of the fact that a focus position of the photographing optical system 4 for the nearest subject to the image-capturing apparatus 100 is located at the farthest position from the photographing optical system 4 .
- The defocus amount corresponding to the nearer subject image (the subject image 220 of the person) of the two defocus amounts D 1 and D 2 calculated in step S 206 is the nearer-side defocus amount.
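- The selection in step S 207 can be sketched as below. The sign convention, that the defocus amount of the nearer subject is the larger of the two, is an assumption made only for this illustration; the text defines "nearer-side" through the focus position of the nearest subject, not through a numerical sign.

```python
def select_focus_adjustment_defocus(d1, d2):
    """Step S207: adopt the nearer-side defocus amount of D1 and D2.
    Assumes (for illustration only) that a larger signed defocus amount
    corresponds to the nearer subject image."""
    return max(d1, d2)
```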
- In step S 208, which is performed if the decision result is Yes in step S 202 or S 203, the controller 3 determines the defocus amount D 0, calculated in step S 201 on the basis of the specific shift amount X 0 of the pair of focus detection signal sequences {a[i]} and {b[j]}, as the focus adjustment defocus amount.
- Upon completion of step S 208, the process in FIG. 6 ends and the controller 3 causes the focus detection process in FIG. 4 to proceed to step S 105.
- FIG. 7 corresponds to the exemplary photographic screen 250 illustrated in FIG. 5 and is a graph illustrating variations of the focus detection signal values of the pair of focus detection signal sequences {a[i]} and {b[j]} with respect to focus detection pixel positions in the focus detection area 200 having a length of approximately 50 pixels in the horizontal direction.
- The pair of focus detection signal sequences {a[i]} and {b[j]} illustrated in FIG. 7, namely the pair of focus detection signal sequences 655 a and 655 b, corresponds to the pair of focus detection signal sequences obtained in step S 103 in FIG. 4.
- a section 310 of focus detection pixel positions 1 to 13 in the horizontal direction in the focus detection area 200 corresponds to the subject image 220 of the person located near the image-capturing apparatus 100 illustrated in FIG. 5 .
- a phase of one focus detection signal sequence 655 a of the pair of focus detection signal sequences 655 a and 655 b is ahead of a phase of the other focus detection signal sequence 655 b .
- a section 320 of focus detection pixel positions 14 to 46 in the horizontal direction in the focus detection area 200 corresponds to the subject image 210 of the background including trees located far from the image-capturing apparatus 100 illustrated in FIG. 5 .
- FIG. 8 illustrates a pair of focus detection signal sequences 660 a and 660 b, which is the pair of focus detection signal sequences {a[i]} and {b[j]} in a state where the correlation amount C(k) is the minimum value C(k)_min as a result of the relative shift of the phases of the pair of focus detection signal sequences 655 a and 655 b by the specific shift amount X 0 performed in step S 201 in FIG. 6.
- FIG. 8 is a graph illustrating a state where the pair of focus detection signal sequences {a[i]} and {b[j]} in FIG. 7 are shifted relative to each other by the specific shift amount X 0 which provides the minimum value C(k)_min of the correlation amount C(k) of the pair of focus detection signal sequences {a[i]} and {b[j]}. It will be assumed that the correlation between the pair of focus detection signal sequences 655 a and 655 b is higher in the section 320 corresponding to the subject image 210 of the background including trees because a contrast in the section 320 is higher than that in the section 310 corresponding to the subject image 220 of the person.
- In this case, the specific shift amount X 0 obtained in step S 201 in FIG. 6 may be significantly affected by the subject image 210 of the background including trees. Therefore, if the pair of focus detection signal sequences {a[i]} and {b[j]} in FIG. 7 are shifted relative to each other by the specific shift amount X 0, the pair of focus detection signal sequences 660 a and 660 b may be essentially identical to each other in the section 320 of the focus detection pixel positions 14 to 46 in the horizontal direction in the focus detection area 200 corresponding to the subject image 210 of the background including trees, as illustrated in FIG. 8.
- FIG. 9 explains a way of determining a boundary 350 that divides the whole section 300 in the horizontal direction of the focus detection area 200 into the section 310 corresponding to the subject image 220 of the person and the section 320 corresponding to the subject image 210 of the background including trees as described above, wherein the boundary 350 is located between focus detection pixel positions 13 and 14 .
- FIG. 9 is a graph explaining the division process for the pair of focus detection signal sequences {a[i]} and {b[j]} illustrated in FIG. 7 and corresponds to the process step in step S 204 in FIG. 6.
- FIG. 9 illustrates variations of a plurality of differences 671, each difference being obtained for each focus detection pixel position in the horizontal direction in the focus detection area 200 by sequentially calculating absolute values of differences between signals at corresponding positions in the pair of focus detection signal sequences 660 a and 660 b illustrated in FIG. 8.
- The section in which each of the plurality of differences 671 is lower than the average value of the plurality of differences 671 with respect to change in focus detection pixel positions in the horizontal direction in the focus detection area 200, i.e., the section of the focus detection pixel positions 14 to 46, is specified as the section 320.
- the boundary 350 can be specified so as to be located between the focus detection pixel positions 13 and 14 .
- The section of the focus detection pixel positions 1 to 13 within the whole section 300, which is on the opposite side of the boundary 350 from the section 320, can be specified as the section 310.
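- The division of steps S 204 to S 205 illustrated by FIG. 9 can be sketched as follows: compute the absolute differences of the shifted pair, compare each with their average, and let the below-average positions form the section of the background image. The synthetic data in the example is hypothetical; indices are 0-based, so the split between indices 3 and 4 plays the role of the boundary 350.

```python
def divide_by_average(a_shifted, b_shifted):
    """Steps S204-S205: positions whose absolute difference is below the
    average form the matching (background) section; the remaining
    positions form the other (near-subject) section."""
    diffs = [abs(x - y) for x, y in zip(a_shifted, b_shifted)]
    avg = sum(diffs) / len(diffs)
    matching = [i for i, d in enumerate(diffs) if d < avg]
    other = [i for i, d in enumerate(diffs) if d >= avg]
    return matching, other
```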
- FIG. 10 is a graph illustrating variations of focus detection signal values of a pair of partial signal sequences 661 a and 661 b in a state where the correlation amount of the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 13 corresponding to the subject image 220 of the person is the minimum value.
- a pair of partial signal sequences in the section of the focus detection pixel positions 1 to 13 corresponding to the subject image 220 of the person is obtained by dividing the pair of focus detection signal sequences 655 a and 655 b illustrated in FIG. 7 .
- a phase difference amount X 1 of the pair of partial signal sequences is obtained by performing a correlation operation while shifting phases of the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 13 corresponding to the subject image 220 of the person.
- FIG. 10 illustrates a state where the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 13 corresponding to the subject image 220 of the person are shifted relative to each other by the phase difference amount X 1 .
- the defocus amount D 1 is calculated on the basis of the phase difference amount X 1 .
- FIG. 11 is a graph illustrating variations of focus detection signal values of a pair of partial signal sequences 662 a and 662 b in a state where the correlation amount of the pair of partial signal sequences in the section of the focus detection pixel positions 14 to 46 corresponding to the subject image 210 of the background including trees is the minimum value.
- a pair of partial signal sequences in the section of the focus detection pixel positions 14 to 46 corresponding to the subject image 210 of the background including trees is obtained by dividing the pair of focus detection signal sequences 655 a and 655 b illustrated in FIG. 7 .
- a phase difference amount X 2 of the pair of partial signal sequences is obtained by performing a correlation operation while shifting phases of the pair of partial signal sequences in the section of the focus detection pixel positions 14 to 46 corresponding to the subject image 210 of the background including trees.
- FIG. 11 illustrates a state where the pair of partial signal sequences in the section of the focus detection pixel positions 14 to 46 corresponding to the subject image 210 of the background including trees are shifted relative to each other by the phase difference amount X 2 .
- the defocus amount D 2 is calculated on the basis of the phase difference amount X 2 .
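- Once the sections are fixed, the calculations of X 1 and X 2 (and the corresponding D 1 and D 2) reduce to running the same correlation scan on each pair of partial signal sequences. A sketch follows, assuming a sum-of-absolute-differences correlation; the conversion factor K from phase difference to defocus amount is hypothetical, since the actual conversion depends on the optical system.

```python
def partial_phase_difference(a, b, lo, hi, max_shift=5):
    """Correlation scan restricted to the section [lo, hi) of the pair,
    as in the calculations of X1 (FIG. 10) and X2 (FIG. 11)."""
    def c(k):
        return sum(abs(a[i + k] - b[i])
                   for i in range(max(lo, -k), min(hi, len(a) - k, len(b))))
    return min(range(-max_shift, max_shift + 1), key=c)

K = 0.05  # hypothetical conversion factor from phase difference to defocus amount

def partial_defocus(a, b, lo, hi):
    """Defocus amount for one section, e.g. D1 from X1 or D2 from X2."""
    return K * partial_phase_difference(a, b, lo, hi)
```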
- FIG. 12 is a flowchart of the image-capturing process performed by the controller 3 .
- the controller 3 is a computer including a CPU and a memory, for example.
- the CPU executes a computer program stored in the memory to perform process steps constituting the image-capturing process illustrated in FIG. 12 .
- In step S 501, the controller 3 makes a decision as to whether or not the user has issued an image-capturing command via the operating member. If No, the process step in step S 501 is repeated until the decision result is Yes. If Yes, the controller 3 causes the process to proceed to step S 502.
- the operating member may be a shutter release button, for example, and the decision result is Yes in step S 501 if the shutter release button is set in a full-press state.
- In step S 502, the controller 3 performs a photoelectric conversion control of the image sensor 2.
- The photoelectric conversion control of the image sensor 2 includes, for example, an exposure control of the plurality of image-capturing pixels arranged in the image sensor 2, a read-out control of the plurality of image-capturing signals, and/or an amplification control of the plurality of image-capturing signals that have been read out.
- In step S 503, the controller 3 obtains the plurality of image-capturing signals which have been read out in step S 502 and on which the amplification control has been performed.
- In step S 504, the controller 3 generates an image on the basis of the plurality of image-capturing signals obtained in step S 503.
- In step S 505, the controller 3 records the image generated in step S 504 in the storage device 15. Upon completion of step S 505, the process ends.
- the focus detection device 50 includes the focus detection sensor 6 and the controller 3 , as described above.
- the focus detection sensor 6 receives the pair of light fluxes having passed through the pair of pupil regions of the photographing optical system 4 and outputs the pair of focus detection signal sequences 655 a and 655 b , each being made of the plurality of focus detection signals.
- The controller 3 sequentially calculates absolute values of differences between signals at corresponding positions in the pair of focus detection signal sequences 655 a and 655 b that have been shifted relative to each other by the specific shift amount X 0.
- the controller 3 divides the pair of focus detection signal sequences 655 a and 655 b into at least two pairs of partial signal sequences: a pair of partial signal sequences corresponding to the subject image 220 of the person located near the image-capturing apparatus 100 and a pair of partial signal sequences corresponding to the subject image 210 of the background including trees located far from the image-capturing apparatus 100 .
- the controller 3 calculates the defocus amount D 1 in accordance with the phase difference amount X 1 of the pair of partial signal sequences corresponding to the subject image 220 of the person and the defocus amount D 2 in accordance with the phase difference amount X 2 of the pair of partial signal sequences corresponding to the subject image 210 of the background including trees.
- the controller 3 determines either one of the defocus amounts D 1 and D 2 as the focus adjustment defocus amount used for the focus adjustment.
- the subject image 210 of the background including trees located far from the image-capturing apparatus 100 or the subject image 220 of the person located near the image-capturing apparatus 100 can be in focus.
- the controller 3 calculates the defocus amounts D 1 and D 2 and determines the nearer-side defocus amount of the defocus amounts D 1 and D 2 as the focus adjustment defocus amount.
- the subject image 220 of the person located near the image-capturing apparatus 100 can be in focus.
- Although in the embodiment described above the present invention is applied to the example where two subject images are included in the focus detection area 200 as illustrated in FIG. 5, the present invention may also be applied to a case where three or more subject images are included in the focus detection area 200. This will be described with reference to FIG. 13.
- FIG. 13 is a view illustrating an example where three subject images 210 , 220 , and 230 are included in the focus detection area 200 in the photographic screen 250 .
- The photographic screen 250 and the focus detection area 200 include the subject image 210 of a background including trees, the subject image 220 of one person, and the subject image 230 of another person. Although typically a plurality of focus detection areas 200 is displayed on the photographic screen 250, only one focus detection area 200 specified by a user is illustrated in the photographic screen 250 in FIG. 13.
- The focus detection process performed by the controller 3 will be described with reference to FIGS. 14 to 19.
- FIG. 14 corresponds to the exemplary photographic screen 250 illustrated in FIG. 13 and is a graph illustrating variations of the focus detection signal values of the pair of focus detection signal sequences {a[i]} and {b[j]} with respect to focus detection pixel positions in the focus detection area 200 having a length of approximately 50 pixels in the horizontal direction.
- The pair of focus detection signal sequences {a[i]} and {b[j]} illustrated in FIG. 14, namely the pair of focus detection signal sequences 655 a and 655 b, corresponds to the pair of focus detection signal sequences obtained by means of a process according to step S 103 in FIG. 4.
- a section 310 of focus detection pixel positions 1 to 12 in the horizontal direction in the focus detection area 200 corresponds to the subject image 220 of the one person located near the image-capturing apparatus 100 illustrated in FIG. 13 .
- a phase of one focus detection signal sequence 655 a of the pair of focus detection signal sequences 655 a and 655 b is behind a phase of the other focus detection signal sequence 655 b .
- a section 320 of focus detection pixel positions 13 to 28 in the horizontal direction in the focus detection area 200 corresponds to the subject image 210 of the background including trees located far from the image-capturing apparatus 100 illustrated in FIG. 13 .
- a phase of one focus detection signal sequence 655 a of the pair of focus detection signal sequences 655 a and 655 b is ahead of a phase of the other focus detection signal sequence 655 b .
- a section 330 of focus detection pixel positions 29 to 46 in the horizontal direction in the focus detection area 200 corresponds to the subject image 230 of the another person who is the nearest subject to the image-capturing apparatus 100 illustrated in FIG. 13 .
- the phase of one focus detection signal sequence 655 a of the pair of focus detection signal sequences 655 a and 655 b is significantly behind the phase of the other focus detection signal sequence 655 b .
- FIG. 15 illustrates a pair of focus detection signal sequences 660 a and 660 b, which is the pair of focus detection signal sequences {a[i]} and {b[j]} in a state where the correlation amount C(k) is the minimum value C(k)_min as a result of the relative shift of the phases of the pair of focus detection signal sequences 655 a and 655 b by the specific shift amount X 0 performed by means of a process according to step S 201 in FIG. 6.
- FIG. 15 is a graph illustrating a state where the pair of focus detection signal sequences {a[i]} and {b[j]} in FIG. 14 are shifted relative to each other by the specific shift amount X 0 which provides the minimum value C(k)_min of the correlation amount C(k) of the pair of focus detection signal sequences {a[i]} and {b[j]}.
- the subject image 220 of one person is located on the left side of the focus detection area 200 in the figure, the subject image 210 of the background including trees is located in the center, and the subject image 230 of another person is located on the right side.
- the correlation between the pair of focus detection signal sequences 655 a and 655 b in the section 320 corresponding to the subject image 210 of the background including trees is higher than that in the section 310 corresponding to the subject image 220 of the one person and in the section 330 corresponding to the subject image 230 of the another person because a contrast in the section 320 is higher than that in the section 310 and in the section 330 .
- the specific shift amount X 0 obtained by means of the process according to step S 201 in FIG. 6 may be significantly affected by the subject image 210 of the background including trees.
- the pair of focus detection signal sequences 660 a and 660 b may be essentially identical to each other in the section 320 of the focus detection pixel positions 13 to 28 in the horizontal direction in the focus detection area 200 corresponding to the subject image 210 of the background including trees, as illustrated in FIG. 15 .
- FIG. 16 explains a way of determining a boundary 350 that divides the whole section 300 in the horizontal direction of the focus detection area 200 into the section 310 corresponding to the subject image 220 of the one person and the section 320 corresponding to the subject image 210 of the background including trees as described above, wherein the boundary 350 is located between focus detection pixel positions 12 and 13; and further determining a boundary 360 that divides the section 300 into the section 320 corresponding to the subject image 210 of the background including trees and the section 330 corresponding to the subject image 230 of the another person.
- FIG. 16 is a graph explaining the division process for the pair of focus detection signal sequences {a[i]} and {b[j]} illustrated in FIG. 14 and corresponds to a process according to step S 204 in FIG. 6.
- FIG. 16 illustrates variations of a plurality of differences 671, each difference being obtained for each focus detection pixel position in the horizontal direction in the focus detection area 200 by sequentially calculating absolute values of differences between signals at corresponding positions in the pair of focus detection signal sequences 660 a and 660 b illustrated in FIG. 15.
- In the sections 310 and 330, the absolute value of the difference considerably increases and decreases for each positional change from one focus detection pixel to the next in the horizontal direction in the focus detection area 200.
- the average value of the plurality of differences 671 over the whole section 300 in the horizontal direction of the focus detection area 200 is determined.
- The section in which each of the plurality of differences 671 is lower than the average value of the plurality of differences 671 with respect to change in focus detection pixel positions in the horizontal direction in the focus detection area 200, i.e., the section of the focus detection pixel positions 13 to 28, is specified as the section 320, and the boundaries 350 and 360 can be specified so as to be respectively located between the focus detection pixel positions 12 and 13 and between the focus detection pixel positions 28 and 29.
- The section of the focus detection pixel positions 1 to 12 in the whole section 300, which is on the opposite side of the boundary 350 from the section 320, can be specified as the section 310.
- The section of the focus detection pixel positions 29 to 46 in the whole section 300, which is on the opposite side of the boundary 360 from the section 320, can be specified as the section 330.
- In this way, the pair of focus detection signal sequences 655 a and 655 b illustrated in FIG. 14, indicating a current focusing condition, can be divided into the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 12 corresponding to the subject image 220 of the one person, the pair of partial signal sequences in the section of the focus detection pixel positions 13 to 28 corresponding to the subject image 210 of the background including trees, and the pair of partial signal sequences in the section of the focus detection pixel positions 29 to 46 corresponding to the subject image 230 of the another person.
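- With three subject images, the same comparison against the average produces several contiguous runs; a sketch follows that returns all runs, so that the pair of sequences can be divided into three (or more) pairs of partial signal sequences. The run representation `(start, stop, below_average)` is an illustration choice, not from the disclosure.

```python
from itertools import groupby

def divide_into_sections(diffs):
    """Split pixel positions into contiguous runs according to whether the
    absolute difference at each position is below the average.  Runs with
    below_average=True correspond to the matching (background) image."""
    avg = sum(diffs) / len(diffs)
    sections, start = [], 0
    for below_average, run in groupby(diffs, key=lambda d: d < avg):
        length = len(list(run))
        sections.append((start, start + length, below_average))
        start += length
    return sections
```

- Applied to differences that are large at both ends and small in the middle, this yields three sections mirroring the sections 310, 320, and 330.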
- FIG. 17 is a graph illustrating variations of focus detection signal values of a pair of partial signal sequences 661 a and 661 b in a state where the correlation amount of the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 12 corresponding to the subject image 220 of the one person is the minimum value.
- a pair of partial signal sequences in the section of the focus detection pixel positions 1 to 12 corresponding to the subject image 220 of the one person is obtained by dividing the pair of focus detection signal sequences 655 a and 655 b illustrated in FIG. 14 .
- a phase difference amount X 1 of the pair of partial signal sequences is obtained by performing a correlation operation while shifting phases of the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 12 corresponding to the subject image 220 of the one person.
- FIG. 17 illustrates a state where the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 12 corresponding to the subject image 220 of the one person are shifted relative to each other by the phase difference amount X 1 .
- the defocus amount D 1 is calculated on the basis of the phase difference amount X 1 .
- FIG. 18 is a graph illustrating variations of focus detection signal values of a pair of partial signal sequences 662 a and 662 b in a state where the correlation amount of the pair of partial signal sequences in the section of the focus detection pixel positions 13 to 28 corresponding to the subject image 210 of the background including trees is the minimum value.
- a pair of partial signal sequences in the section of the focus detection pixel positions 13 to 28 corresponding to the subject image 210 of the background including trees is obtained by dividing the pair of focus detection signal sequences 655 a and 655 b illustrated in FIG. 14 .
- a phase difference amount X 2 of the pair of partial signal sequences is obtained by performing a correlation operation while shifting phases of the pair of partial signal sequences in the section of the focus detection pixel positions 13 to 28 corresponding to the subject image 210 of the background including trees.
- FIG. 18 illustrates a state where the pair of partial signal sequences in the section of the focus detection pixel positions 13 to 28 corresponding to the subject image 210 of the background including trees are shifted relative to each other by the phase difference amount X 2 .
- the defocus amount D 2 is calculated on the basis of the phase difference amount X 2 .
- FIG. 19 is a graph illustrating variations of focus detection signal values of a pair of partial signal sequences 663 a and 663 b in a state where the correlation amount of the pair of partial signal sequences in the section of the focus detection pixel positions 29 to 46 corresponding to the subject image 230 of the another person is the minimum value.
- a pair of partial signal sequences in the section of the focus detection pixel positions 29 to 46 corresponding to the subject image 230 of the another person is obtained by dividing the pair of focus detection signal sequences 655 a and 655 b illustrated in FIG. 14 .
- a phase difference amount X 3 of the pair of partial signal sequences is obtained by performing a correlation operation while shifting phases of the pair of partial signal sequences in the section of the focus detection pixel positions 29 to 46 corresponding to the subject image 230 of the another person.
- FIG. 19 illustrates a state where the pair of partial signal sequences in the section of the focus detection pixel positions 29 to 46 corresponding to the subject image 230 of the another person are shifted relative to each other by the phase difference amount X 3 .
- the defocus amount D 3 is calculated on the basis of the phase difference amount X 3 .
- In the embodiment described above, the controller 3 performs the division process in step S 205 in FIG. 6 depending on whether or not each of the plurality of differences 671 illustrated in FIG. 9, obtained from the sequential calculation of the absolute values of the differences, is lower than the average value of the plurality of differences 671.
- Alternatively, the controller 3 may calculate differential values of the plurality of differences 671.
- The differential value is obtained by calculating a difference between absolute values of the differences at two adjacent focus detection pixel positions in the horizontal direction.
- FIG. 22 is a graph illustrating differences between the absolute values of the differences at adjacent focus detection pixel positions, i.e., the differential values.
- In the section 310, the absolute value of the difference considerably increases and decreases for each positional change from one focus detection pixel to the next in the horizontal direction in the focus detection area 200. Therefore, as illustrated in FIG. 22, magnitudes of the differential values in the section 310 are equal to or higher than the predetermined value V described above.
- In other words, in step S 205 in FIG. 6, the controller 3 may divide the pair of focus detection signal sequences 655 a and 655 b into a pair of partial signal sequences corresponding to the subject image 220 of the person located near the image-capturing apparatus 100 and a pair of partial signal sequences corresponding to the subject image 210 of the background including trees located far from the image-capturing apparatus 100, depending on whether or not a magnitude of a differential value, which is a difference between absolute values of the differences at two adjacent focus detection pixel positions, is equal to or higher than the predetermined value V.
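- The differential-value variant can be sketched as follows; the predetermined value V is given a placeholder magnitude here, since the text does not state a number.

```python
V = 2.0  # placeholder magnitude for the predetermined value V (assumed)

def differential_flags(diffs, v=V):
    """Variant of step S205: mark each position whose absolute difference
    changes by at least v from the previous position; such positions are
    treated as belonging to the non-matching (near-subject) section."""
    flags = [False]  # no differential value exists for the first position
    for prev, cur in zip(diffs, diffs[1:]):
        flags.append(abs(cur - prev) >= v)
    return flags
```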
- In step S 203 in FIG. 6, the controller 3 makes a decision as to whether or not the brightness of the whole subject image including the subject image 210 of the background including trees and the subject image 220 of the person is lower than the predetermined brightness.
- The plurality of differences may be obtained in step S 204, without the process step in step S 203, as absolute values |a[i]−b[j]|.
- The controller 3 may cause the process to proceed to step S 208 if it is determined that there are absolute values |a[i]−b[j]| equal to or higher than the average value of the plurality of differences.
- In step S 206 and step S 207 in FIG. 6, the controller 3 determines, as the focus adjustment defocus amount, the nearer-side defocus amount of the two defocus amounts D 1 and D 2 calculated on the basis of the two phase difference amounts X 1 and X 2, each corresponding to a respective one of the two pairs of partial signal sequences.
- Alternatively, the controller 3 may determine, as the focus adjustment defocus amount, one defocus amount calculated on the basis of the nearer-side phase difference amount of the two phase difference amounts X 1 and X 2, each corresponding to a respective one of the two pairs of partial signal sequences.
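The selection in steps S 206 and S 207 can be sketched as below. All names are hypothetical stand-ins, and the sign convention is an assumption: here a larger defocus amount is taken to correspond to a nearer subject, and the conversion factor k of the optical system is assumed known.

```python
# Hypothetical sketch of steps S206/S207: compute two defocus amounts from the
# two phase difference amounts X1 and X2, then choose the nearer-side one.

def defocus_from_phase(phase_diff, k):
    # The defocus amount is taken proportional to the phase difference amount;
    # k is an assumed conversion factor of the optical system.
    return k * phase_diff

def focus_adjustment_defocus(x1, x2, k):
    d1 = defocus_from_phase(x1, k)  # from the first pair of partial sequences
    d2 = defocus_from_phase(x2, k)  # from the second pair of partial sequences
    return max(d1, d2)              # nearer side under the assumed convention

print(focus_adjustment_defocus(-2.0, 3.0, k=0.5))  # -> 1.5
```

Selecting the nearer-side phase difference first and converting once, as in the alternative above, gives the same result because the conversion is monotonic.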
- In step S 105 in FIG. 4, the controller 3 determines whether or not the photographing optical system 4 is located at a focus position, depending on whether or not the focus adjustment defocus amount determined in step S 104 is approximately zero.
- Alternatively, step S 106 may be performed first and, depending on whether or not the lens drive amount of the photographing optical system 4 calculated in step S 106 is approximately zero, it may be determined whether or not the photographing optical system 4 is located at a focus position.
- the controller 3 performs the focus adjustment control by driving the lens of the photographing optical system 4 on the basis of the focus adjustment defocus amount, in step S 106 and step S 107 in FIG. 4 .
- the controller 3 may perform the focus adjustment control by driving the image sensor 2 on the basis of the focus adjustment defocus amount.
- The present invention may be applied not only to the focus detection device 50 having the focus detection sensor 6 covered by the microlens array 9 illustrated in FIG. 1, but also to a focus detection device 50 having a focus detection sensor 16 receiving light fluxes that have passed through the half mirror 7 and then have been reflected from a sub-mirror 70 to pass through the image reforming optical system 17, or to a focus detection device 50 having an image sensor 2 including a focus detection sensor covered by the microlens array 19.
- FIG. 20 is a view illustrating a configuration of an image-capturing apparatus 100 including a focus detection device 50 having a focus detection sensor 16 having a plurality of focus detection pixels arranged therein, the pixels receiving light fluxes that have transmitted through a half mirror 7 and have been reflected from a sub-mirror 70 and have passed through an image reforming optical system 17 .
- FIG. 21 is a view illustrating a configuration of an image-capturing apparatus 100 including a focus detection device 50 having an image sensor 2 including a focus detection sensor covered by a microlens array 19 .
- a plurality of focus detection pixels generating a plurality of focus detection signals and a plurality of image-capturing pixels generating a plurality of image-capturing signals are arranged in a mixed manner in the image sensor 2 covered by the microlens array 19 .
- Parts denoted by the same reference numerals as in FIG. 1 are the same as those in the image-capturing apparatus 100 in FIG. 1 and therefore explanation thereof will be omitted.
- a magnitude of an ISO sensitivity in the image-capturing process performed by the image sensor 2 may be used as a brightness decision index in step S 203 in FIG. 6 . If the ISO sensitivity is lower than the predetermined value, the controller 3 makes a decision that the brightness of the overall subject image is not lower than the predetermined brightness. In other words, the decision result is No in step S 203 .
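The ISO-based brightness decision described above can be sketched as below; the threshold value and the function name are arbitrary assumptions, not values from the patent.

```python
# Hypothetical sketch of the step S203 decision using ISO sensitivity as the
# brightness index; ISO_LIMIT stands in for the unspecified predetermined value.

ISO_LIMIT = 1600  # assumed predetermined value

def brightness_lower_than_threshold(iso_sensitivity):
    """True means the decision result of step S203 is Yes (dark subject)."""
    # A low ISO sensitivity implies a bright subject, so the decision is No.
    return iso_sensitivity >= ISO_LIMIT

print(brightness_lower_than_threshold(400))   # -> False (decision result: No)
print(brightness_lower_than_threshold(3200))  # -> True
```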
Abstract
A focus detection device includes: a sensor outputting a pair of focus detection signal sequences, each of which is made of a plurality of focus detection signals; a difference calculation unit obtaining a plurality of differences by sequentially calculating differences between focus detection signals corresponding to each other in the pair of focus detection signal sequences; a division unit dividing the pair of focus detection signal sequences into at least two pairs of partial signal sequences based on the plurality of differences; a focus detection parameter calculation unit calculating a first focus detection parameter in accordance with a phase difference amount of a first pair of partial signal sequences and a second focus detection parameter in accordance with a phase difference amount of a second pair of partial signal sequences; and a focus adjustment parameter determination unit determining either the first or the second focus detection parameter as a focus adjustment parameter.
Description
- This application is a continuation of U.S. application Ser. No. 16/151,985 filed Oct. 4, 2018 which is a continuation of U.S. Ser. No. 15/039,175 filed Dec. 19, 2016, which is a National Phase of International Application No. PCT/JP2014/081276 filed Nov. 26, 2014 and is based on and claims priority under 35 U.S.C. 119 from Japanese Patent Application No. 2013-243944 filed Nov. 26, 2013. The contents of the above applications are incorporated herein by reference in their entirety.
- The present invention relates to a focus detection device and an image-capturing apparatus.
- In cameras employing a phase detection method for focus detection, there is a known technique that detects a defocus amount of a desired subject among distant subjects and near subjects by dividing, in advance, a focus detection signal sequence used for a calculation in a range sensor, when the distant subjects and the near subjects are present in the same photographic screen, for example.
- PTL1: Japanese Laid-Open Patent Publication No. H06-82686
- According to PTL1, the focus detection pixel row is divided into a plurality of blocks. However, a focus detection error may occur because boundaries between the blocks do not always correspond to boundaries between distant subjects and near subjects present in the same photographic screen.
- According to the first aspect of the present invention, a focus detection device comprises: a focus detection sensor receiving a pair of light fluxes that has passed through a pair of pupil regions of an optical system and outputting a pair of focus detection signal sequences, each focus detection signal sequence being made of a plurality of focus detection signals; a difference calculation unit obtaining a plurality of differences by sequentially calculating differences between focus detection signals corresponding to each other in the pair of focus detection signal sequences; a division unit dividing the pair of focus detection signal sequences into at least two pairs of partial signal sequences, the at least two pairs of partial signal sequences including a first pair of partial signal sequences and a second pair of partial signal sequences, based on the plurality of differences obtained by the difference calculation unit; a focus detection parameter calculation unit calculating a first focus detection parameter in accordance with a phase difference amount of the first pair of partial signal sequences and a second focus detection parameter in accordance with a phase difference amount of the second pair of partial signal sequences; and a focus adjustment parameter determination unit determining either the first focus detection parameter or the second focus detection parameter, as a focus adjustment parameter used for a focus adjustment.
- According to the second aspect of the present invention, in the focus detection device according to the first aspect, it is preferred that the focus detection parameter calculation unit calculates a minimum value of a correlation amount of the pair of focus detection signal sequences while shifting the pair of focus detection signal sequences relative to each other by a predetermined shift amount, and then calculates another focus detection parameter based on a specific shift amount of the pair of the focus detection signal sequences, the specific shift amount providing the minimum value; and the difference calculation unit obtains the plurality of differences in the pair of focus detection signal sequences in a state where the pair of focus detection signal sequences has been shifted relative to each other by the specific shift amount.
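One common way to realize the correlation computation of the second aspect is a sum of absolute differences minimized over relative shifts; the following is a sketch under that assumption, not the patent's actual formula, and all names and data are hypothetical.

```python
# Illustrative sketch: shift the pair of sequences relative to each other, take
# the normalized sum of absolute differences as the correlation amount, and
# keep the shift that minimizes it (the "specific shift amount").

def best_shift(a, b, max_shift):
    """Return (specific_shift, minimum_correlation) for sequences a and b."""
    best = None
    for s in range(-max_shift, max_shift + 1):
        # overlap of a[i] with b[i + s]
        pairs = [(a[i], b[i + s]) for i in range(len(a)) if 0 <= i + s < len(b)]
        corr = sum(abs(x - y) for x, y in pairs) / len(pairs)  # normalized SAD
        if best is None or corr < best[1]:
            best = (s, corr)
    return best

a = [0, 1, 4, 9, 4, 1, 0, 0]
b = [0, 0, 0, 1, 4, 9, 4, 1]   # same pattern shifted by +2
print(best_shift(a, b, max_shift=3))  # -> (2, 0.0)
```

The difference calculation unit of the second aspect would then evaluate the differences |a[i]−b[j]| with the sequences held at the returned specific shift amount.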
- According to the third aspect of the present invention, in the focus detection device according to the second aspect, it is preferred that, if the minimum value is equal to or higher than a predetermined value, the focus adjustment parameter determination unit determines either the first focus detection parameter or the second focus detection parameter as the focus adjustment parameter; and, if the minimum value is lower than the predetermined value, the focus adjustment parameter determination unit determines the another focus detection parameter as the focus adjustment parameter.
- According to the fourth aspect of the present invention, in the focus detection device according to the second aspect or the third aspect, it is preferred that the focus adjustment parameter determination unit determines either the first focus detection parameter or the second focus detection parameter as the focus adjustment parameter, if brightness of a subject image formed by the optical system is not lower than a predetermined brightness; and the focus adjustment parameter determination unit determines the another focus detection parameter as the focus adjustment parameter, if the brightness of the subject image is lower than the predetermined brightness.
- According to the fifth aspect of the present invention, in the focus detection device according to any one of the first to fourth aspects, it is preferred that the division unit divides the pair of focus detection signal sequences into the first pair of partial signal sequences and the second pair of partial signal sequences, depending on whether or not each of the plurality of differences obtained by the difference calculation unit is equal to or higher than an average value of the plurality of differences.
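The division rule of the fifth aspect can be sketched as follows. This is an illustrative reading under assumed names and data, not the patent's implementation; the sequences are assumed already shifted into correspondence.

```python
# Hypothetical sketch of the fifth aspect: divide the pair of sequences
# depending on whether each difference |a[i] - b[i]| is equal to or higher
# than the average value of all the differences.

def split_by_average(a, b):
    diffs = [abs(x - y) for x, y in zip(a, b)]
    avg = sum(diffs) / len(diffs)
    flags = [d >= avg for d in diffs]          # True: poorly matching region
    # group consecutive equal flags into partial signal sequences
    ranges, start = [], 0
    for i in range(1, len(flags) + 1):
        if i == len(flags) or flags[i] != flags[start]:
            ranges.append((start, i))
            start = i
    return ranges

a = [5, 5, 5, 5, 30, 31, 30, 31]
b = [5, 5, 5, 5, 10, 12, 11, 13]
print(split_by_average(a, b))  # -> [(0, 4), (4, 8)]
```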
- According to the sixth aspect of the present invention, in the focus detection device according to any one of the first to fourth aspects, it is preferred that the division unit divides the pair of focus detection signal sequences into the first pair of partial signal sequences and the second pair of partial signal sequences, depending on whether or not a magnitude of a difference between adjacent differences in the plurality of differences obtained by the difference calculation unit is smaller than a predetermined value.
- According to the seventh aspect of the present invention, in the focus detection device according to any one of the first to sixth aspects, it is preferred that the division unit divides the pair of focus detection signal sequences into the first pair of partial signal sequences, the second pair of partial signal sequences, and a third pair of partial signal sequences, based on the plurality of differences obtained by the difference calculation unit; if the pair of focus detection signal sequences is divided by the division unit into the first pair of partial signal sequences, the second pair of partial signal sequences, and the third pair of partial signal sequences, the focus detection parameter calculation unit calculates a third focus detection parameter in accordance with a phase difference amount of the third pair of partial signal sequences, in addition to calculating the first focus detection parameter and the second focus detection parameter; and the focus adjustment parameter determination unit determines one of the first focus detection parameter, the second focus detection parameter, and the third focus detection parameter as the focus adjustment parameter.
- According to the eighth aspect of the present invention, in the focus detection device according to any one of the first to seventh aspects, it is preferred that the focus detection parameter calculation unit calculates a first defocus amount and a second defocus amount respectively as the first focus detection parameter and the second focus detection parameter; and the focus adjustment parameter determination unit determines a nearer-side defocus amount of the first defocus amount and the second defocus amount, as the focus adjustment parameter.
- According to the ninth aspect of the present invention, in the focus detection device according to any one of the first to eighth aspects, it is preferred that the focus detection device further comprises: an image sensor receiving light fluxes that have passed through the optical system, via a microlens array, and outputting image-capturing signals. The focus detection sensor is provided independently of the image sensor or included in the image sensor; and, if the focus detection sensor is provided independently of the image sensor, the pair of light fluxes are received by the focus detection sensor after passing through the pair of pupil regions and then passing through a microlens array or an image reforming optical system.
- According to the tenth aspect of the present invention, an image-capturing apparatus comprises: the focus detection device according to the ninth aspect; the optical system; a focus adjustment unit performing the focus adjustment based on the focus adjustment parameter determined by the focus adjustment parameter determination unit; and an image generation unit generating an image based on the image-capturing signals output by the image sensor when the optical system is focused on the light receiving surface of the image sensor by the focus adjustment.
- According to the present invention, when a scene including both distant subjects and near subjects is to be focused, a focus adjustment can be performed after properly dividing a focus detection signal sequence in accordance with the distances of the subjects from the apparatus, taking the circumstances of the subjects into consideration.
- FIG. 1 is a view illustrating a configuration of an image-capturing apparatus having a focus detection device in one embodiment of the present invention.
- FIG. 2 is a view illustrating a focus detection sensor and a microlens array which covers the focus detection sensor.
- FIG. 3 is a view illustrating a relationship between a plurality of focus detection pixels and a microlens.
- FIG. 4 is a flowchart of a focus detection process performed by a controller.
- FIG. 5 is a view illustrating one example in which two subject images are included in a focus detection area.
- FIG. 6 is a flowchart of a defocus amount determination process performed by the controller.
- FIG. 7 is a graph illustrating variations of focus detection signal values of a pair of focus detection signal sequences with respect to focus detection pixel positions in the focus detection area.
- FIG. 8 is a graph illustrating a state where the pair of focus detection signal sequences are shifted relative to each other by a specific shift amount which provides the minimum value of a correlation amount of the pair of the focus detection signal sequences.
- FIG. 9 is a graph explaining a division process for the pair of focus detection signal sequences.
- FIG. 10 is a graph illustrating variations of focus detection signal values in a state where the correlation amount in the pair of partial signal sequences is the minimum value.
- FIG. 11 is a graph illustrating variations of focus detection signal values in a state where the correlation amount in the pair of partial signal sequences is the minimum value.
- FIG. 12 is a flowchart of an image-capturing process performed by the controller.
- FIG. 13 is a view illustrating one example in which three subject images are included in a focus detection area.
- FIG. 14 is a graph illustrating variations of focus detection signal values of a pair of focus detection signal sequences with respect to focus detection pixel positions in the focus detection area.
- FIG. 15 is a graph illustrating a state where the pair of focus detection signal sequences are shifted relative to each other by a specific shift amount which provides the minimum value of a correlation amount of the pair of the focus detection signal sequences.
- FIG. 16 is a graph explaining a division process for the pair of focus detection signal sequences.
- FIG. 17 is a graph illustrating variations of focus detection signal values in a state where the correlation amount in the pair of partial signal sequences is the minimum value.
- FIG. 18 is a graph illustrating variations of focus detection signal values in a state where the correlation amount in the pair of partial signal sequences is the minimum value.
- FIG. 19 is a graph illustrating variations of focus detection signal values in a state where the correlation amount in the pair of partial signal sequences is the minimum value.
- FIG. 20 is a view illustrating a configuration of an image-capturing apparatus having another focus detection device.
- FIG. 21 is a view illustrating a configuration of an image-capturing apparatus having another focus detection device.
- FIG. 22 is a graph illustrating differences between absolute values |a[i]−b[j]| of differences illustrated in FIG. 9.
- A focus detection device and an image-capturing apparatus including such a focus detection device according to one embodiment of the present invention will be described with reference to the drawings.
- FIG. 1 is a view illustrating a configuration of an image-capturing apparatus 100 including a focus detection device 50 according to this embodiment. The image-capturing apparatus 100 includes the focus detection device 50, a liquid crystal display element 1, an image sensor 2, a photographing optical system 4, a lens drive motor 5, a half mirror 7, a focus adjustment device 8, and a storage device 15. The focus detection device 50 includes a focus detection sensor 6, a microlens array 9, and a controller 3.
- The photographing optical system 4 is an optical system for forming subject images on a focal plane. The photographing optical system 4 includes a plurality of lenses and diaphragms. Among the plurality of lenses, a focus adjustment lens can be moved by the lens drive motor 5 in a direction of an optical axis 10 of the photographing optical system 4.
- The half mirror 7 is a thin mirror, such as a pellicle mirror, and is located in an optical path along the optical axis 10 as illustrated in FIG. 1. Incident light fluxes pass through the photographing optical system 4 and then some of the incident light fluxes are reflected from the half mirror 7 in a direction of an optical axis 10 a, i.e., toward the microlens array 9, while the rest of the incident light fluxes, which are not reflected, transmit through the half mirror 7. The light fluxes that have been reflected from the half mirror 7, referred to as reflected light fluxes, transmit through the microlens array 9 made of a plurality of microlenses arranged in two dimensions and are then incident on the focus detection sensor 6. The light fluxes that have transmitted through the half mirror 7, referred to as transmitted light fluxes, are incident on the image sensor 2. The microlens array 9 is arranged on the image-forming plane of the photographing optical system 4. The position of the microlens array 9 is equivalent to the position of an image-capturing plane of the image sensor 2.
- The focus detection sensor 6 has a plurality of focus detection pixels arranged therein, the focus detection pixels generating electrical focus detection signals in accordance with the light fluxes received by the pixels. Among the plurality of focus detection pixels, a pair of focus detection pixel groups made of a part of focus detection pixels in a focus detection area receives a pair of light fluxes among the light fluxes incident on the focus detection sensor 6 through the microlens array 9 and performs a photoelectric conversion process, so that a pair of electrical focus detection signal sequences corresponding to the subject image is generated. Details thereof will be described later with reference to FIG. 3. The focus detection area may be displayed on a screen in such a manner that the focus detection area is superimposed on a through image, when the liquid crystal display element 1 displays the through image on the screen. The through image is based on a plurality of image-capturing signals, described later, output by the image sensor 2. A plurality of focus detection areas may be displayed on the screen of the liquid crystal display element 1 so that a user can specify one of the plurality of focus detection areas displayed on the screen while viewing the screen of the liquid crystal display element 1.
- In generating the pair of focus detection signal sequences as described above, the controller 3 performs an exposure control of the plurality of focus detection pixels, a read-out control of the plurality of focus detection signals, and/or an amplification control of the plurality of focus detection signals that have been read out, for example, as a photoelectric conversion control of the focus detection sensor 6. The pair of focus detection signal sequences generated by the focus detection sensor 6 is output to the controller 3.
- The controller 3 detects a focus of the photographing optical system 4 by the use of the split-pupil phase detection method, on the basis of the pair of focus detection signal sequences output by the focus detection sensor 6. The controller 3 detects a phase difference amount of the pair of focus detection signal sequences, as a focus detection parameter that is obtained from the focus detection. Alternatively, the controller 3 calculates a defocus amount as a focus detection parameter in accordance with the phase difference amount. The controller 3 determines a focus adjustment parameter on the basis of the phase difference amount or the defocus amount and then calculates, on the basis of the determined focus adjustment parameter, a lens drive amount for the focus adjustment lens of the photographing optical system 4 to send the lens drive amount to the focus adjustment device 8. After receiving the lens drive amount, the focus adjustment device 8 drives the focus adjustment lens of the photographing optical system 4 by the lens drive amount, via the lens drive motor 5. Details of the focus detection process performed by the controller 3 will be described later with reference to FIGS. 4 and 6.
- In a photographing process, the half mirror 7 is swung up to cover the focus detection sensor 6 and is thus brought out of the optical path. As a result, all of the incident light fluxes that have passed through the photographing optical system 4 are incident on the image sensor 2 to form the subject image on a light receiving surface of the image sensor 2. The image sensor 2 has a plurality of image-capturing pixels arranged in two dimensions and the plurality of image-capturing pixels receives the incident light fluxes and performs the photoelectric conversion to generate the plurality of electrical image-capturing signals corresponding to the subject image formed by the photographing optical system 4. The plurality of image-capturing signals generated here is output by the image sensor 2.
- The controller 3 generates an image on the basis of the plurality of image-capturing signals output by the image sensor 2 and then causes the liquid crystal display element 1 to display the generated image as the through image. The controller 3 also records the generated image in the storage device 15 when executing the image-capturing process in response to an image-capturing command from the user. Details of the image-capturing process performed by the controller 3 will be described later with reference to FIG. 12.
- FIG. 2 is a view illustrating the focus detection sensor 6 and the microlens array 9 which covers the focus detection sensor 6. FIG. 2(a) illustrates an enlarged view of the focus detection sensor 6 and the microlens array 9 in the vicinity of the optical axis 10 a illustrated in FIG. 1. The focus detection sensor 6 has the plurality of focus detection pixels 60 arranged in two dimensions. The microlens array 9 has the plurality of microlenses 90 arranged in two dimensions (in a honeycomb-like array) with a pitch of 100 μm or less. Although the shape of the microlens 90 illustrated in the figure is a sphere, the shape may be a hexagon, which matches the honeycomb-like array.
- FIG. 2(b) is a view as seen from directly above the microlens array 9, wherein the microlens array 9 and the focus detection sensor 6 located behind the microlens array 9 are illustrated overlapping each other. In the example in FIG. 2(b), a plurality of focus detection pixels 60, which are here 5 vertical by 5 horizontal pixels, corresponds to each microlens 90. A part of the incident light fluxes having passed through the photographing optical system 4 illustrated in FIG. 1 is reflected from the half mirror 7 as the reflected light fluxes, and then the reflected light fluxes arrive at and transmit through the microlens array 9 and are incident on the focus detection sensor 6. As described later with reference to FIG. 3, the light flux having transmitted through each microlens 90 is received by the plurality of focus detection pixels 60 corresponding to each microlens 90, which is here a total of 25 pixels constituted with 5 vertical by 5 horizontal pixels, so that the light flux is converted into the electrical focus detection signal by means of the photoelectric conversion. The plurality of focus detection pixels 60 corresponding to each microlens 90 is not limited to the total of 25 pixels constituted with 5 vertical by 5 horizontal pixels.
- FIG. 3 is a view illustrating a relationship between the plurality of focus detection pixels 60 and microlenses 90. FIGS. 3(a) and 3(b) are plan views of the plurality of focus detection pixels 60 and microlenses 90. In the examples illustrated in FIGS. 3(a) and 3(b), the plurality of focus detection pixels 60 corresponding to each microlens 90 is the total of 25 pixels constituted with 5 vertical by 5 horizontal pixels. Among the 25 focus detection pixels 60, a pair of focus detection pixel groups described above is specified. For the total of 25 focus detection pixels 60 constituted with 5 vertical by 5 horizontal focus detection pixels corresponding to each microlens 90 exemplified in FIG. 3(a), three center focus detection pixels are illustrated with hatching among five focus detection pixels included in each of two vertical columns of focus detection pixels located on both ends in a horizontal direction. The pixels with hatching form a pair of focus detection pixel groups 610 a and 610 b.
- For the total of 25 focus detection pixels 60 constituted with 5 vertical by 5 horizontal focus detection pixels corresponding to each microlens 90 exemplified in FIG. 3(b), three center focus detection pixels are illustrated with hatching among five focus detection pixels included in each of a vertical column of focus detection pixels located on the left end in the figure in the horizontal direction and an adjacent vertical column of focus detection pixels. The pixels with hatching form one focus detection pixel group 620 a of a pair of focus detection pixel groups 620 a and 620 b. The other focus detection pixel group 620 b of the pair of focus detection pixel groups 620 a and 620 b is formed in the same manner from a vertical column of focus detection pixels located on the right end and an adjacent vertical column of focus detection pixels. As illustrated in FIG. 3(b), each of the pair of focus detection pixel groups 620 a and 620 b includes focus detection pixels from two adjacent vertical columns.
- FIGS. 3(c) and 3(d) respectively are cross-sectional views taken along dashed-dotted lines S1 and S2 of FIGS. 3(a) and 3(b) illustrating the plan views of sets of the total of 25 focus detection pixels 60 constituted with 5 vertical by 5 horizontal focus detection pixels and the microlens 90, wherein each dashed-dotted line S1, S2 extends in the horizontal direction through the focus detection pixel that is located in the center of the 25 focus detection pixels 60. In FIG. 3(c), the pair of focus detection pixel groups 610 a and 610 b receives a pair of light fluxes that have passed through a pair of pupil regions of the photographing optical system 4 and through the microlens 90, and generates the pair of electrical focus detection signals by means of the photoelectric conversion. FIG. 3(a) illustrates five sets of the 25 focus detection pixels 60 and the microlens 90. Thus, a focus detection signal sequence including five focus detection signals generated by five focus detection pixel groups 610 a and a focus detection signal sequence including five focus detection signals generated by five focus detection pixel groups 610 b are obtained. The two focus detection signal sequences form a pair of focus detection signal sequences. In the same manner, in FIG. 3(d), the pair of focus detection pixel groups 620 a and 620 b receives a pair of light fluxes that have passed through a pair of pupil regions of the photographing optical system 4 and through the microlens 90, and generates the pair of electrical focus detection signals by means of the photoelectric conversion. FIG. 3(b) illustrates five sets of the 25 focus detection pixels 60 and the microlens 90. Thus, a focus detection signal sequence including five focus detection signals generated by five focus detection pixel groups 620 a and a focus detection signal sequence including five focus detection signals generated by five focus detection pixel groups 620 b are obtained. The two focus detection signal sequences form a pair of focus detection signal sequences.
- The focus adjustment of the photographing optical system 4 can be performed on the basis of a phase difference between the pair of focus detection signal sequences obtained in this way, or on the basis of a defocus amount calculated from the phase difference. It should be noted that a distance between the pair of focus detection pixel groups 610 a and 610 b illustrated in FIG. 3(c) is larger than a distance between the pair of focus detection pixel groups 620 a and 620 b illustrated in FIG. 3(d). Therefore, an aperture angle formed by the pair of light fluxes illustrated in FIG. 3(c) is larger than an aperture angle formed by the pair of light fluxes illustrated in FIG. 3(d).
- Although the present invention can be applied to both cases, it is preferable to apply the present invention to a case where a configuration having a large aperture angle, as illustrated in FIG. 3(c), is employed. This is because it becomes easier to detect a difference between changes in focus detection signal values caused by a scene including distant-subject images and near-subject images, as described later, as the aperture angle increases.
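The dependence on the aperture angle can be made concrete with a simple geometric model: for split-pupil phase detection, the image shift between the pair of signal sequences grows roughly in proportion to the defocus times the tangent of the aperture angle. This model, the function, and the numbers below are illustrative assumptions, not the patent's conversion formula.

```python
# Illustrative geometric model (an assumption, not the patent's formula):
# image_shift ~ defocus * tan(aperture_angle), hence
# defocus ~ image_shift / tan(aperture_angle).

import math

def defocus_amount(image_shift_um, aperture_angle_rad):
    """Approximate defocus (um) from the detected image shift (um)."""
    return image_shift_um / math.tan(aperture_angle_rad)

# A larger aperture angle (FIG. 3(c)) yields a larger shift per unit defocus,
# which is why differences in signal changes are easier to detect there.
print(round(defocus_amount(5.0, math.atan(0.1)), 1))  # -> 50.0
```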
FIG. 4 is a flowchart of the focus detection process performed by thecontroller 3. Thecontroller 3 is a computer including a CPU and a memory, for example. The CPU executes a computer program stored in the memory to perform process steps constituting the focus detection process illustrated inFIG. 4 . - The process steps constituting the focus detection process illustrated in
FIG. 4 will be explained with reference to an exemplaryphotographic screen 250 illustrated inFIG. 5 . InFIG. 5 , thephotographic screen 250 includes two subject images formed by the photographing optical system 4: asubject image 210 of a background including trees and ansubject image 220 of a person. Afocus detection area 200 is also displayed in thephotographic screen 250. Thesubject image 210 of the background including trees located far from the image-capturingapparatus 100 and thesubject image 220 of the person located near the image-capturingapparatus 100 are included also in thefocus detection area 200. Although typically a plurality offocus detection areas 200 is displayed on thephotographic screen 250, only onefocus detection area 200 specified by an user in step S101 inFIG. 4 described later is illustrated in thephotographic screen 250 inFIG. 5 . - When the focus detection process according to this embodiment is started by the user via an operating member which is not illustrated, the
controller 3 makes a decision as to whether or not afocus detection area 200 is specified, in step S101. If No, the process step in step S101 is repeated until the decision result is Yes. If Yes, thecontroller 3 causes the process to proceed to step S102, with the specifiedfocus detection area 200 being a target of the process. The operating member described above may be an automatic focus detection activation switch, for example, and the process may be started by turning on the automatic focus detection activation switch. Alternatively, the operating member may be a shutter release button and the process may be started by setting the shutter release button in a halfway-press state. - In step S102, the
controller 3 performs a photoelectric conversion control of the focus detection sensor 6. The photoelectric conversion control of the focus detection sensor 6 includes, for example, an exposure control of the plurality of focus detection pixels 60 arranged in the focus detection sensor 6, a read-out control of the plurality of focus detection signals, and/or an amplification control of the plurality of focus detection signals that have been read out. - In step S103, the
controller 3 obtains a pair of focus detection signal sequences on the basis of the plurality of focus detection signals which have been read out in step S102. - In step S104, the
controller 3 performs a defocus amount determination process to determine a focus adjustment defocus amount, i.e., a defocus amount used for focus adjustment. Details of the defocus amount determination process will be described later with reference to FIG. 6. - In step S105, the
controller 3 makes a decision as to whether or not the photographing optical system 4 is located at a focus position, depending on whether or not the focus adjustment defocus amount determined in step S104 is approximately zero. If Yes, the process ends. If No, the process proceeds to step S106. It is also possible that the controller 3 makes a decision as to the reliability of the focus adjustment defocus amount determined in step S104, and a scan operation is performed if it is determined that the focus adjustment defocus amount is unreliable and focus detection is impossible. If the subject image is no longer present within the focus detection area 200 after the start of the focus adjustment lens drive, the focus adjustment lens may be driven on the basis of the focus adjustment defocus amount that was most recently detected, before the process ends. - In step S106, the
controller 3 calculates the lens drive amount for the photographing optical system 4 on the basis of the focus adjustment defocus amount determined in step S104. - In step S107, the
controller 3 sends the lens drive amount calculated in step S106 to the focus adjustment device 8 and controls the focus adjustment device 8 so that the focus adjustment device 8 drives the lens of the photographing optical system 4 via the lens drive motor 5. Upon completion of the process in step S107, the process returns to step S101. -
FIG. 6 is a flowchart detailing the defocus amount determination process performed by the controller 3 in step S104 in FIG. 4. The pair of focus detection signal sequences obtained in step S103 in FIG. 4 will be denoted by {a[i]} and {b[j]}. An initial value of a relative phase shift amount k (where k = i − j) between the pair of focus detection signal sequences {a[i]} and {b[j]} is 0. As the magnitude of the phase shift amount k becomes closer to the phase difference amount of the pair of focus detection signal sequences {a[i]} and {b[j]}, the correlation between the pair of focus detection signal sequences {a[i]} and {b[j]} becomes higher. When the correlation between the pair of focus detection signal sequences {a[i]} and {b[j]} is highest, the correlation amount C(k) of the pair of focus detection signal sequences {a[i]} and {b[j]} expressed by the following equation (1) takes its minimum value. The summation on the right-hand side of equation (1) is repeated a number of times equal to the number of signals in the pair of focus detection signal sequences {a[i]} and {b[j]}. -
C(k) = Σ|a[i] − b[j]|  (1) - In step S201, the
controller 3 determines the minimum value C(k)_min of the correlation amount C(k) by sequentially calculating the correlation amount C(k) while shifting the phases of the pair of focus detection signal sequences {a[i]} and {b[j]} obtained in step S103 in FIG. 4 relative to each other by a predetermined shift amount for each calculation. The controller 3 obtains a specific shift amount X0 of the pair of focus detection signal sequences {a[i]} and {b[j]} that provides the minimum value C(k)_min, and calculates a defocus amount D0 as a focus detection parameter on the basis of the specific shift amount X0. - In step S202, the
controller 3 makes a decision as to whether or not the minimum value C(k)_min of the correlation amount C(k) determined in step S201 is smaller than a predetermined threshold C(k)_th. If Yes, the controller 3 causes the process to proceed to step S208. As illustrated in FIG. 5, if a distant-subject image and a near-subject image are both included in the focus detection area 200, the pair of focus detection signal sequences {a[i]} and {b[j]} are not identical to each other over the whole focus detection area, and there are partial sections where they do not coincide (as described later with reference to FIG. 8), even if the phases of the pair of focus detection signal sequences {a[i]} and {b[j]} are shifted relative to each other by the specific shift amount X0 which provides the minimum value C(k)_min of the correlation amount C(k). The minimum value C(k)_min of the correlation amount C(k) therefore deviates from 0. In contrast, if only the distant-subject image or only the near-subject image is present in the focus detection area 200, or if the difference in distance between the distant subject and the near subject is small, the minimum value C(k)_min of the correlation amount C(k) is close to 0. In such a case, there is no need to apply the present invention. Thus, if the decision result is No in step S202, the controller 3 causes the process to proceed to step S203. - In step S203, the
controller 3 makes a decision as to whether or not a brightness of the subject image, including the subject image 210 of the background including trees and the subject image 220 of the person, is lower than a predetermined brightness in the focus detection area 200 specified in step S101 in FIG. 4. If Yes, i.e., if the brightness of the subject image is lower than the predetermined brightness, it is likely that the amplification control was performed with a large amplification degree in step S102 in FIG. 4. The amplification control with a large amplification degree causes noise superimposed on the focus detection signals to be amplified. In such a case, the present invention is not applied, because it is likely that an error occurs in the calculation of the differences between focus detection signals described later in the explanation of step S204. For this reason, the controller 3 causes the process to proceed to step S208. If the decision result is No in step S203, the controller 3 causes the process to proceed to step S204. - A magnitude of the amplification degree of the amplification control performed in step S102 in
FIG. 4 may be used as a brightness decision index in step S203 in FIG. 6, for example. If the amplification degree is lower than a predetermined value, the controller 3 decides that the brightness of the overall subject image is not lower than the predetermined brightness; in other words, the decision result is No in step S203. - In step S204, the
controller 3 sequentially calculates absolute values |a[i]−b[j]| of differences between focus detection signals corresponding to each other in the pair of focus detection signal sequences {a[i]} and {b[j]} that have been shifted relative to each other by the specific shift amount X0 obtained in step S201, i.e., the pair of focus detection signal sequences {a[i]} and {b[j]} that have been shifted so as to have the highest correlation, so that a plurality of differences is obtained. - In step S205, the
controller 3 divides the pair of focus detection signal sequences {a[i]} and {b[j]} that have been shifted relative to each other by the specific shift amount X0 obtained in step S201 into two pairs of partial signal sequences: a pair of partial signal sequences corresponding to the distant-subject image (the subject image 210 of the background including trees) and a pair of partial signal sequences corresponding to the near-subject image (the subject image 220 of the person). For example, the controller 3 performs the division process in step S205 depending on whether or not each of the plurality of differences obtained from the sequential calculation of the absolute values |a[i]−b[j]| of the differences in step S204 is equal to or higher than the average value of the plurality of differences. Details thereof will be described later with reference to FIG. 9. - In step S206, the
controller 3 calculates a phase difference amount between the partial signal sequences in each of the two pairs of partial signal sequences obtained in step S205. The two phase difference amounts X1 and X2 calculated in this way, each corresponding to a respective one of the two pairs of partial signal sequences, are a type of focus detection parameter. Although the process steps subsequent to step S207 may therefore be performed on the basis of the two phase difference amounts X1 and X2, in this embodiment the controller 3 further calculates two defocus amounts D1 and D2 on the basis of the two phase difference amounts X1 and X2. The two defocus amounts D1 and D2, each corresponding to a respective one of the two pairs of partial signal sequences, are also a type of focus detection parameter. - In step S207, the
controller 3 determines the nearer-side defocus amount of the two defocus amounts D1 and D2 calculated in step S206 as the focus adjustment defocus amount. The nearer-side defocus amount is determined on the basis of the fact that the focus position of the photographing optical system 4 for the subject nearest to the image-capturing apparatus 100 is located at the farthest position from the photographing optical system 4. The defocus amount corresponding to the nearer subject image (the subject image 220 of the person) of the two defocus amounts D1 and D2 calculated in step S206 is the nearer-side defocus amount. Upon completion of step S207, the process ends and the controller 3 causes the focus detection process in FIG. 4 to proceed to step S105. - In step S208, which is performed if the decision result is Yes in step S202 or S203, the
controller 3 determines the defocus amount D0, calculated on the basis of the specific shift amount X0 of the pair of focus detection signal sequences {a[i]} and {b[j]} obtained in step S201, as the focus adjustment defocus amount. Upon completion of step S208, the process ends and the controller 3 causes the focus detection process in FIG. 4 to proceed to step S105. -
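As a rough illustration of the correlation scan of step S201 and equation (1), the Python sketch below computes C(k) over a range of integer shifts and returns the specific shift amount X0 that minimizes it. The function names, test signals, and shift range are assumptions made for this sketch; a practical implementation would additionally interpolate between shifts for sub-pixel accuracy and convert X0 into the defocus amount D0 via an optical conversion coefficient, neither of which is detailed in this passage.

```python
def correlation(a, b, k):
    """C(k) = sum of |a[i] - b[i - k]| over the overlapping samples,
    with k = i - j as in equation (1)."""
    lo = max(0, k)
    hi = min(len(a), len(b) + k)
    return sum(abs(a[i] - b[i - k]) for i in range(lo, hi))

def best_shift(a, b, max_shift):
    """Scan integer shifts and return (X0, C(k)_min)."""
    c = {k: correlation(a, b, k) for k in range(-max_shift, max_shift + 1)}
    x0 = min(c, key=c.get)  # shift giving the smallest correlation amount
    return x0, c[x0]
```

For example, a pulse in {b[j]} displaced two positions to the right of the same pulse in {a[i]} yields X0 = −2 with C(k)_min = 0, i.e., the sequences coincide exactly at that shift.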
FIG. 7 corresponds to the exemplary photographic screen 250 illustrated in FIG. 5 and is a graph illustrating variations of the focus detection signal values of the pair of focus detection signal sequences {a[i]} and {b[j]} with respect to focus detection pixel positions in the focus detection area 200, which has a length of approximately 50 pixels in the horizontal direction. The pair of focus detection signal sequences {a[i]} and {b[j]} illustrated in FIG. 7, namely the pair of focus detection signal sequences 655a and 655b, are obtained in step S103 in FIG. 4. In FIG. 7, a section 310 of focus detection pixel positions 1 to 13 in the horizontal direction in the focus detection area 200 corresponds to the subject image 220 of the person located near the image-capturing apparatus 100 illustrated in FIG. 5. In this section, a phase of one focus detection signal sequence 655a of the pair of focus detection signal sequences 655a and 655b is shifted relative to a phase of the other focus detection signal sequence 655b. In FIG. 7, a section 320 of focus detection pixel positions 14 to 46 in the horizontal direction in the focus detection area 200 corresponds to the subject image 210 of the background including trees located far from the image-capturing apparatus 100 illustrated in FIG. 5. In this section, the phase of the one focus detection signal sequence 655a of the pair of focus detection signal sequences 655a and 655b is also shifted relative to the phase of the other focus detection signal sequence 655b. FIG. 8 illustrates the pair of focus detection signal sequences 655a and 655b shifted relative to each other by the specific shift amount X0 obtained in step S201 in FIG. 6. -
FIG. 8 is a graph illustrating a state where the pair of focus detection signal sequences {a[i]} and {b[j]} in FIG. 7 are shifted relative to each other by the specific shift amount X0 which provides the minimum value C(k)_min of the correlation amount C(k) of the pair of focus detection signal sequences {a[i]} and {b[j]}. It will be assumed that the correlation between the pair of focus detection signal sequences 655a and 655b in the section 320 corresponding to the subject image 210 of the background including trees is higher than that in the section 310 corresponding to the subject image 220 of the person. In such a case, the specific shift amount X0 obtained in step S201 in FIG. 6 may be significantly affected by the subject image 210 of the background including trees. Therefore, if the pair of focus detection signal sequences {a[i]} and {b[j]} in FIG. 7 are shifted relative to each other by the specific shift amount X0, the pair of focus detection signal sequences 655a and 655b substantially coincide with each other in the section 320 of the focus detection pixel positions 14 to 46 in the horizontal direction in the focus detection area 200 corresponding to the subject image 210 of the background including trees, as illustrated in FIG. 8. As illustrated in FIG. 8, in the section 310 of the focus detection pixel positions 1 to 13 in the horizontal direction in the focus detection area 200 corresponding to the subject image 220 of the person, there are certain phase differences between the pair of focus detection signal sequences 655a and 655b. FIG. 9 explains a way of determining a boundary 350 that divides the whole section 300 in the horizontal direction of the focus detection area 200 into the section 310 corresponding to the subject image 220 of the person and the section 320 corresponding to the subject image 210 of the background including trees as described above, wherein the boundary 350 is located between focus detection pixel positions 13 and 14. -
FIG. 9 is a graph explaining the division process for the pair of focus detection signal sequences {a[i]} and {b[j]} illustrated in FIG. 7 and corresponds to the process step in step S204 in FIG. 6. FIG. 9 illustrates variations of a plurality of differences 671, each difference being obtained for each focus detection pixel position in the horizontal direction in the focus detection area 200 by sequentially calculating absolute values |a[i]−b[j]| of differences between focus detection signals corresponding to each other in the pair of focus detection signal sequences 655a and 655b in the state of FIG. 8, where the pair of focus detection signal sequences {a[i]} and {b[j]} in FIG. 7 have been shifted relative to each other by the specific shift amount X0. - In the
section 320 corresponding to the subject image 210 of the background including trees described above within the whole section 300 in the horizontal direction of the focus detection area 200, variations of the absolute values |a[i]−b[j]| of the differences with respect to change in focus detection pixel position in the horizontal direction in the focus detection area 200 are generally small. In the section 310 corresponding to the subject image 220 of the person described above, the absolute value |a[i]−b[j]| of the difference considerably increases and decreases for each positional change from one focus detection pixel to the next in the horizontal direction in the focus detection area 200. Now, the average value of the plurality of differences 671 over the whole section 300 in the horizontal direction of the focus detection area 200 is determined. Then, none of the absolute values |a[i]−b[j]| of the differences is equal to or higher than the average value in the section 320 corresponding to the subject image 210 of the background including trees, while there is a large number of absolute values |a[i]−b[j]| of differences that are equal to or higher than the average value in the section 310 corresponding to the subject image 220 of the person. Therefore, in FIG. 9, the section in which each and every one of the plurality of differences 671 is lower than the average value of the plurality of differences 671 with respect to change in focus detection pixel position in the horizontal direction in the focus detection area 200, i.e., the section of the focus detection pixel positions 14 to 46, is specified as the section 320, and the boundary 350 can be specified so as to be located between the focus detection pixel positions 13 and 14. The section of the focus detection pixel positions 1 to 13 within the whole section 300, which is on the opposite side of the boundary 350 from the section 320, can be specified as the section 310. On the basis of this result, in step S205 in FIG. 6, the pair of focus detection signal sequences 655a and 655b in FIG. 7 indicating a current focusing condition can be divided into the pair of partial signal sequences in the section 310 of the focus detection pixel positions 1 to 13 corresponding to the subject image 220 of the person, and the pair of partial signal sequences in the section 320 of the focus detection pixel positions 14 to 46 corresponding to the subject image 210 of the background including trees. -
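The average-comparison rule of steps S204 and S205 described above can be sketched as follows. This is an illustrative reading of the text rather than code from the patent; it assumes the two sequences have already been aligned by the specific shift amount X0, and the function and variable names are hypothetical.

```python
def split_sections(a_shifted, b_shifted):
    """Step S204: per-position differences |a[i] - b[j]| of the aligned pair.
    Step S205: positions at or above the average form the mismatched
    (near-subject) section; the rest form the well-matched section."""
    diffs = [abs(x - y) for x, y in zip(a_shifted, b_shifted)]
    avg = sum(diffs) / len(diffs)
    mismatched = [i for i, d in enumerate(diffs) if d >= avg]  # e.g. section 310
    matched = [i for i, d in enumerate(diffs) if d < avg]      # e.g. section 320
    return mismatched, matched
```

In the FIG. 9 scenario, the mismatched positions would cluster at pixel positions 1 to 13 (the person) and the matched positions at 14 to 46 (the background), placing the boundary 350 between them.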
FIG. 10 is a graph illustrating variations of focus detection signal values of a pair of partial signal sequences in a state where the correlation amount of the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 13 corresponding to the subject image 220 of the person is the minimum value. As described above, in step S205 in FIG. 6, a pair of partial signal sequences in the section of the focus detection pixel positions 1 to 13 corresponding to the subject image 220 of the person is obtained by dividing the pair of focus detection signal sequences 655a and 655b in FIG. 7. A phase difference amount X1 of the pair of partial signal sequences is obtained by performing a correlation operation while shifting the phases of the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 13 corresponding to the subject image 220 of the person. FIG. 10 illustrates a state where the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 13 corresponding to the subject image 220 of the person are shifted relative to each other by the phase difference amount X1. In step S206 in FIG. 6, the defocus amount D1 is calculated on the basis of the phase difference amount X1. -
FIG. 11 is a graph illustrating variations of focus detection signal values of a pair of partial signal sequences in a state where the correlation amount of the pair of partial signal sequences in the section of the focus detection pixel positions 14 to 46 corresponding to the subject image 210 of the background including trees is the minimum value. As described above, in step S205 in FIG. 6, a pair of partial signal sequences in the section of the focus detection pixel positions 14 to 46 corresponding to the subject image 210 of the background including trees is obtained by dividing the pair of focus detection signal sequences 655a and 655b in FIG. 7. A phase difference amount X2 of the pair of partial signal sequences is obtained by performing a correlation operation while shifting the phases of the pair of partial signal sequences in the section of the focus detection pixel positions 14 to 46 corresponding to the subject image 210 of the background including trees. FIG. 11 illustrates a state where the pair of partial signal sequences in the section of the focus detection pixel positions 14 to 46 corresponding to the subject image 210 of the background including trees are shifted relative to each other by the phase difference amount X2. In step S206 in FIG. 6, the defocus amount D2 is calculated on the basis of the phase difference amount X2. -
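Steps S206 and S207 can be sketched end to end as follows: a simple integer-shift correlation stands in for the phase-difference computation for each pair of partial signal sequences, and an assumed linear coefficient converts a phase difference amount into a defocus amount. The sign convention (a larger defocus value corresponds to the nearer subject) and all names are assumptions made for this sketch only; the patent does not fix them in this passage.

```python
def phase_shift(a, b, max_shift=3):
    """Return the integer shift k minimizing sum |a[i] - b[i - k]|
    over the overlap (a stand-in for X1 or X2 of step S206)."""
    def c(k):
        lo, hi = max(0, k), min(len(a), len(b) + k)
        return sum(abs(a[i] - b[i - k]) for i in range(lo, hi))
    return min(range(-max_shift, max_shift + 1), key=c)

def adjustment_defocus(pair_near, pair_far, k_to_defocus=1.0):
    """Steps S206-S207: compute D1 and D2 from the two pairs of partial
    signal sequences and adopt the nearer-side defocus amount."""
    x1 = phase_shift(*pair_near)          # phase difference of near-subject pair
    x2 = phase_shift(*pair_far)           # phase difference of far-subject pair
    d1, d2 = k_to_defocus * x1, k_to_defocus * x2
    # Assumed convention: the larger value is the nearer-side defocus amount.
    return max(d1, d2)
```

With an aligned far-subject pair (X2 = 0) and a near-subject pair displaced by two positions, the sketch returns the defocus amount derived from the near-subject phase difference, mirroring the selection in step S207.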
FIG. 12 is a flowchart of the image-capturing process performed by the controller 3. As described above, the controller 3 is a computer including a CPU and a memory, for example. The CPU executes a computer program stored in the memory to perform the process steps constituting the image-capturing process illustrated in FIG. 12. - In step S501, the
controller 3 makes a decision as to whether or not the user has issued an image-capturing command via the operating member. If No, the process step in step S501 is repeated until the decision result is Yes. If Yes, the controller 3 causes the process to proceed to step S502. The operating member may be a shutter release button, for example, and the decision result is Yes in step S501 if the shutter release button is set in a full-press state. - In step S502, the
controller 3 performs a photoelectric conversion control of the image sensor 2. The photoelectric conversion control of the image sensor 2 includes, for example, an exposure control of the plurality of image-capturing pixels arranged in the image sensor 2, a read-out control of the plurality of image-capturing signals, and/or an amplification control of the plurality of image-capturing signals that have been read out. - In step S503, the
controller 3 obtains the plurality of image-capturing signals which have been read out in step S502 and on which the amplification control has been performed. - In step S504, the
controller 3 generates an image on the basis of the plurality of image-capturing signals obtained in step S503. - In step S505, the
controller 3 records the image generated in step S504 in the storage device 15. Upon completion of step S505, the process ends. - The
focus detection device 50 according to this embodiment includes the focus detection sensor 6 and the controller 3, as described above. The focus detection sensor 6 receives the pair of light fluxes having passed through the pair of pupil regions of the photographing optical system 4 and outputs the pair of focus detection signal sequences 655a and 655b. The controller 3 sequentially calculates absolute values |a[i]−b[j]| of differences between focus detection signals corresponding to each other in the pair of focus detection signal sequences 655a and 655b, so that the plurality of differences 671 is obtained. Based on the plurality of differences 671 obtained here, the controller 3 divides the pair of focus detection signal sequences 655a and 655b into a pair of partial signal sequences corresponding to the subject image 220 of the person located near the image-capturing apparatus 100 and a pair of partial signal sequences corresponding to the subject image 210 of the background including trees located far from the image-capturing apparatus 100. The controller 3 calculates the defocus amount D1 in accordance with the phase difference amount X1 of the pair of partial signal sequences corresponding to the subject image 220 of the person and the defocus amount D2 in accordance with the phase difference amount X2 of the pair of partial signal sequences corresponding to the subject image 210 of the background including trees. The controller 3 determines either one of the defocus amounts D1 and D2 as the focus adjustment defocus amount used for the focus adjustment. Thus, either the subject image 210 of the background including trees located far from the image-capturing apparatus 100 or the subject image 220 of the person located near the image-capturing apparatus 100 can be brought into focus. - The
controller 3 calculates the defocus amounts D1 and D2 and determines the nearer-side defocus amount of the defocus amounts D1 and D2 as the focus adjustment defocus amount. Thus, the subject image 220 of the person located near the image-capturing apparatus 100 can be brought into focus. - (1) Although in the above-described embodiment the present invention is applied to the example where two subject images are included in the
focus detection area 200 as illustrated in FIG. 5, the present invention may also be applied to a case where three or more subject images are included in the focus detection area 200. This will be described with reference to FIG. 13. -
FIG. 13 is a view illustrating an example where three subject images 210, 220, and 230 are included in the focus detection area 200 in the photographic screen 250. The photographic screen 250 and the focus detection area 200 include the subject image 210 of a background including trees, the subject image 220 of one person, and the subject image 230 of another person. Although typically a plurality of focus detection areas 200 is displayed on the photographic screen 250, only the one focus detection area 200 specified by a user is illustrated in the photographic screen 250 in FIG. 13. On the basis of the example illustrated in FIG. 13, the focus detection process performed by the controller 3 will be described with reference to FIGS. 14 to 19. -
FIG. 14 corresponds to the exemplary photographic screen 250 illustrated in FIG. 13 and is a graph illustrating variations of the focus detection signal values of the pair of focus detection signal sequences {a[i]} and {b[j]} with respect to focus detection pixel positions in the focus detection area 200, which has a length of approximately 50 pixels in the horizontal direction. The pair of focus detection signal sequences {a[i]} and {b[j]} illustrated in FIG. 14, namely the pair of focus detection signal sequences 655a and 655b, are obtained in step S103 in FIG. 4. In FIG. 14, a section 310 of focus detection pixel positions 1 to 12 in the horizontal direction in the focus detection area 200 corresponds to the subject image 220 of the one person located near the image-capturing apparatus 100 illustrated in FIG. 13. In this section, a phase of one focus detection signal sequence 655a of the pair of focus detection signal sequences 655a and 655b is shifted relative to a phase of the other focus detection signal sequence 655b. In FIG. 14, a section 320 of focus detection pixel positions 13 to 28 in the horizontal direction in the focus detection area 200 corresponds to the subject image 210 of the background including trees located far from the image-capturing apparatus 100 illustrated in FIG. 13. In this section, the phase of the one focus detection signal sequence 655a of the pair of focus detection signal sequences 655a and 655b is shifted relative to the phase of the other focus detection signal sequence 655b. In FIG. 14, a section 330 of focus detection pixel positions 29 to 46 in the horizontal direction in the focus detection area 200 corresponds to the subject image 230 of the other person, who is the subject nearest to the image-capturing apparatus 100 illustrated in FIG. 13. In this section, the phase of the one focus detection signal sequence 655a of the pair of focus detection signal sequences 655a and 655b is also shifted relative to the phase of the other focus detection signal sequence 655b. FIG. 15 illustrates the pair of focus detection signal sequences 655a and 655b shifted relative to each other by the specific shift amount X0 obtained in step S201 in FIG. 6. -
FIG. 15 is a graph illustrating a state where the pair of focus detection signal sequences {a[i]} and {b[j]} in FIG. 14 are shifted relative to each other by the specific shift amount X0 which provides the minimum value C(k)_min of the correlation amount C(k) of the pair of focus detection signal sequences {a[i]} and {b[j]}. In FIG. 13, in the horizontal direction of the focus detection area 200, the subject image 220 of the one person is located on the left side of the focus detection area 200 in the figure, the subject image 210 of the background including trees is located in the center, and the subject image 230 of the other person is located on the right side. It will be assumed that the correlation between the pair of focus detection signal sequences 655a and 655b in the section 320 corresponding to the subject image 210 of the background including trees is higher than that in the section 310 corresponding to the subject image 220 of the one person and in the section 330 corresponding to the subject image 230 of the other person, because the contrast in the section 320 is higher than that in the section 310 and in the section 330. In such a case, the specific shift amount X0 obtained by means of the process according to step S201 in FIG. 6 may be significantly affected by the subject image 210 of the background including trees. In such a case, if the pair of focus detection signal sequences {a[i]} and {b[j]} in FIG. 14 are shifted relative to each other by the specific shift amount X0, the pair of focus detection signal sequences 655a and 655b substantially coincide with each other in the section 320 of the focus detection pixel positions 13 to 28 in the horizontal direction in the focus detection area 200 corresponding to the subject image 210 of the background including trees, as illustrated in FIG. 15. - As illustrated in
FIG. 15, in the section 310 of the focus detection pixel positions 1 to 12 in the horizontal direction in the focus detection area 200 corresponding to the subject image 220 of the one person, there are certain phase differences between the pair of focus detection signal sequences 655a and 655b. Also as illustrated in FIG. 15, in the section 330 of the focus detection pixel positions 29 to 46 in the horizontal direction in the focus detection area 200 corresponding to the subject image 230 of the other person, there are certain phase differences between the pair of focus detection signal sequences 655a and 655b. FIG. 16 explains a way of determining a boundary 350 that divides the whole section 300 in the horizontal direction of the focus detection area 200 into the section 310 corresponding to the subject image 220 of the one person and the section 320 corresponding to the subject image 210 of the background including trees as described above, wherein the boundary 350 is located between focus detection pixel positions 12 and 13, and further determining a boundary 360 that divides the section 300 into the section 320 corresponding to the subject image 210 of the background including trees and the section 330 corresponding to the subject image 230 of the other person. -
FIG. 16 is a graph explaining the division process for the pair of focus detection signal sequences {a[i]} and {b[j]} illustrated in FIG. 14 and corresponds to a process according to step S204 in FIG. 6. FIG. 16 illustrates variations of a plurality of differences 671, each difference being obtained for each focus detection pixel position in the horizontal direction in the focus detection area 200 by sequentially calculating absolute values |a[i]−b[j]| of differences between focus detection signals corresponding to each other in the pair of focus detection signal sequences 655a and 655b in the state of FIG. 15, where the pair of focus detection signal sequences {a[i]} and {b[j]} in FIG. 14 have been shifted relative to each other by the specific shift amount X0. In the section 320 corresponding to the subject image 210 of the background including trees described above within the whole section 300 in the horizontal direction of the focus detection area 200, variations of the absolute values |a[i]−b[j]| of the differences with respect to change in focus detection pixel position in the horizontal direction in the focus detection area 200 are generally small. - In the
section 310 corresponding to the subject image 220 of the one person and the section 330 corresponding to the subject image 230 of the other person as described above, the absolute value |a[i]−b[j]| of the difference considerably increases and decreases for each positional change from one focus detection pixel to the next in the horizontal direction in the focus detection area 200. Now, the average value of the plurality of differences 671 over the whole section 300 in the horizontal direction of the focus detection area 200 is determined. Then, none of the absolute values |a[i]−b[j]| of the differences is equal to or higher than the average value in the section 320 corresponding to the subject image 210 of the background including trees, while there is a large number of absolute values |a[i]−b[j]| of differences that are equal to or higher than the average value in the section 310 corresponding to the subject image 220 of the one person and in the section 330 corresponding to the subject image 230 of the other person. Therefore, in FIG. 16, the section in which each and every one of the plurality of differences 671 is lower than the average value of the plurality of differences 671 with respect to change in focus detection pixel position in the horizontal direction in the focus detection area 200, i.e., the section of the focus detection pixel positions 13 to 28, is specified as the section 320, and the boundaries 350 and 360 can be specified accordingly. The section of the focus detection pixel positions 1 to 12 in the whole section 300, which is on the opposite side of the boundary 350 from the section 320, can be specified as the section 310, and the section of the focus detection pixel positions 29 to 46 in the whole section 300, which is on the opposite side of the boundary 360 from the section 320, can be specified as the section 330. On the basis of this result, in the process of dividing into three parts which can be performed according to step S205 in FIG. 6, the pair of focus detection signal sequences 655a and 655b in FIG. 14 indicating a current focusing condition can be divided into the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 12 corresponding to the subject image 220 of the one person, the pair of partial signal sequences in the section of the focus detection pixel positions 13 to 28 corresponding to the subject image 210 of the background including trees, and the pair of partial signal sequences in the section of the focus detection pixel positions 29 to 46 corresponding to the subject image 230 of the other person. -
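For the three-subject case, the same average-comparison rule partitions the whole section 300 into contiguous runs of positions that are below, or not below, the average difference, which yields the sections 310, 320, and 330 and locates the boundaries 350 and 360 between the runs. A hypothetical sketch, using itertools.groupby to group the runs (names and the test data are illustrative, not from the patent):

```python
from itertools import groupby

def split_runs(diffs):
    """Partition per-position differences into contiguous runs.
    Each run is (start, end, matched?), where matched? is True when every
    difference in the run is below the average (a well-aligned section)."""
    avg = sum(diffs) / len(diffs)
    runs, pos = [], 0
    for below, grp in groupby(diffs, key=lambda d: d < avg):
        n = len(list(grp))
        runs.append((pos, pos + n - 1, below))
        pos += n
    return runs
```

In the FIG. 16 scenario this would produce three runs, with the middle matched run corresponding to the section 320 and the two flanking mismatched runs to the sections 310 and 330.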
FIG. 17 is a graph illustrating variations of the focus detection signal values of a pair of partial signal sequences in a state where a correlation amount in the section of the focus detection pixel positions 1 to 12 corresponding to the subject image 220 of the one person is the minimum value. As described above, in the process of dividing into three parts which can be performed according to step S205 in FIG. 6, a pair of partial signal sequences in the section of the focus detection pixel positions 1 to 12 corresponding to the subject image 220 of the one person is obtained by dividing the pair of focus detection signal sequences illustrated in FIG. 14. A phase difference amount X1 of the pair of partial signal sequences is obtained by performing a correlation operation while shifting the phases of the pair of partial signal sequences in this section. FIG. 17 illustrates a state where the pair of partial signal sequences in the section of the focus detection pixel positions 1 to 12 corresponding to the subject image 220 of the one person are shifted relative to each other by the phase difference amount X1. In the process according to step S206 in FIG. 6, the defocus amount D1 is calculated on the basis of the phase difference amount X1. -
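The correlation operation performed while shifting the phases of a pair of partial signal sequences can be sketched as follows. This is a minimal sum-of-absolute-differences search and only an assumed form; the patent does not specify the correlation function, the shift range, or any sub-pixel interpolation:

```python
import numpy as np

def phase_difference(a, b, max_shift):
    """Return the integer shift that minimizes the correlation amount
    (mean absolute difference over the overlapping samples) between a pair
    of partial signal sequences, analogous to obtaining a phase difference
    amount such as X1 for one section."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    best_shift, best_corr = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            corr = np.abs(a[:len(a) - s] - b[s:]).mean()
        else:
            corr = np.abs(a[-s:] - b[:len(b) + s]).mean()
        if corr < best_corr:
            best_shift, best_corr = s, corr
    return best_shift
```

The resulting shift would then be converted into a defocus amount such as D1 by a conversion factor determined by the photographing optical system; that conversion is outside this sketch.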
FIG. 18 is a graph illustrating variations of the focus detection signal values of a pair of partial signal sequences in a state where a correlation amount in the section of the focus detection pixel positions 13 to 28 corresponding to the subject image 210 of the background including trees is the minimum value. As described above, in the process of dividing into three parts which can be performed according to step S205 in FIG. 6, a pair of partial signal sequences in the section of the focus detection pixel positions 13 to 28 corresponding to the subject image 210 of the background including trees is obtained by dividing the pair of focus detection signal sequences illustrated in FIG. 14. A phase difference amount X2 of the pair of partial signal sequences is obtained by performing a correlation operation while shifting the phases of the pair of partial signal sequences in this section. FIG. 18 illustrates a state where the pair of partial signal sequences in the section of the focus detection pixel positions 13 to 28 corresponding to the subject image 210 of the background including trees are shifted relative to each other by the phase difference amount X2. In the process according to step S206 in FIG. 6, the defocus amount D2 is calculated on the basis of the phase difference amount X2. -
FIG. 19 is a graph illustrating variations of the focus detection signal values of a pair of partial signal sequences in a state where a correlation amount in the section of the focus detection pixel positions 29 to 46 corresponding to the subject image 230 of the other person is the minimum value. As described above, in the process of dividing into three parts which can be performed according to step S205 in FIG. 6, a pair of partial signal sequences in the section of the focus detection pixel positions 29 to 46 corresponding to the subject image 230 of the other person is obtained by dividing the pair of focus detection signal sequences illustrated in FIG. 14. A phase difference amount X3 of the pair of partial signal sequences is obtained by performing a correlation operation while shifting the phases of the pair of partial signal sequences in this section. FIG. 19 illustrates a state where the pair of partial signal sequences in the section of the focus detection pixel positions 29 to 46 corresponding to the subject image 230 of the other person are shifted relative to each other by the phase difference amount X3. In the process according to step S206 in FIG. 6, the defocus amount D3 is calculated on the basis of the phase difference amount X3. - (2) In step S205 in
FIG. 6 according to one embodiment as described above, the controller 3 performs the division process depending on whether or not each of the plurality of differences 671 illustrated in FIG. 9, obtained from the sequential calculation of the absolute values |a[i]−b[j]| of the differences in step S204, is equal to or higher than the average value of the plurality of differences 671. However, other methods may be used. For example, the controller 3 calculates differential values of the plurality of differences 671. A differential value is obtained by calculating the difference between the absolute values |a[i]−b[j]| of two differences at adjacent focus detection pixel positions in the horizontal direction in the focus detection area 200, over the whole range of the focus detection pixel positions 1 to 46. FIG. 22 is a graph illustrating the differences between the absolute values |a[i]−b[j]| of the differences illustrated in FIG. 9. - With reference to
FIG. 9, in the section 320 corresponding to the subject image 210 of the background including trees within the whole section 300 in the horizontal direction of the focus detection area 200, variations of the absolute values |a[i]−b[j]| of the differences with respect to change in focus detection pixel position in the horizontal direction in the focus detection area 200 are generally small, as described above. Therefore, as illustrated in FIG. 22, the magnitudes of the differential values in the section 320 are smaller than the predetermined value V, which is close to 0. In the section 310 corresponding to the subject image 220 of the person described above, the absolute value |a[i]−b[j]| of the difference increases and decreases considerably with each positional change from one focus detection pixel to the next in the horizontal direction in the focus detection area 200. Therefore, as illustrated in FIG. 22, the magnitudes of the differential values in the section 310 are equal to or higher than the predetermined value V. In other words, in step S205 in FIG. 6, the controller 3 may divide the pair of focus detection signal sequences into a pair of partial signal sequences corresponding to the subject image 220 of the person located near the image-capturing apparatus 100 and a pair of partial signal sequences corresponding to the subject image 210 of the background including trees located far from the image-capturing apparatus 100, depending on whether or not the magnitude of a differential value, i.e., the difference between the absolute values |a[i]−b[j]| of differences adjacent to each other in the plurality of differences 671 obtained in step S204, is lower than the predetermined value V. - (3) In step S203 in
FIG. 6 according to one embodiment described above, the controller 3 makes a decision as to whether or not the brightness of the whole subject image, including the subject image 210 of the background including trees and the subject image 220 of the person, is lower than the predetermined brightness. As described above, in the case where the brightness of the whole subject image is lower than the predetermined brightness, the noise superimposed on the focus detection signals is also amplified. Therefore, if the plurality of differences were obtained in step S204 without the process step of step S203, absolute values |a[i]−b[j]| of differences equal to or higher than the average value would also be included in the section of the focus detection pixel positions 14 to 46 corresponding to the subject image 210 of the background including trees in FIG. 9. Thus, the controller 3 may cause the process to proceed to step S208 if it determines in step S204, without the process step of step S203, that absolute values |a[i]−b[j]| of differences equal to or higher than the average value exist over the whole section 300 in the horizontal direction of the focus detection area 200. - (4) In the above-described embodiment, in step S206 and step S207 in
FIG. 6, the controller 3 determines the nearer-side defocus amount of the two defocus amounts D1 and D2, calculated on the basis of the two phase difference amounts X1 and X2 each corresponding to a respective one of the two pairs of partial signal sequences, as the focus adjustment defocus amount. However, the controller 3 may instead determine, as the focus adjustment defocus amount, the one defocus amount calculated on the basis of the nearer-side phase difference amount of the two phase difference amounts X1 and X2, each corresponding to a respective one of the two pairs of partial signal sequences. - (5) In the above-described embodiment, in step S105 in
FIG. 4, the controller 3 determines whether or not the photographing optical system 4 is located at a focus position, depending on whether or not the focus adjustment defocus amount determined in step S104 is approximately zero. However, it is also possible to perform step S106 first and then determine whether or not the photographing optical system 4 is located at a focus position, depending on whether or not the lens drive amount of the photographing optical system 4 calculated in step S106 is approximately zero. - (6) In the above-described embodiment, the
controller 3 performs the focus adjustment control by driving the lens of the photographing optical system 4 on the basis of the focus adjustment defocus amount, in step S106 and step S107 in FIG. 4. However, the controller 3 may instead perform the focus adjustment control by driving the image sensor 2 on the basis of the focus adjustment defocus amount. - (7) The present invention may be applied to not only the
focus detection device 50 having the focus detection sensor 6 covered by the microlens array 9 illustrated in FIG. 1, but also to a focus detection device 50 having a focus detection sensor 16 that receives light fluxes that have passed through the half mirror 7 and have then been reflected from a sub-mirror 70 to pass through the image reforming optical system 17, or to a focus detection device 50 having an image sensor 2 including a focus detection sensor covered by a microlens array 19. FIG. 20 is a view illustrating a configuration of an image-capturing apparatus 100 including a focus detection device 50 having a focus detection sensor 16 in which a plurality of focus detection pixels are arranged, the pixels receiving light fluxes that have passed through a half mirror 7, have been reflected from a sub-mirror 70, and have passed through an image reforming optical system 17. FIG. 21 is a view illustrating a configuration of an image-capturing apparatus 100 including a focus detection device 50 having an image sensor 2 including a focus detection sensor covered by a microlens array 19. In other words, a plurality of focus detection pixels generating a plurality of focus detection signals and a plurality of image-capturing pixels generating a plurality of image-capturing signals are arranged in a mixed manner in the image sensor 2 covered by the microlens array 19. In FIGS. 20 and 21, parts denoted by the same reference numerals as in FIG. 1 are the same as those in the image-capturing apparatus 100 in FIG. 1, and therefore explanation thereof will be omitted. - In the image-capturing
apparatus 100 illustrated in FIG. 21, the magnitude of the ISO sensitivity in the image-capturing process performed by the image sensor 2 may be used as the brightness decision index in step S203 in FIG. 6. If the ISO sensitivity is lower than the predetermined value, the controller 3 decides that the brightness of the overall subject image is not lower than the predetermined brightness; in other words, the decision result in step S203 is No. - The embodiments and variations described above may be combined. The present invention is not limited to the embodiments and variations described above, and other forms conceivable within the technical idea of the present invention are also encompassed within its scope, as long as the features of the present invention are not impaired.
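The differential-value criterion of variation (2) above can likewise be sketched. This is a hypothetical helper; the threshold and the sample values are chosen only for illustration:

```python
import numpy as np

def low_variation_mask(diffs, v):
    """Given the plurality of absolute differences |a[i]-b[j]| and a
    predetermined value V close to 0, mark the adjacent-position pairs whose
    differential value (the difference between neighbouring absolute
    differences) is lower than V; True entries indicate a low-variation
    section such as section 320."""
    d = np.abs(np.diff(np.asarray(diffs, float)))
    return d < v
```

Contiguous True runs in the returned mask then delimit the partial signal sequences, in the same way as the average-value criterion of the embodiment.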
- The disclosure of the following priority application is herein incorporated by reference:
- Japanese Patent Application No. 2013-243944 (filed Nov. 26, 2013)
- 1 . . . liquid crystal display element, 2 . . . image sensor, 3 . . . controller, 4 . . . photographing optical system, 5 . . . lens drive motor, 6 . . . focus detection sensor, 7 . . . half mirror, 8 . . . focus adjustment device, 9 . . . microlens array, 10 . . . optical axis, 15 . . . storage device, 16 . . . focus detection sensor, 17 . . . image reforming optical system, 19 . . . microlens array, 50 . . . focus detection device, 70 . . . sub-mirror
Claims (8)
1. A focus detection device comprising:
an image sensor that includes a plurality of first light receiving units each receiving light which has passed through a first region of a photographing optical system forming a subject image and a plurality of second light receiving units each receiving light which has passed through a second region of the photographing optical system and that outputs a set of first signals generated at the plurality of first light receiving units and a set of second signals generated at the plurality of second light receiving units; and
a controller that divides each set of the set of first signals and the set of second signals into a plurality of sections and controls drive of the photographing optical system based upon a shift amount of the set of first signals and the set of second signals respectively corresponding to a section among the plurality of sections.
2. The focus detection device according to claim 1 , wherein
the controller divides the each set of the set of first signals and the set of second signals into the plurality of sections, calculates the shift amount of the set of first signals and the set of second signals respectively corresponding to each of the plurality of sections and controls the drive of the photographing optical system based upon the shift amount calculated.
3. The focus detection device according to claim 1 , wherein
the set of first signals is based upon the subject image formed by the light which has passed through the first region of the photographing optical system, and
the set of second signals is based upon the subject image formed by the light which has passed through the second region of the photographing optical system.
4. The focus detection device according to claim 1 , wherein
the image sensor includes a plurality of pixels each having a lens, and
the plurality of pixels respectively have the plurality of first light receiving units each arranged at a first position corresponding to the lens and respectively have the plurality of second light receiving units each arranged at a second position different from the first position corresponding to the lens.
5. The focus detection device according to claim 1 , wherein
the controller divides the each set of the set of first signals and the set of second signals based upon an absolute value of a difference between each signal of the set of first signals and each signal of the set of second signals.
6. The focus detection device according to claim 1 , wherein
the set of first signals is based upon the subject image formed by the light which has passed through the first region,
the set of second signals is based upon the subject image formed by the light which has passed through the second region, and
the controller divides the each set of the set of first signals and the set of second signals into a section in which an absolute value of a difference between each signal of the set of first signals and each signal of the set of second signals is equal to or higher than an average value of differences constituted with the difference and a section in which the absolute value is lower than the average value.
7. The focus detection device according to claim 1 , wherein
the controller controls the drive of the photographing optical system based upon the shift amount calculated in correspondence with an image of a subject located in closest distance from the photographing optical system upon calculating shift amounts constituted with the shift amount.
8. An image-capturing apparatus comprising:
the focus detection device according to claim 1 .
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/355,000 US20190215441A1 (en) | 2013-11-26 | 2019-03-15 | Focus detection device and image-capturing apparatus |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-243944 | 2013-11-26 | ||
JP2013243944A JP2015102735A (en) | 2013-11-26 | 2013-11-26 | Focus detection device and imaging device |
PCT/JP2014/081276 WO2015080164A1 (en) | 2013-11-26 | 2014-11-26 | Focal point detection device, and imaging device |
US201615039175A | 2016-12-19 | 2016-12-19 | |
US16/151,985 US10291840B2 (en) | 2013-11-26 | 2018-10-04 | Focus detection device and image-capturing apparatus |
US16/355,000 US20190215441A1 (en) | 2013-11-26 | 2019-03-15 | Focus detection device and image-capturing apparatus |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/151,985 Continuation US10291840B2 (en) | 2013-11-26 | 2018-10-04 | Focus detection device and image-capturing apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190215441A1 true US20190215441A1 (en) | 2019-07-11 |
Family
ID=53199103
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/039,175 Active US10122910B2 (en) | 2013-11-26 | 2014-11-26 | Focus detection device and image-capturing apparatus |
US16/151,985 Active US10291840B2 (en) | 2013-11-26 | 2018-10-04 | Focus detection device and image-capturing apparatus |
US16/355,000 Abandoned US20190215441A1 (en) | 2013-11-26 | 2019-03-15 | Focus detection device and image-capturing apparatus |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/039,175 Active US10122910B2 (en) | 2013-11-26 | 2014-11-26 | Focus detection device and image-capturing apparatus |
US16/151,985 Active US10291840B2 (en) | 2013-11-26 | 2018-10-04 | Focus detection device and image-capturing apparatus |
Country Status (4)
Country | Link |
---|---|
US (3) | US10122910B2 (en) |
JP (1) | JP2015102735A (en) |
CN (2) | CN106415349B (en) |
WO (1) | WO2015080164A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3098638B1 (en) * | 2015-05-29 | 2022-05-11 | Phase One A/S | Adaptive autofocusing system |
CN108139563B (en) * | 2015-09-30 | 2020-09-01 | 富士胶片株式会社 | Focus control device, focus control method, focus control program, lens device, and imaging device |
JP6685712B2 (en) | 2015-12-14 | 2020-04-22 | キヤノン株式会社 | Image processing device, distance detection device, focus adjustment device, imaging device, image processing method and program |
JP6700973B2 (en) * | 2016-05-24 | 2020-05-27 | キヤノン株式会社 | Imaging device and control method thereof |
CN106791373B (en) * | 2016-11-29 | 2020-03-13 | Oppo广东移动通信有限公司 | Focusing processing method and device and terminal equipment |
CN110632734B (en) * | 2017-05-24 | 2021-09-14 | Oppo广东移动通信有限公司 | Focusing method and related product |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100245656A1 (en) * | 2009-03-31 | 2010-09-30 | Sony Corporation | Imaging device and focus detecting method |
US20100302433A1 (en) * | 2008-02-13 | 2010-12-02 | Canon Kabushiki Kaisha | Image forming apparatus |
US8730545B2 (en) * | 2011-03-24 | 2014-05-20 | Fujifilm Corporation | Color imaging element, imaging device, and storage medium storing a control program for imaging device |
US8804016B2 (en) * | 2011-03-24 | 2014-08-12 | Fujifilm Corporation | Color imaging element, imaging device, and storage medium storing an imaging program |
US9106824B2 (en) * | 2011-03-31 | 2015-08-11 | Fujifilm Corporation | Imaging apparatus and driving method selecting one of a phase difference AF mode and a contrast AF mode |
US9338341B2 (en) * | 2008-07-10 | 2016-05-10 | Canon Kabushiki Kaisha | Image pickup apparatus capable of reading proper functional pixels when image signals are mixed or skipped and are read out, and method of controlling the same |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS59123808A (en) | 1982-12-29 | 1984-07-17 | Nippon Seimitsu Kogyo Kk | Automatic focus control device |
JPS6118912A (en) * | 1984-07-06 | 1986-01-27 | Canon Inc | Focus detecting device |
JPS6122316A (en) * | 1984-07-11 | 1986-01-30 | Canon Inc | Focus detecting device |
JPS6155620A (en) | 1984-08-27 | 1986-03-20 | Canon Inc | Automatic focusing device |
JP2985087B2 (en) | 1988-11-16 | 1999-11-29 | 株式会社ニコン | Focus detection device |
JP3248257B2 (en) | 1992-09-02 | 2002-01-21 | 株式会社ニコン | Focus detection device |
JP3491343B2 (en) * | 1994-06-29 | 2004-01-26 | 株式会社ニコン | Focus detection device and focus detection method |
JPH1026526A (en) | 1996-07-10 | 1998-01-27 | Fuji Photo Film Co Ltd | Triangulation type range finding method |
JP3806473B2 (en) * | 1996-11-29 | 2006-08-09 | オリンパス株式会社 | Ranging device |
JP2002006208A (en) * | 2000-06-23 | 2002-01-09 | Asahi Optical Co Ltd | Digital still camera with automatic focus detecting mechanism |
JP4838175B2 (en) * | 2007-03-01 | 2011-12-14 | オリンパスイメージング株式会社 | Focus detection device |
JP4973273B2 (en) * | 2007-03-28 | 2012-07-11 | 株式会社ニコン | Digital camera |
JP5191168B2 (en) * | 2007-06-11 | 2013-04-24 | 株式会社ニコン | Focus detection apparatus and imaging apparatus |
JP5029274B2 (en) * | 2007-10-10 | 2012-09-19 | 株式会社ニコン | Imaging device |
JP2009109839A (en) * | 2007-10-31 | 2009-05-21 | Nikon Corp | Image tracking device and imaging device |
JP5317562B2 (en) * | 2008-07-17 | 2013-10-16 | キヤノン株式会社 | Phase difference detection device, imaging device, phase difference detection method, phase difference detection program |
JP2010091943A (en) * | 2008-10-10 | 2010-04-22 | Canon Inc | Imaging apparatus |
JP5388544B2 (en) * | 2008-11-05 | 2014-01-15 | キヤノン株式会社 | Imaging apparatus and focus control method thereof |
JP2011029905A (en) * | 2009-07-24 | 2011-02-10 | Fujifilm Corp | Imaging device, method and program |
JP2011221253A (en) * | 2010-04-08 | 2011-11-04 | Sony Corp | Imaging apparatus, solid-state image sensor, imaging method and program |
JP2012003087A (en) * | 2010-06-17 | 2012-01-05 | Olympus Corp | Imaging apparatus |
JP5513326B2 (en) * | 2010-09-07 | 2014-06-04 | キヤノン株式会社 | Imaging device and imaging apparatus |
US9065999B2 (en) * | 2011-03-24 | 2015-06-23 | Hiok Nam Tay | Method and apparatus for evaluating sharpness of image |
JP5956782B2 (en) * | 2011-05-26 | 2016-07-27 | キヤノン株式会社 | Imaging device and imaging apparatus |
JP5845023B2 (en) * | 2011-08-08 | 2016-01-20 | キヤノン株式会社 | FOCUS DETECTION DEVICE, LENS DEVICE HAVING THE SAME, AND IMAGING DEVICE |
JP5907595B2 (en) * | 2011-09-27 | 2016-04-26 | キヤノン株式会社 | Imaging device |
US9124875B2 (en) * | 2012-05-23 | 2015-09-01 | Fujifilm Corporation | Stereoscopic imaging apparatus |
JP6014452B2 (en) * | 2012-10-16 | 2016-10-25 | キヤノン株式会社 | FOCUS DETECTION DEVICE, LENS DEVICE HAVING THE SAME, AND IMAGING DEVICE |
JP6288909B2 (en) * | 2012-10-19 | 2018-03-07 | キヤノン株式会社 | Imaging device and imaging apparatus |
JP6033038B2 (en) * | 2012-10-26 | 2016-11-30 | キヤノン株式会社 | FOCUS DETECTION DEVICE, IMAGING DEVICE, IMAGING SYSTEM, AND FOCUS DETECTION METHOD |
US9456141B2 (en) * | 2013-02-22 | 2016-09-27 | Lytro, Inc. | Light-field based autofocus |
-
2013
- 2013-11-26 JP JP2013243944A patent/JP2015102735A/en active Pending
-
2014
- 2014-11-26 CN CN201480074000.4A patent/CN106415349B/en active Active
- 2014-11-26 US US15/039,175 patent/US10122910B2/en active Active
- 2014-11-26 WO PCT/JP2014/081276 patent/WO2015080164A1/en active Application Filing
- 2014-11-26 CN CN201910149826.7A patent/CN109639953B/en active Active
-
2018
- 2018-10-04 US US16/151,985 patent/US10291840B2/en active Active
-
2019
- 2019-03-15 US US16/355,000 patent/US20190215441A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
CN106415349B (en) | 2019-03-26 |
US20190037130A1 (en) | 2019-01-31 |
CN109639953A (en) | 2019-04-16 |
US10291840B2 (en) | 2019-05-14 |
CN109639953B (en) | 2021-02-26 |
CN106415349A (en) | 2017-02-15 |
JP2015102735A (en) | 2015-06-04 |
US10122910B2 (en) | 2018-11-06 |
US20170118396A1 (en) | 2017-04-27 |
WO2015080164A1 (en) | 2015-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10291840B2 (en) | Focus detection device and image-capturing apparatus | |
JP4961993B2 (en) | Imaging device, focus detection device, and imaging device | |
JP6029309B2 (en) | Focus detection device | |
US8736742B2 (en) | Image pickup apparatus that performs automatic focus control and control method for the image pickup apparatus | |
US9344617B2 (en) | Image capture apparatus and method of controlling that performs focus detection | |
JP4823167B2 (en) | Imaging device | |
JP5845023B2 (en) | FOCUS DETECTION DEVICE, LENS DEVICE HAVING THE SAME, AND IMAGING DEVICE | |
US20180084192A1 (en) | Image capture apparatus, control method for image capture apparatus and recording medium | |
US9967451B2 (en) | Imaging apparatus and imaging method that determine whether an object exists in a refocusable range on the basis of distance information and pupil division of photoelectric converters | |
JP5417827B2 (en) | Focus detection apparatus and imaging apparatus | |
US10404904B2 (en) | Focus detection device, focus adjustment device, and camera | |
JP7292123B2 (en) | IMAGING DEVICE AND CONTROL METHOD THEREOF, PROGRAM, STORAGE MEDIUM | |
JP2016224372A (en) | Focus detection device, imaging device and electronic device | |
JP6005246B2 (en) | Imaging apparatus, histogram display method, program, and image processing apparatus | |
JP2016071275A (en) | Image-capturing device and focus control program | |
JP6257201B2 (en) | FOCUS DETECTION DEVICE, ITS CONTROL METHOD, CONTROL PROGRAM, AND IMAGING DEVICE | |
JP6916419B2 (en) | Focus adjuster | |
JP2017032874A (en) | Focus detection device and method, and imaging apparatus | |
JP2014206601A (en) | Imaging apparatus and focus adjustment method | |
JP2016114721A (en) | Imaging apparatus and method of controlling the same | |
JP2021184102A (en) | Imaging apparatus | |
KR20160038842A (en) | Imaging apparatus and imaging method | |
JP5836792B2 (en) | Imaging device, histogram display method, and program | |
JP2018116185A (en) | Focus detection device, control method thereof, control program, and imaging apparatus | |
JP2018031931A (en) | Focus detection device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |