JP2010139563A - Focus detector, focusing device, and imaging apparatus - Google Patents

Focus detector, focusing device, and imaging apparatus

Info

Publication number
JP2010139563A
Authority
JP
Japan
Prior art keywords
focus detection
focus
optical system
image
detection
Prior art date
Legal status
Granted
Application number
JP2008313479A
Other languages
Japanese (ja)
Other versions
JP2010139563A5 (en)
JP5417827B2 (en)
Inventor
Toshiaki Maeda
敏彰 前田
Original Assignee
Nikon Corp
株式会社ニコン
Priority date
Filing date
Publication date
Application filed by Nikon Corp (株式会社ニコン)
Priority to JP2008313479A
Publication of JP2010139563A
Publication of JP2010139563A5
Application granted
Publication of JP5417827B2
Application status: Active
Anticipated expiration

Abstract

To prevent focus adjustment from being performed on the basis of a focus evaluation value influenced by a subject, included in the focus detection area, that the photographer did not intend, and thereby to perform focus adjustment accurately.
A focus detection apparatus includes: first focus detection means 9 for detecting first focus detection information indicating a focus adjustment state of an optical system 1 with respect to a first detection region set in an image plane formed by the optical system 1; second focus detection means 6 for detecting, by a method different from that of the first focus detection means 9, second focus detection information indicating the focus adjustment state of the optical system 1 with respect to a second detection region set within a range including the first detection region in the image plane; and determination means 9 for determining, on the basis of the first focus detection information and the second focus detection information, whether or not to adjust the focus of the optical system 1 in accordance with the first focus detection information.
[Selection] Figure 1

Description

  The present invention relates to a focus detection device, a focus adjustment device, and an imaging device.

  In a contrast-detection type automatic focus adjustment apparatus (hereinafter simply referred to as contrast AF), a focus evaluation value is calculated while the focusing lens is moved little by little at a predetermined interval (hereinafter referred to as an AF (autofocus) search), and focus adjustment is performed with the position at which the focus evaluation value is maximized taken as the in-focus position. A digital camera is known that detects a human face from a captured image and obtains the focus evaluation value for the detected face area (see, for example, Patent Document 1).

Prior art documents related to the invention of this application include the following.
Patent Document 1: JP 2006-208443 A

  In general, a human face has low contrast, whereas a landscape contains many high-frequency components and therefore has high contrast. Since contrast AF depends greatly on the contrast level within the focus detection area, when taking a commemorative photograph at a scenic spot or the like, if a high-frequency part of the background is included in the focus detection area, the background may be brought into focus even though the person's face is also captured in the focus detection area.

A focus detection apparatus according to the present invention includes: first focus detection means for detecting first focus detection information indicating a focus adjustment state of an optical system with respect to a first detection region set in an image plane formed by the optical system; second focus detection means for detecting, by a method different from that of the first focus detection means, second focus detection information indicating the focus adjustment state of the optical system with respect to a second detection region set within a range including the first detection region in the image plane; and determination means for determining, on the basis of the first focus detection information and the second focus detection information, whether or not to adjust the focus of the optical system in accordance with the first focus detection information.
A focus adjustment apparatus according to the present invention includes the focus detection apparatus described above and a focus adjustment unit that performs focus adjustment of an optical system based on at least one of first focus detection information and second focus detection information.
An imaging apparatus according to the present invention includes the above-described focus adjustment apparatus and captures an image formed by the optical system.

  According to the present invention, focus adjustment can be prevented from being performed on the basis of a focus evaluation value influenced by a subject, included in the focus detection area, that the photographer did not intend, and focus adjustment can be performed accurately.

  FIG. 1 is a diagram illustrating the configuration of a digital camera according to an embodiment. The photographing lens 1 includes a focusing lens and forms a subject image on a predetermined image plane. The focusing lens of the photographing lens 1 is driven by a focus motor 10. A stepping motor is typically used as the focus motor 10, and it is pulse-driven under open-loop control by the CPU 9.

  The image sensor 2 is provided at the position of the image plane of the photographing lens 1. In the image sensor 2, imaging pixels are arranged two-dimensionally, and focus detection pixels are incorporated in the portions corresponding to focus detection areas for phase difference AF that are set in advance in the image plane of the photographing lens 1. Each imaging pixel receives light through the photographing lens 1 and outputs an image signal corresponding to the light intensity distribution. The focus detection pixels, on the other hand, form pairs with adjacent pixels, as described later; each pair of focus detection pixels receives a pair of light fluxes through the photographing lens 1 and outputs a pair of focus detection signals corresponding to their light intensity distributions.

  The analog signal processing unit 3 includes a CDS circuit, an AGC circuit, a color separation circuit, and the like (not shown), and processes both the image signals output from the imaging pixels of the image sensor 2 and the focus detection signals output from the focus detection pixels of the image sensor 2. The CDS circuit performs correlated double sampling (CDS) on the signals, and the AGC circuit adjusts the signal level. The A/D converter 4 converts the processed image signals and focus detection signals output from the analog signal processing unit 3 into digital signals. The digital signal processing unit 5 includes signal processing circuits such as a γ correction circuit and a luminance and color-difference signal generation circuit, and performs various kinds of processing on the image signals.

  The phase difference AF unit 6 performs AF calculation (phase difference AF calculation) by a pupil division type phase difference detection method based on the focus detection signal converted into a digital signal by the A / D converter 4. The display processing unit 7 performs processing for displaying various images on the LCD 8 based on image data and the like held in the buffer memory 12. The LCD 8 is a liquid crystal display, and displays various images according to the control of the display processing unit 7.

  The CPU 9 has an AE calculation function, an AF calculation function, an AWB calculation function, and the like, with which it performs sequence control of the entire camera, exposure calculation control, focus detection and focus adjustment control, white balance control, and so on. In the focus detection control performed by the CPU 9, a contrast-method AF calculation (contrast AF calculation) is performed based on the image data converted into digital signals by the A/D converter 4. Contrast AF exploits the correlation between the degree of blur of an image and its contrast: the contrast of the image is maximized when the image is in focus. The level of contrast can be evaluated from the magnitude of a focus evaluation value obtained from an evaluation function that extracts high-frequency components of the image signal.
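  For illustration only, the following sketch shows one way such an evaluation function could be computed from image data by summing high-frequency (second-difference) responses inside the focus detection area; the kernel, the region convention, and the function name are assumptions made for this sketch and are not part of the described embodiment.

```python
import numpy as np

def focus_evaluation_value(gray, region):
    """Sum of absolute high-frequency response inside a focus detection area.

    gray   : 2-D array of luminance values
    region : (top, left, height, width) of the contrast AF focus detection area
    The second-difference high-pass response used here is an illustrative
    choice; the embodiment only states that the evaluation function extracts
    high-frequency components of the image signal.
    """
    top, left, height, width = region
    roi = gray[top:top + height, left:left + width].astype(np.float64)
    return (np.abs(np.diff(roi, n=2, axis=0)).sum()
            + np.abs(np.diff(roi, n=2, axis=1)).sum())
```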

  To search for the peak of the focus evaluation value, the focusing lens is usually shifted by a predetermined amount toward the infinity side or the close side, and the focus evaluation value calculated after the shift is compared with the focus evaluation value before the movement. When the focus evaluation value after the movement is larger, the degree of focus is considered to be increasing, so the focusing lens is moved further in the same direction and the same calculation and comparison are repeated. Conversely, when the focus evaluation value after the movement is smaller, the degree of focus is considered to be decreasing, so the focusing lens is moved in the reverse direction and the same calculation and comparison are performed. The position at which the focus evaluation value is maximized, that is, the in-focus position, is found by repeating this process, so-called "mountain climbing AF".
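  A minimal sketch of this mountain-climbing search is shown below; `move_lens` and `measure_value` are hypothetical camera-control callbacks, and the reversal-counting stop condition is a simplification of the behavior described above.

```python
def mountain_climbing_af(move_lens, measure_value, step, max_steps=100):
    """Hill-climbing search for the lens position maximizing the focus
    evaluation value (simplified sketch; real firmware would add noise
    margins and end-of-range handling).

    move_lens(delta)  : hypothetical callback shifting the focusing lens
    measure_value()   : hypothetical callback returning the current value
    """
    direction = +1                 # start toward, e.g., the close side
    reversals = 0
    prev = measure_value()
    for _ in range(max_steps):
        move_lens(direction * step)
        current = measure_value()
        if current < prev:         # value fell: we are past the peak
            direction = -direction
            reversals += 1
            if reversals >= 2:     # peak bracketed from both sides
                move_lens(direction * step)   # step back onto the peak
                break
        prev = current
    return prev
```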

  Apart from the AF search described above, a so-called "entire search" may also be performed, in which the focus evaluation value is acquired at predetermined intervals while the focusing lens is moved from a predetermined position, for example from the infinity end to the closest end, and the maximum focus evaluation value is searched for over the entire range.
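  The entire search reduces to sampling every position and taking the maximum, as in this short sketch with hypothetical `move_lens_to` and `measure_value` callbacks.

```python
def entire_search_af(move_lens_to, measure_value, positions):
    """Sample the focus evaluation value at each lens position, from the
    infinity end to the closest end, and settle on the position giving the
    maximum value.  Both callbacks are hypothetical camera-control hooks."""
    values = []
    for position in positions:
        move_lens_to(position)
        values.append(measure_value())
    best = positions[max(range(len(values)), key=lambda i: values[i])]
    move_lens_to(best)
    return best
```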

  The buffer memory 12 is a frame memory that can store data for a plurality of frames captured by the image sensor 2, and it stores image data that has undergone the series of processing by the digital signal processing unit 5. The face recognition calculation unit 11 performs face recognition processing on the display image held in the buffer memory 12 and, when the presence of a face is detected, outputs to the CPU 9 coordinates indicating the position and size of the recognized face detection area.

  The recording/playback signal processing unit 13 controls the recording of the image data processed by the digital signal processing unit 5 to the external storage device 14 and the reading of image data recorded in the external storage device 14. The external storage device 14 is a nonvolatile recording medium that can be attached to and detached from the camera, and records image data under the control of the recording/playback signal processing unit 13.

  FIG. 2 is a diagram showing an example of a focus detection area for phase difference AF set in the image plane of the photographing lens 1. This focus detection area is an example of an area (focus detection area, focus detection position) at which an image is sampled on the screen when focus detection is performed using the focus detection pixel array described later. In this embodiment, a focus detection area 101 for phase difference AF is arranged at the center of the rectangular image plane 100, and a plurality of focus detection pixels are arranged linearly in the longitudinal direction of the focus detection area 101 indicated by a rectangle. Although only one focus detection area 101 for phase difference AF is shown in the image plane 100 here, in reality a plurality of focus detection areas are set at different positions in the image plane 100.

  FIG. 3 is a front view showing a detailed configuration of the image sensor 2, and shows an enlarged view of the vicinity of the focus detection area 101 on the image sensor 2. The image sensor 2 includes an imaging pixel 310 and a pair of focus detection pixels 313 and 314. The imaging pixels 310 are arranged in a two-dimensional square lattice in the horizontal and vertical directions, while the focus detection pixels 313 and 314 are arranged in the horizontal direction.

  As shown in FIG. 4, the imaging pixel 310 includes a microlens 20 and a photoelectric conversion unit 21. The focus detection pixel 313 includes a microlens 20 and a photoelectric conversion unit 23 as shown in FIG. The shape of the photoelectric conversion unit 23 is a left semicircle in contact with the vertical bisector of the microlens 20. Further, the focus detection pixel 314 includes a microlens 20 and a photoelectric conversion unit 24 as shown in FIG. The shape of the photoelectric conversion unit 24 is a right semicircle in contact with the vertical bisector of the microlens 20.

  When viewed superimposed on the microlens 20, the photoelectric conversion units 23 and 24 are arranged side by side in the horizontal direction and have shapes that are symmetrical with respect to the vertical bisector of the microlens 20. The focus detection pixels 313 and the focus detection pixels 314 are arranged alternately in the horizontal direction (the arrangement direction of the photoelectric conversion units 23 and 24). The spectral sensitivity characteristics of the imaging pixel 310 and the focus detection pixels 313 and 314 are as shown in FIG. 6.

  FIG. 7 is a cross-sectional view of the imaging pixel 310. In the imaging pixel 310, the microlens 20 is disposed in front of the photoelectric conversion unit 21 for imaging, and the photoelectric conversion unit 21 is projected forward by the microlens 20. The photoelectric conversion unit 21 is formed on the semiconductor circuit substrate 29.

  FIG. 8A is a cross-sectional view of the focus detection pixel 313. In the focus detection pixel 313, the microlens 20 is disposed in front of the photoelectric conversion unit 23, and the photoelectric conversion unit 23 is projected forward by the microlens 20. The photoelectric conversion unit 23 is formed on the semiconductor circuit substrate 29, and the microlens 20 is integrally and fixedly formed thereon by the manufacturing process of the semiconductor image sensor. The photoelectric conversion unit 23 is disposed on one side of the optical axis of the microlens 20.

  FIG. 8B is a cross-sectional view of the focus detection pixel 314. In the focus detection pixel 314, the micro lens 20 is disposed in front of the photoelectric conversion unit 24, and the photoelectric conversion unit 24 is projected forward by the micro lens 20. The photoelectric conversion unit 24 is formed on the semiconductor circuit substrate 29, and the microlens 20 is integrally and fixedly formed thereon by a manufacturing process of the semiconductor image sensor. The photoelectric conversion unit 24 is disposed on one side of the optical axis of the microlens 20 and on the opposite side of the photoelectric conversion unit 23.

  FIG. 9 shows the configuration of a focus detection optical system of the pupil-division phase difference detection method using microlenses. In FIG. 9, reference numeral 90 denotes an exit pupil set at a distance d in front of the microlenses arranged on the planned imaging plane of the photographing lens 1 (see FIG. 1). The distance d is determined by the curvature and refractive index of the microlens and the distance between the microlens and the photoelectric conversion unit, and is referred to in this specification as the distance measuring pupil distance. Reference numeral 91 denotes the optical axis of the interchangeable lens, 20a to 20d are microlenses, 23a, 23b, 24a, and 24b are photoelectric conversion units, 313a, 313b, 314a, and 314b are focus detection pixels, and 73, 74, 83, and 84 are focus detection light fluxes.

  Reference numeral 93 denotes the region onto which the photoelectric conversion units 23a and 23b are projected by the microlenses 20a and 20c; this region is referred to in this specification as a distance measuring pupil. In FIG. 9, the distance measuring pupil 93 is shown as an elliptical region for ease of understanding, but in reality it is an enlarged projection of the shape of the photoelectric conversion units. Similarly, 94 denotes the region onto which the photoelectric conversion units 24a and 24b are projected by the microlenses 20b and 20d, and is likewise referred to as a distance measuring pupil; it too is shown as an ellipse in FIG. 9 but is in reality an enlarged projection of the shape of the photoelectric conversion units.

  In FIG. 9, four adjacent focus detection pixels 313a, 313b, 314a, and 314b are schematically illustrated; in the other focus detection pixels as well, each photoelectric conversion unit receives the light flux that arrives at its microlens from the corresponding distance measuring pupil. The arrangement direction of the focus detection pixels is made to coincide with the arrangement direction of the pair of distance measuring pupils, that is, the arrangement direction of the pair of photoelectric conversion units.

  The microlenses 20a to 20d are disposed in the vicinity of the planned imaging plane of the photographing lens 1 (see FIG. 1). The microlenses 20a to 20d project the shapes of the photoelectric conversion units 23a, 23b, 24a, and 24b arranged behind them onto the exit pupil 90, which is separated from the microlenses by the distance measuring pupil distance d, and these projected shapes form the distance measuring pupils 93 and 94. That is, the projection direction of the photoelectric conversion unit in each focus detection pixel is determined so that the projected shapes (distance measuring pupils 93 and 94) of the photoelectric conversion units of the focus detection pixels coincide on the exit pupil 90 at the distance d.

  The light beam 73 passing through the distance measuring pupil 93 and traveling toward the microlens 20a forms an image on the microlens 20a. The photoelectric conversion unit 23a outputs a signal corresponding to the light intensity of the formed image. Similarly, the light beam 83 passing through the distance measuring pupil 93 and traveling toward the microlens 20c forms an image on the microlens 20c, and the photoelectric conversion unit 23b outputs a signal corresponding to the light intensity of the formed image.

  Further, the light flux 74 that passes through the distance measuring pupil 94 and travels toward the microlens 20b forms an image on the microlens 20b. The photoelectric conversion unit 24a outputs a signal corresponding to the light intensity of the formed image. Similarly, the light beam 84 that passes through the distance measuring pupil 94 and travels toward the micro lens 20d forms an image on the micro lens 20d, and the photoelectric conversion unit 24b outputs a signal corresponding to the light intensity of the formed image.

  A large number of the two types of focus detection pixels described above are arranged in a straight line, and the outputs of the photoelectric conversion units of the pixels are grouped into an output group corresponding to the distance measuring pupil 93 and an output group corresponding to the distance measuring pupil 94. In this way, information on the intensity distributions of the pair of images formed on the pixel array by the focus detection light fluxes that have passed through the distance measuring pupil 93 and the distance measuring pupil 94, respectively, is obtained. The phase difference AF unit 6 shown in FIG. 1 applies correlation calculation processing and phase difference detection processing to this information to detect the image shift amount between the pair of images by the so-called pupil-division phase difference detection method. Further, by applying to the image shift amount a conversion operation that depends on the center-of-gravity separation of the pair of distance measuring pupils, the deviation (defocus amount) of the current imaging plane relative to the planned imaging plane (the planned imaging plane at the position of the microlens array in the focus detection area) is calculated.
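  As a rough illustration of this computation, the sketch below finds the image shift between the two focus detection signal sequences with a sum-of-absolute-differences correlation and converts it to a defocus amount using a similar-triangles relation based on the center-of-gravity separation of the distance measuring pupils; the exact correlation and conversion used by the phase difference AF unit 6 are not specified here, so the formulas are textbook forms and not the embodiment's own.

```python
import numpy as np

def image_shift_by_correlation(signal_a, signal_b, max_shift):
    """Return the relative shift (in pixels) between the pair of focus
    detection signal sequences that minimizes the sum of absolute differences."""
    a = np.asarray(signal_a, dtype=np.float64)
    b = np.asarray(signal_b, dtype=np.float64)
    n = len(a)
    best_shift, best_score = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        lo, hi = max(0, shift), min(n, n + shift)
        if hi - lo < n // 2:              # require a reasonable overlap
            continue
        score = np.abs(a[lo:hi] - b[lo - shift:hi - shift]).sum()
        if score < best_score:
            best_shift, best_score = shift, score
    return best_shift

def defocus_from_shift(shift_pixels, pixel_pitch, pupil_cg_separation, pupil_distance):
    """Convert an image shift to a defocus amount via similar triangles using
    the center-of-gravity separation of the distance measuring pupils and the
    distance measuring pupil distance (an assumed textbook relation)."""
    return shift_pixels * pixel_pitch * pupil_distance / pupil_cg_separation
```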

  Based on the defocus amount obtained by the phase difference AF unit 6 as described above, the CPU 9 controls the focus motor 10 to drive the focusing lens of the photographic lens 1 to adjust the focus of the photographic lens 1. In this way, the phase difference AF is performed.

  On the other hand, in contrast AF, the CPU 9 sets a focus detection area for contrast AF in the image plane by the photographing lens 1 based on the face detection area specified by the coordinates output from the face recognition calculation unit 11. For example, for a face detection area of an image as shown in FIG. 10, an area that matches the face detection area is set as a focus detection area 30 for contrast AF. Note that the face detection area and the focus detection area 30 for contrast AF do not have to coincide completely. For example, only a part of the face detection area may be set in the focus detection area 30 for contrast AF. Alternatively, a range larger than the face detection area may be set in the focus detection area 30 for contrast AF.

  Based on the image signals output from the imaging pixels corresponding to the focus detection area 30 set as described above, the focusing lens of the photographing lens 1 is moved under the control of the CPU 9 as described above, and the in-focus position at which the focus evaluation value becomes maximum is found. When the in-focus position is obtained, the focusing lens of the photographing lens 1 is driven to that position to adjust the focus of the photographing lens 1. In this way, contrast AF is performed.

  Incidentally, when contrast AF is performed using the face detection area as the focus detection area 30, a characteristic curve of the focus evaluation value with respect to the focusing lens position such as that shown in FIG. 11 may be obtained. Because the background subject contains many high-frequency components, the characteristic curve of the focus evaluation value for the background alone is the curve indicated by the broken line in FIG. 11. On the other hand, because a human face has little contrast variation (little spatial-frequency variation), the characteristic curve of the focus evaluation value for the person alone is the curve indicated by the solid line in FIG. 11. When a person's face and the background overlap in the focus detection area 30 as shown in FIG. 10, that is, when there is a near-far conflict, a characteristic curve of the focus evaluation value in which the background curve (broken line) and the person curve (solid line) are combined may be obtained.

  When the peak search is performed on the basis of such a combined characteristic curve, the lens is driven with the lens position L4 corresponding to the background, rather than the position of the person who is the intended main subject (lens position L2 in FIG. 11), as the target drive position. In this case, even though the focus detection area has been set by the face detection function, the result is an out-of-focus photograph in which the person is not correctly in focus.

  In order to deal with the above problem, in the present embodiment the processing shown in the flowchart of FIG. 12 is performed so that either phase difference AF or contrast AF is appropriately selected according to the shooting situation and the focus of the photographing lens 1 is adjusted. The contents of this processing are described below with reference to the flowchart of FIG. 12.

  In step S10, face recognition processing is executed by the face recognition calculation unit 11. Information on the face detection area detected by this face recognition processing is output to the CPU 9 as described above. In step S20, the CPU 9 sets a focus detection area for contrast AF based on the face detection area obtained by the face recognition process in step S10. Thereby, for example, the focus detection area 30 in FIG. 10 that matches the face detection area is set.

  In step S30, the CPU 9 determines whether any of the focus detection areas for phase difference AF set in the image plane of the photographing lens 1 is included within the focus detection area for contrast AF set in step S20. If at least one focus detection area for phase difference AF exists within the focus detection area for contrast AF, the process proceeds to step S120. If no focus detection area for phase difference AF exists within the focus detection area for contrast AF, the process proceeds to step S40.

  For example, as shown in FIG. 13, it is assumed that a focus detection area for phase difference AF is set at positions indicated by alternate long and short dash lines 41, 42, and 43 in the image plane of the photographing lens 1, respectively. Further, it is assumed that the focus detection area 30 for contrast AF is set for the face detection area detected by the face recognition process. In such a case, since there is no focus detection area for phase difference AF in the focus detection area 30, the process proceeds to step S40.

  In step S40, the CPU 9 sets, in the image plane of the photographing lens 1, a search area for finding a focus detection area for phase difference AF to be used for defocus amount detection, based on the focus detection area for contrast AF set in step S20. Here, an area within a predetermined range located below the focus detection area for contrast AF is set as the search area. For example, as shown in FIG. 14, a search area 31 of the same size is set adjacent to the lower side of the focus detection area 30. In this way, even when no focus detection area for phase difference AF exists within the face detection area, the range for detecting the defocus amount can be extended to the person's neck and chest. The sizes of the focus detection area 30 and the search area 31 may differ. Further, when the orientation of the camera changes about the optical axis of the photographing lens 1, it is preferable to change the direction in which the search area is set relative to the focus detection area for contrast AF according to that change, so that the search area is always positioned below the focus detection area for contrast AF even if the camera orientation changes.
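  One possible way to place such a search area, keeping it on the lower side of the contrast AF focus detection area even as the camera rotates about the optical axis, is sketched below; the rectangle convention and the roll-angle thresholds are assumptions made for illustration.

```python
def set_search_area(contrast_af_area, camera_roll_deg):
    """Place a same-size search area adjacent to the 'below' side of the
    contrast AF focus detection area (step S40 sketch).  'Below' follows the
    camera roll about the optical axis; the (top, left, height, width)
    rectangle convention and the 45-degree thresholds are assumptions."""
    top, left, height, width = contrast_af_area
    roll = camera_roll_deg % 360
    if roll < 45 or roll >= 315:       # normal landscape: image bottom is down
        return (top + height, left, height, width)
    if 45 <= roll < 135:               # rotated 90 degrees: right edge is down
        return (top, left + width, height, width)
    if 135 <= roll < 225:              # upside down: image top is down
        return (top - height, left, height, width)
    return (top, left - width, height, width)   # rotated 270 degrees
```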

  In step S50, the CPU 9 determines whether a focus detection area for phase difference AF exists within the search area set in step S40. If at least one focus detection area for phase difference AF exists within the search area, the process proceeds to step S60; if none exists, the process proceeds to step S130. In the example of FIG. 14 described above, the search area 31 and the alternate long and short dash line 42 overlap, so it is determined that the focus detection area for phase difference AF indicated by the alternate long and short dash line 42 exists within the search area 31, and the process proceeds to step S60.

  In step S60, the phase difference AF unit 6 detects the defocus amount for the phase difference AF focus detection area determined to be in the search area in step S50. That is, based on a pair of focus detection signals output from the focus detection pixels arranged corresponding to the focus detection area, a defocus amount representing a shift amount of the pair of images is calculated.

  Note that the search area 31 in FIG. 14 is positioned, with the focus detection area 30 as a reference, in the direction in which the image of the person whose face was recognized in step S10 extends, and in step S60 the defocus amount is detected for the focus detection area for phase difference AF that exists within the search area 31. In other words, in step S60 the phase difference AF unit 6 detects the defocus amount for the focus detection area for phase difference AF that, among the plurality of focus detection areas for phase difference AF, is positioned in the direction in which the subject image formed by the photographing lens 1 extends, with the focus detection area 30 as a reference. That is, the defocus amount is detected for the focus detection area for phase difference AF located below the face area set as the focus detection area for contrast AF.

  In step S70, the CPU 9 calculates an in-focus distance (referred to as in-focus distance A) corresponding to the defocus amount detected in step S60. This in-focus distance A is determined corresponding to the in-focus position when the focusing lens is moved in accordance with the defocus amount, and corresponds to the shooting distance from the camera to the subject in focus.
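  As a hedged illustration of how such an in-focus distance might be derived, the sketch below applies the thin-lens relation 1/f = 1/u + 1/v; the sign convention for the defocus amount and the availability of the current image distance are assumptions, since the text only states that distance A corresponds to the detected defocus amount.

```python
def focusing_distance_from_defocus(current_image_distance_mm, defocus_mm,
                                   focal_length_mm):
    """Estimate the shooting distance u from the thin-lens equation
    1/f = 1/u + 1/v, taking the in-focus image distance as the current image
    distance corrected by the defocus amount (sign convention assumed)."""
    v = current_image_distance_mm + defocus_mm   # image distance at best focus
    f = focal_length_mm
    return f * v / (v - f)                       # subject (shooting) distance u
```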

  In step S80, the CPU 9 detects a focus evaluation value for the focus detection area for contrast AF set in step S20. Here, as described above, movement of the focusing lens and detection of the focus evaluation value based on the image signals output from the imaging pixels corresponding to the focus detection area for contrast AF are repeated to find the in-focus position at which the focus evaluation value becomes maximum.

  In step S90, the CPU 9 calculates an in-focus distance (referred to as in-focus distance B) corresponding to the maximum focus evaluation value detected in step S80. The in-focus distance B is determined by the position of the focusing lens when the maximum focus evaluation value is obtained, that is, the in-focus position, and, like the in-focus distance A obtained in step S70, corresponds to the shooting distance from the camera to the subject in focus.

  In step S100, the CPU 9 determines whether or not the difference (B−A) between the focusing distance B and the focusing distance A calculated in steps S90 and S70 is larger than a predetermined threshold th. When the value of B−A exceeds the threshold value th, that is, when the focusing distance B is larger than the focusing distance A and the difference exceeds the threshold value th, the process proceeds to step S110. On the other hand, if the value of B−A is equal to or smaller than the threshold th, the process proceeds to step S140. Note that the threshold th used for this determination is preferably determined based on the depth of field. Here, the depth of field can be obtained based on the actual focal length of the photographing lens 1 and the stop.
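  Since the text says only that th is preferably based on the depth of field obtained from the focal length and the aperture, the following sketch uses the common approximation DOF ~ 2*N*c*u^2 / f^2 (valid when the subject distance u is much larger than f); the circle-of-confusion constant and the choice to evaluate it at distance A are assumptions.

```python
def depth_of_field(focal_length_mm, f_number, subject_distance_mm,
                   circle_of_confusion_mm=0.02):
    """Approximate total depth of field in millimetres using the thin-lens
    approximation DOF ~ 2*N*c*u^2 / f^2, valid when the subject distance u is
    much larger than the focal length f.  The circle of confusion c is an
    assumed sensor-dependent constant."""
    f, N, u, c = focal_length_mm, f_number, subject_distance_mm, circle_of_confusion_mm
    return 2.0 * N * c * u * u / (f * f)

# One possible (assumed) choice of the threshold for step S100:
# threshold_th = depth_of_field(focal_length_mm, f_number, focusing_distance_A_mm)
```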

  In step S110, the CPU 9 performs phase difference AF based on the defocus amount detected in step S60. That is, the focus of the photographing lens 1 is adjusted by using the focus motor 10 to drive the focusing lens to the in-focus position corresponding to the defocus amount. If contrast AF were performed when the difference between the in-focus distance B and the in-focus distance A is large, there would be a risk, as described above, that focus adjustment is performed with the lens position corresponding to the background taken as the target drive position. In such a case, therefore, focus adjustment is performed accurately by executing step S110 and performing phase difference AF. After step S110 is executed, the process shown in the flowchart of FIG. 12 ends.

  On the other hand, if it is determined in step S100 that B−A is equal to or smaller than the threshold th, the CPU 9 performs contrast AF in step S140 based on the focus evaluation value detected in step S80. That is, the focus of the photographing lens 1 is adjusted by using the focus motor 10 to drive the focusing lens to the in-focus position at which the maximum focus evaluation value was obtained. After step S140 is executed, the process shown in the flowchart of FIG. 12 ends.

  By executing the processing described above, a focus evaluation value and a defocus amount are detected, and based on these values it can be determined in step S100 whether or not to adjust the focus of the photographing lens 1 according to the focus evaluation value. That is, when the difference between the in-focus distance B and the in-focus distance A exceeds the predetermined threshold th, it is determined that the focus of the photographing lens 1 is not adjusted according to the focus evaluation value, the process proceeds to step S110, and the focus is adjusted by phase difference AF. Conversely, when the difference between the in-focus distance B and the in-focus distance A is equal to or smaller than the threshold th, it is determined that the focus of the photographing lens 1 is adjusted according to the focus evaluation value, the process proceeds to step S140, and the focus is adjusted by contrast AF. In this way, the focus of the photographing lens 1 can be adjusted by appropriately selecting either phase difference AF or contrast AF according to the shooting situation.
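  The overall decision of step S100 can be summarized by the short sketch below, where the two perform_* callbacks stand in for the drive operations of steps S110 and S140 and are hypothetical names.

```python
def decide_af_method(focusing_distance_A, focusing_distance_B, threshold_th,
                     perform_phase_difference_af, perform_contrast_af):
    """Step S100 decision sketch: when the contrast AF distance B exceeds the
    phase difference distance A by more than th, the focus evaluation value is
    judged to be pulled toward the background and phase difference AF is used;
    otherwise contrast AF is used."""
    if (focusing_distance_B - focusing_distance_A) > threshold_th:
        perform_phase_difference_af()   # step S110
    else:
        perform_contrast_af()           # step S140
```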

  If it is determined in step S30 that a focus detection area for phase difference AF exists within the focus detection area for contrast AF, the phase difference AF unit 6 detects, in step S120, the defocus amount for that focus detection area for phase difference AF in the same manner as in step S60. When a plurality of focus detection areas for phase difference AF exist within the focus detection area for contrast AF, any one of them may be selected for defocus detection, or the defocus amount may be detected for all of the corresponding focus detection areas. In the latter case, one of the detected defocus amounts, for example the defocus amount indicating the closest distance, may be selected as the defocus amount for focus adjustment, or the average of the detected defocus amounts may be used. After step S120, the process proceeds to step S110, and the CPU 9 performs phase difference AF based on the defocus amount detected in step S120.
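  A small sketch of this selection is given below; the sign convention for "closest" (a larger defocus value meaning a nearer subject) is an assumption.

```python
def combine_defocus_amounts(defocus_amounts, mode="closest"):
    """Combine defocus amounts from several phase difference AF areas found
    inside the contrast AF area (step S120).  'closest' keeps the amount
    indicating the nearest subject (sign convention assumed); 'average'
    returns the mean of all detected amounts."""
    if mode == "closest":
        return max(defocus_amounts)
    return sum(defocus_amounts) / len(defocus_amounts)
```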

  As described above, when the focus detection area for phase difference AF is within the focus detection area for contrast AF, focus adjustment by phase difference AF is performed without performing the determination in step S100. Thereby, when there is a phase difference AF focus detection area corresponding to the face detection area, the focus adjustment of the photographing lens 1 can be appropriately performed by performing the phase difference AF with priority. At this time, if an appropriate defocus amount cannot be obtained in step S120, contrast AF may be used instead of phase difference AF.

  As described above, in step S120 the phase difference AF unit 6 detects the defocus amount for the focus detection area for phase difference AF existing within the focus detection area 30 for contrast AF, while in step S60 it detects the defocus amount for the focus detection area for phase difference AF existing within the search area 31 set around the focus detection area 30. In summary, the phase difference AF unit 6 detects the defocus amount for a focus detection area for phase difference AF set within a range including the focus detection area 30 in the image plane of the photographing lens 1, that is, within the focus detection area 30 or in its surroundings. Instead of setting a search area, a focus detection area for phase difference AF set in the vicinity of the focus detection area 30 may simply be selected.

  If it is determined in step S50 that no focus detection area for phase difference AF exists within the search area, the CPU 9 detects, in step S130, a focus evaluation value for the focus detection area for contrast AF in the same manner as in step S80. After step S130, the process proceeds to step S140, and the CPU 9 performs contrast AF based on the focus evaluation value detected in step S130.

  After performing phase difference AF in step S110, contrast AF may be further performed. Such a focus adjustment method is called hybrid AF. The manner in which the position of the focusing lens changes with time in this hybrid AF will be described below with reference to the example shown in FIG.

  First, at time T0, it is assumed that the focusing lens is at the position L0 and the defocus amount with the position L2 as the in-focus position is detected. In this case, during the period from time T0 to T1, the focusing lens is moved to a position L1 before the position L2 by a predetermined amount. Thereafter, contrast AF is performed between times T1 and T2, and focus evaluation values Va, Vb, and Vc are obtained at positions L1, L2, and L3, respectively. The positions L1, L2, and L3 are all set at equal intervals. The maximum one of the focus evaluation values Va, Vb, and Vc acquired in this way is selected, and the focusing lens is moved to a position corresponding to the maximum focus evaluation value. In this way, focus adjustment is performed.
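  The hybrid AF timeline of FIG. 15 can be sketched as follows; `move_lens_to` and `measure_value` are hypothetical callbacks, and the back-off amount and number of samples are illustrative parameters rather than values taken from the embodiment.

```python
def hybrid_af(predicted_focus_position, move_lens_to, measure_value,
              back_off, scan_step, n_samples=3):
    """Hybrid AF sketch: drive by phase difference AF to a point short of the
    predicted in-focus position (L1 in FIG. 15), then sample equally spaced
    positions with contrast AF and settle on the one with the largest focus
    evaluation value."""
    start = predicted_focus_position - back_off       # position L1
    samples = []
    for i in range(n_samples):
        position = start + i * scan_step              # L1, L2, L3, ...
        move_lens_to(position)
        samples.append((measure_value(), position))
    best_value, best_position = max(samples)          # largest evaluation value
    move_lens_to(best_position)
    return best_position
```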

  With the hybrid AF described above, it is possible to perform focus adjustment utilizing the advantages of the phase difference AF and the contrast AF. That is, the focus adjustment can be quickly performed using the phase difference AF, and the focus adjustment can be performed more accurately using the contrast AF. It should be noted that either normal phase difference AF or hybrid AF may be selectively used according to the shooting situation.

  According to the embodiment described above, the following operational effects are obtained.

(1) The CPU 9 detects a focus evaluation value indicating the focus adjustment state of the photographing lens 1 for the focus detection area for contrast AF set in the image plane of the photographing lens 1 (step S80). The phase difference AF unit 6 detects, by a method different from that of the CPU 9, a defocus amount indicating the focus adjustment state of the photographing lens 1 for a focus detection area for phase difference AF set within a range including the focus detection area for contrast AF in that image plane (step S60). Based on the focus evaluation value and the defocus amount detected in this way, the CPU 9 determines whether or not to adjust the focus of the photographing lens 1 by contrast AF according to the focus evaluation value (step S100). This prevents focus adjustment from being performed on the basis of a focus evaluation value influenced by a subject, included in the focus detection area, that the photographer did not intend, and allows focus adjustment to be performed accurately.

(2) In step S100, when the difference between the in-focus distance B corresponding to the maximum of the focus evaluation values detected in step S80 and the in-focus distance A corresponding to the defocus amount detected in step S60 exceeds the predetermined threshold th, the CPU 9 does not perform contrast AF and proceeds to step S110; that is, it determines not to adjust the focus of the photographing lens 1 according to the focus evaluation value. This avoids focus adjustment that would take the lens position corresponding to the background as the target drive position.

(3) The CPU 9 determines whether a focus detection area for phase difference AF exists within the focus detection area for contrast AF (step S30), and if one exists, performs focus adjustment by phase difference AF through the processing of steps S120 and S110 without performing the determination of step S100. Therefore, when a focus detection area for phase difference AF corresponding to the face detection area exists, the focus of the photographing lens 1 can be adjusted appropriately by giving priority to phase difference AF.

(4) In step S60, the phase difference AF unit 6 detects the defocus amount for the focus detection area for phase difference AF that, among the plurality of focus detection areas for phase difference AF set in advance, is located in the direction in which the subject image extends, taking the focus detection area for contrast AF set in step S20 as a reference. In other words, the CPU 9 detects the area corresponding to the face in the image formed by the photographing lens 1 (step S10) and sets that area as the focus detection area for contrast AF (step S20), and in step S60 the phase difference AF unit 6 detects the defocus amount for the focus detection area for phase difference AF located below the area corresponding to the face. Therefore, even when no focus detection area for phase difference AF corresponds to the face detection area, an appropriate focus detection area can be selected and the defocus amount detected.

(5) The CPU 9 sets the search range of the focus detection area for phase difference AF in the image by the photographing lens 1 based on the focus detection area for contrast AF set in step S20 (step S40). In step S60, the phase difference AF unit 6 detects the defocus amount with respect to the focus detection area for phase difference AF located within the search range. Therefore, similarly to the above, even when there is no focus detection area for phase difference AF corresponding to the face detection area, an appropriate focus detection area can be selected to detect the defocus amount.

(6) The image sensor 2 is provided at the position of the image plane of the photographing lens 1 and has a plurality of pixels including imaging pixels, which receive light through the photographing lens 1 and output image signals, and focus detection pixels, which receive a pair of light fluxes through the photographing lens 1 and output a pair of focus detection signals. In step S80, the CPU 9 detects the focus evaluation value based on the image signals output from the imaging pixels corresponding to the focus detection area for contrast AF, and in step S60 the phase difference AF unit 6 detects the defocus amount based on the pair of focus detection signals output from the focus detection pixels. Therefore, both the image signal and the focus detection signal can be acquired with a single exposure of the image sensor 2, and the focus evaluation value and the defocus amount can each be detected from them.

(7) The CPU 9 performs focus adjustment by controlling the focus motor 10 to drive the focusing lens based on at least one of the focus evaluation value detected in step S80 and the defocus amount detected in step S60 (steps S110 and S140). Thus, focus adjustment can be performed appropriately according to the result of the determination in step S100.

(8) In step S110, the CPU 9 may use hybrid AF. When hybrid AF is used, the focus of the photographing lens 1 is first adjusted by phase difference AF based on the defocus amount, and then a focus evaluation value is detected and the focus of the photographing lens 1 is adjusted by contrast AF. In this way, accurate and quick focus adjustment can be performed by taking advantage of both phase difference AF and contrast AF.

  In the embodiment described above, the defocus amount is detected based on the focus detection signals from the focus detection pixels arranged in the image sensor 2, but a focus detection device that outputs the focus detection signals may be provided separately from the image sensor 2. That is, part of the light flux passing through the photographing lens 1 may be branched off and guided to such a focus detection device, which detects the image shift amount of the pair of images by the pupil-division phase difference detection method. In this case, the image sensor 2 need not have focus detection pixels and may contain only imaging pixels.

  In the above embodiment, a person's face is detected by face recognition processing and the focus detection area for contrast AF is set according to the detection result, but the detection target is, needless to say, not limited to a person's face and may be an animal, a plant, an industrial product, or the like. A focus detection area for contrast AF may also be set without performing such object detection; for example, an area designated by the user on the through image may be set as the focus detection area for contrast AF.

  In the above embodiment, focus adjustment is performed by selecting at least one of contrast AF and phase difference AF, but other AF methods may be selected instead; for example, either an active AF method or an external-light passive AF method may be used. The present invention is applicable to any configuration in which at least one of two different AF methods is appropriately selected according to the shooting situation and focus adjustment is performed.

FIG. 1 is a diagram showing the configuration of a digital camera according to one embodiment.
FIG. 2 is a diagram showing an example of a focus detection area for phase difference AF set in the image plane of the photographing lens.
FIG. 3 is a front view showing the detailed structure of the image sensor.
FIG. 4 is a front view showing the detail of an imaging pixel.
FIG. 5 is a front view showing the detail of a focus detection pixel.
FIG. 6 is a diagram showing the spectral sensitivity characteristics of an imaging pixel and a focus detection pixel.
FIG. 7 is a cross-sectional view of an imaging pixel.
FIG. 8 is a cross-sectional view of a focus detection pixel.
FIG. 9 is a diagram showing the configuration of a focus detection optical system of the pupil-division phase difference detection method using microlenses.
FIG. 10 is a diagram showing an example of the setting of a focus detection area.
FIG. 11 is a diagram showing an example of the characteristic curve of the focus evaluation value with respect to the focusing lens position when a person's face and the background overlap in the focus detection area.
FIG. 12 is a flowchart of the processing executed when the focus of the photographing lens is adjusted by appropriately selecting either phase difference AF or contrast AF according to the shooting conditions.
FIG. 13 is a diagram showing an example of the setting of focus detection areas for phase difference AF and a focus detection area for contrast AF in the image plane of the photographing lens.
FIG. 14 is a diagram showing an example of the setting of a search area.
FIG. 15 is a diagram showing how the position of the focusing lens changes over time in hybrid AF.

Explanation of symbols

1 Photographing lens
2 Image sensor
6 Phase difference AF unit
9 CPU
10 Focus motor
11 Face recognition calculation unit

Claims (10)

  1. A focus detection apparatus comprising:
    first focus detection means for detecting first focus detection information indicating a focus adjustment state of an optical system with respect to a first detection region set in an image plane formed by the optical system;
    second focus detection means for detecting, by a method different from that of the first focus detection means, second focus detection information indicating the focus adjustment state of the optical system with respect to a second detection region set within a range including the first detection region in the image plane; and
    determination means for determining, on the basis of the first focus detection information and the second focus detection information, whether or not to adjust the focus of the optical system in accordance with the first focus detection information.
  2. The focus detection apparatus according to claim 1, wherein
    the determination means determines not to adjust the focus of the optical system in accordance with the first focus detection information when the difference between a first in-focus distance of the optical system corresponding to the first focus detection information and a second in-focus distance of the optical system corresponding to the second focus detection information exceeds a predetermined threshold value.
  3. The focus detection apparatus according to claim 1 or 2, wherein
    the determination means does not perform the determination when the second detection region is within the first detection region.
  4. The focus detection apparatus according to claim 3, wherein
    the second focus detection means detects the second focus detection information for, among a plurality of second detection regions, the second detection region positioned in a direction in which an image formed by the optical system extends with respect to the first detection region.
  5. The focus detection apparatus according to claim 4, further comprising
    face detection means for detecting a region corresponding to a face in the image, wherein
    the first focus detection means detects the first focus detection information using the region corresponding to the face as the first detection region, and
    the second focus detection means detects the second focus detection information for a second detection region located below the region corresponding to the face.
  6. The focus detection apparatus according to claim 4 or 5, further comprising
    search range setting means for setting, based on the first detection region, a search range for the second detection region in the image, wherein
    the second focus detection means detects the second focus detection information for the second detection region located within the search range.
  7. The focus detection apparatus according to any one of claims 1 to 6, further comprising
    an imaging unit provided at the position of the image plane and having a plurality of pixels including imaging pixels, which receive light through the optical system and output image signals, and focus detection pixels, which receive a pair of light fluxes through the optical system and output a pair of focus detection signals, wherein
    the first focus detection means detects the first focus detection information based on the image signals output from the imaging pixels corresponding to the first detection region, and
    the second focus detection means detects the second focus detection information based on the pair of focus detection signals.
  8. A focus adjustment apparatus comprising:
    the focus detection apparatus according to any one of claims 1 to 7; and
    focus adjustment means for performing focus adjustment of the optical system based on at least one of the first focus detection information and the second focus detection information.
  9. The focus adjustment apparatus according to claim 8, wherein
    the first focus detection means detects the first focus detection information after focus adjustment of the optical system based on the second focus detection information has been performed by the focus adjustment means.
  10. An imaging apparatus comprising the focus adjustment apparatus according to claim 8 or 9, wherein the imaging apparatus captures an image formed by the optical system.
JP2008313479A 2008-12-09 2008-12-09 Focus detection apparatus and imaging apparatus Active JP5417827B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008313479A JP5417827B2 (en) 2008-12-09 2008-12-09 Focus detection apparatus and imaging apparatus


Publications (3)

Publication Number Publication Date
JP2010139563A true JP2010139563A (en) 2010-06-24
JP2010139563A5 JP2010139563A5 (en) 2012-08-09
JP5417827B2 JP5417827B2 (en) 2014-02-19

Family

ID=42349796

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008313479A Active JP5417827B2 (en) 2008-12-09 2008-12-09 Focus detection apparatus and imaging apparatus

Country Status (1)

Country Link
JP (1) JP5417827B2 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001264622A (en) * 2000-03-15 2001-09-26 Ricoh Co Ltd Automatic focusing device, digital camera, portable information input device, focusing position detecting method and computer readable recording medium
JP2007179029A (en) * 2005-11-30 2007-07-12 Nikon Corp Focusing device and method, and camera
JP2008070640A (en) * 2006-05-10 2008-03-27 Canon Inc Point adjustment apparatus, imaging apparatus, control method for focus adjustment apparatus and program and recording medium
JP2008070428A (en) * 2006-09-12 2008-03-27 Nikon Corp Focusing apparatus and camera

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012181324A (en) * 2011-03-01 2012-09-20 Nikon Corp Imaging apparatus
WO2012133413A1 (en) * 2011-03-31 2012-10-04 富士フイルム株式会社 Imaging device, method for controlling imaging device, and program
CN103430073A (en) * 2011-03-31 2013-12-04 富士胶片株式会社 Imaging device, method for controlling imaging device, and program
JP5468178B2 (en) * 2011-03-31 2014-04-09 富士フイルム株式会社 Imaging device, imaging device control method, and program
US8730380B2 (en) 2011-03-31 2014-05-20 Fujifilm Corporation Imaging device, method for controlling imaging device, and computer-readable storage medium
JP2013003501A (en) * 2011-06-21 2013-01-07 Nikon Corp Camera
WO2013047220A1 (en) * 2011-09-28 2013-04-04 富士フイルム株式会社 Digital camera
WO2014041859A1 (en) * 2012-09-11 2014-03-20 ソニー株式会社 Imaging control device, imaging device, and method for controlling imaging control device
US9219856B2 (en) 2012-09-11 2015-12-22 Sony Corporation Imaging control device, imaging apparatus, and control method performed by imaging control device
JPWO2014041859A1 (en) * 2012-09-11 2016-08-18 ソニー株式会社 Imaging control device, imaging device, and imaging control device control method
JP2016136271A (en) * 2016-03-01 2016-07-28 株式会社ニコン Imaging device
JP2018116296A (en) * 2018-03-07 2018-07-26 株式会社ニコン Focus detection apparatus

Also Published As

Publication number Publication date
JP5417827B2 (en) 2014-02-19

Similar Documents

Publication Publication Date Title
JP5264131B2 (en) Imaging device
US8730374B2 (en) Focus detection apparatus
JP5219865B2 (en) Imaging apparatus and focus control method
US8576329B2 (en) Focus detection apparatus and control method therefor
US8477233B2 (en) Image capturing apparatus
JP2004191629A (en) Focus detector
CN1332263C (en) Camera
JP5169499B2 (en) Imaging device and imaging apparatus
JP2008152012A (en) Imaging element, focus detection device and imaging apparatus
JP5468178B2 (en) Imaging device, imaging device control method, and program
US9100556B2 (en) Image processing apparatus and image processing method
JP5066851B2 (en) Imaging device
JP5458475B2 (en) Focus detection apparatus and imaging apparatus
JP5247044B2 (en) Imaging device
US8018524B2 (en) Image-pickup method and apparatus having contrast and phase difference forcusing methods wherein a contrast evaluation area is changed based on phase difference detection areas
JP5012495B2 (en) Imaging element, focus detection device, focus adjustment device, and imaging device
US20120147227A1 (en) Image pickup apparatus and control method thereof
US20090009651A1 (en) Imaging Apparatus And Automatic Focus Control Method
US8259215B2 (en) Image pickup apparatus having focus control using phase difference detection
EP2430487B1 (en) Focus detection apparatus
JP5034556B2 (en) Focus detection apparatus and imaging apparatus
JP6080417B2 (en) Image processing apparatus and image processing method
EP2615825A2 (en) Image processing apparatus, image sensing appparatus, control method, program, and recording medium
US8175447B2 (en) Image pickup apparatus and control method therefor
JP2013145314A5 (en)

Legal Events

Date Code Title Description
20111206 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
20120127 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
20120619 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
20121030 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
20121113 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
20130115 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
20130723 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
20130924 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
TRDD Decision of grant or rejection written
20131022 A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
20131104 A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R150 Certificate of patent or registration of utility model (Ref document number: 5417827; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150)
R250 Receipt of annual fees (JAPANESE INTERMEDIATE CODE: R250; recorded four times)