JP2013113857A - Imaging device, and control method therefor

Publication number
JP2013113857A
JP2013113857A (application JP2011256745A)
Authority
JP
Japan
Prior art keywords
focus detection
imaging
mode
pixel
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2011256745A
Other languages
Japanese (ja)
Inventor
Keisuke Kudo
圭介 工藤
Original Assignee
Canon Inc
キヤノン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc, キヤノン株式会社 filed Critical Canon Inc
Priority to JP2011256745A
Publication of JP2013113857A

Abstract

PROBLEM TO BE SOLVED: To select an optimum AF method according to the AF frame position in an imaging apparatus having an image sensor in which focus detection pixels are arranged, and to shorten the AF processing time while suppressing the influence on the through image.
SOLUTION: An imaging apparatus has imaging means including imaging pixels that photoelectrically convert light from an imaging optical system to generate a subject image, and focus detection pixels that receive light passing through a partial area of the exit pupil of the imaging optical system. The apparatus switches between a first mode that reads out a first area of the imaging means and a second mode that reads out an area narrower than the first area at a higher speed than the first mode. When no focus detection pixels are included in the AF frame set in the imaging screen, AF is performed by the contrast detection method in the second mode. When focus detection pixels are included in the AF frame, the first mode is set, AF is performed based on signals from the focus detection pixels, and contrast-detection AF is then performed based on that focus detection result.
[Representative drawing] FIG. 9

Description

  The present invention relates to an imaging apparatus such as a digital camera or a video camera, and more particularly to an imaging apparatus that performs automatic focus detection (AF).
  Some imaging apparatuses having a solid-state image sensor such as a CCD or CMOS sensor have a live view function: image signals read continuously from the image sensor are output sequentially to a display device, such as a liquid crystal display on the back of the camera, so that the subject image can be confirmed. As general automatic focus detection/adjustment methods that use the light beam passing through the photographing lens, there are the contrast detection method (also called the blur method) and the phase difference detection method (also called the shift method).
  The contrast detection method is often used in video movie equipment (camcorders) for moving image shooting and in electronic still cameras; the image sensor itself serves as the focus detection sensor. In this method, an evaluation value is generated from the output signal of the image sensor, in particular from its high-frequency component (contrast) information, and the photographing lens position where the evaluation value is largest is taken as the in-focus position. In the variant called the hill-climbing method, the evaluation value must be obtained while moving the photographing lens in small steps, and the lens must keep moving until the evaluation value is found to have passed its maximum. The method is therefore not suited to high-speed AF operation.
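The hill-climbing search described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `capture(pos)` function, the step size, and the scene are all hypothetical, and the evaluation value is approximated by the sum of squared neighbor differences, a simple measure of high-frequency energy.

```python
def focus_measure(image_row):
    """Contrast evaluation value: the sum of squared differences between
    neighboring pixels approximates the high-frequency component energy."""
    return sum((b - a) ** 2 for a, b in zip(image_row, image_row[1:]))

def hill_climb_af(capture, start=0.0, step=0.5, max_steps=100):
    """Move a (hypothetical) focus lens in small steps until the
    evaluation value passes its peak, then return the best position.
    `capture(pos)` returns image data taken at lens position `pos`."""
    pos = start
    best_pos, best_val = pos, focus_measure(capture(pos))
    for _ in range(max_steps):
        pos += step
        val = focus_measure(capture(pos))
        if val > best_val:
            best_pos, best_val = pos, val
        elif val < best_val:
            break  # evaluation value has passed its maximum: stop searching
    return best_pos
```

Note that the loop cannot stop until the peak has been passed, which is exactly why the text calls this method unsuited to high-speed AF.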
  There is also a method of performing high-speed AF by reading out only the pixels in a partial region of the image sensor during contrast-detection AF, thereby increasing the frame rate (Patent Document 1).
  On the other hand, the phase difference detection method is often used in single-lens reflex cameras. In this method, the light beam that has passed through the exit pupil of the photographing lens is divided in two, and the two divided beams are received by a pair of focus detection sensors. The focus deviation of the photographing lens is obtained directly by detecting the deviation between the output signals corresponding to the received light amounts, that is, the relative positional deviation in the beam-splitting direction. Once the focus detection sensor has performed an accumulation operation, both the amount and the direction of the focus shift are known, so high-speed focus adjustment is possible. However, to divide the beam that has passed through the exit pupil in two and obtain a signal for each beam, an optical path dividing means such as a quick-return mirror or half mirror must be placed in the imaging optical path, and a focus detection optical system and AF sensor are generally provided beyond it. The apparatus therefore tends to be large and expensive. Moreover, during live view the quick-return mirror is normally retracted, so focus detection by the phase difference detection method cannot be performed.
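The correlation operation at the heart of phase difference detection can be sketched as follows, assuming two 1-D image signals from the pair of focus detection sensors. The SAD (sum of absolute differences) criterion and the search range are illustrative choices, not the patent's specific method; the returned shift (with sign) is the phase difference from which defocus amount and direction follow.

```python
def phase_difference(img_a, img_b, max_shift=8):
    """Return the relative shift, in pixels with sign (= direction), that
    best aligns the two image signals, by minimizing the sum of absolute
    differences over the overlapping region at each candidate shift."""
    best_shift, best_sad = 0, float("inf")
    n = len(img_a)
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)
        sad = sum(abs(img_a[i] - img_b[i - s]) for i in range(lo, hi))
        sad /= (hi - lo)  # normalize by overlap length to avoid shift bias
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift
```

A single correlation pass yields both the amount and the direction of the shift, which is why a phase-difference sensor needs only one accumulation to know where to drive the lens.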
  In order to solve the above problem, a technique has been proposed that gives the image sensor itself a phase difference detection function, eliminating the need for a dedicated AF sensor and realizing high-speed phase-difference AF. For example, in Patent Document 2, some of the light receiving elements (pixels) of the image sensor are given a pupil division function by decentering the sensitive region of the light receiving unit with respect to the optical axis of the on-chip microlens. These pixels are used as focus detection pixels and are arranged at predetermined intervals among the imaging pixel groups to perform phase-difference focus detection. Since the locations where the focus detection pixels are placed correspond to defects in the imaging pixel array, image information there is generated by interpolation from the surrounding imaging pixels.
Patent Document 1: JP 2007-72254 A
Patent Document 2: JP 2000-156823 A
  However, in the conventional technique of Patent Document 1, only a partial area of the image sensor is read during the AF operation, so a through image of the subject cannot be displayed. The display either blacks out or leaves part of the area un-updated, and the user cannot check the through image during AF.
  Further, in the conventional technique of Patent Document 2, generating image information by interpolation from the surrounding imaging pixels at the positions of the focus detection pixels takes time, so focus detection pixels can be placed in only part of the image sensor.
  An object of the present invention is, in an imaging apparatus having an image sensor in which focus detection pixels are arranged, to select the optimum AF method according to the AF frame position and to shorten the AF processing time while suppressing the influence on the through image.
  In order to achieve the above object, a first aspect of the present invention is an imaging apparatus to which an interchangeable lens having an imaging optical system can be attached and detached. The apparatus has: imaging means having imaging pixels that photoelectrically convert light from the imaging optical system to generate a subject image and focus detection pixels that receive light passing through a partial area of the exit pupil of the imaging optical system; first focus detection means that performs focus detection of the imaging optical system based on signals from the focus detection pixels; second focus detection means that calculates a focus signal based on the high-frequency component of the signal from the imaging means and performs focus detection of the imaging optical system based on the focus signal; and control means that switches between a first mode for reading out a first area of the imaging means and a second mode for reading out an area narrower than the first area at a higher speed than the first mode. If no focus detection pixel is included in the focus detection area set in the imaging screen, the control means sets the second mode and focus detection is performed by the second focus detection means. If a focus detection pixel is included in the focus detection area, the control means sets the first mode, focus detection is performed by the first focus detection means, and, based on that focus detection result, focus detection is performed by the second focus detection means.
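The control flow of this first aspect can be sketched as follows. All function names are hypothetical placeholders for the first and second focus detection means and the mode-switching control means; the point is only the branching on whether the focus detection area (AF frame) contains focus detection pixels.

```python
def run_af(af_frame_has_focus_pixels, set_mode, phase_diff_af, contrast_af):
    """Illustrative control flow (names are placeholders, not the patent's API):
    - no focus detection pixels in the AF frame -> fast partial-readout
      mode 2, contrast-detection AF only;
    - focus detection pixels present -> full-area mode 1, phase-difference
      AF from the focus detection pixels, then contrast AF to refine."""
    if not af_frame_has_focus_pixels:
        set_mode(2)
        return contrast_af(start_hint=None)
    set_mode(1)
    hint = phase_diff_af()          # coarse result from focus detection pixels
    return contrast_af(start_hint=hint)  # refine around the phase-diff result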
  A second aspect of the present invention is a control method for an imaging apparatus to which an interchangeable lens having an imaging optical system can be attached and detached, and which has imaging means having imaging pixels that photoelectrically convert light from the imaging optical system to generate a subject image and focus detection pixels that receive light passing through a partial area of the exit pupil of the imaging optical system. The method has: a first focus detection step of performing focus detection of the imaging optical system based on signals from the focus detection pixels; a second focus detection step of calculating a focus signal based on the high-frequency component of the signal from the imaging means and performing focus detection of the imaging optical system based on the focus signal; and a control step of switching between a first mode in which a first area of the imaging means is read out and a second mode in which an area narrower than the first area is read out at a higher speed than the first mode. If no focus detection pixel is included in the focus detection area, the second mode is set in the control step and focus detection is performed by the second focus detection step. If a focus detection pixel is included in the focus detection area, the first mode is set in the control step, focus detection is performed by the first focus detection step, and, based on that focus detection result, focus detection is performed by the second focus detection step.
  According to the present invention, in an imaging apparatus having an image sensor in which focus detection pixels are arranged, the optimal AF method can be selected according to the AF frame position, and the AF processing time can be shortened while suppressing the influence on the through image.
FIG. 1 is a cross-sectional view of a camera that is Embodiment 1 of the present invention.
FIG. 2 is a front view and sectional view showing the arrangement of imaging pixels in an image sensor.
FIG. 3 is a front view and sectional view showing the arrangement of focus detection pixels in an image sensor.
FIG. 4 is a front view showing an example of the arrangement of imaging pixels and focus detection pixels in an image sensor.
FIG. 5 is a front view showing another example of the arrangement of imaging pixels and focus detection pixels in an image sensor.
FIG. 6 is a front view showing the arrangement of imaging pixels and focus detection pixels in the image sensor used in the camera of Embodiment 1.
FIG. 7 is a diagram explaining the processing for averaging the outputs of focus detection pixels in the camera of Embodiment 1.
FIG. 8 is a block diagram illustrating the electrical configuration of the camera of Embodiment 1.
FIG. 9 is a flowchart showing the operation of the camera of Embodiment 1.
FIG. 10 is a diagram illustrating the positional relationship between the focus detection pixel arrangement region and the AF frame in the camera of Embodiment 1.
FIG. 11 is a diagram illustrating the change in readout area due to frame rate switching in the camera of Embodiment 1.
FIG. 12 is a flowchart showing the operation of the camera of Embodiment 2.
FIG. 13 is a flowchart showing the operation of the camera of Embodiment 3.
FIG. 14 is a flowchart showing the operation of the camera of Embodiment 4.
  Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings.
  FIG. 1 shows a configuration of a single-lens reflex digital camera as an imaging apparatus of the present invention. FIG. 1A shows the state of the optical finder mode, and FIG. 1B shows the state during the live view mode and the recording image capturing operation. In FIG. 1, an interchangeable lens 102 having an imaging optical system is detachably mounted on the front surface of a camera body 101. The camera body 101 and the interchangeable lens 102 are electrically connected via a mount contact group 112. In the interchangeable lens 102, a photographing lens group 119 and a diaphragm 113 are provided as a photographing optical system. The photographic lens group 119 is illustrated as a single lens in FIG. 1, but actually includes a plurality of lenses. The photographing lens group 119 includes a focus lens for performing focus adjustment. The diaphragm 113 can adjust the amount of light taken into the camera body 101.
  The main mirror 103 is composed of a half mirror. In the optical finder mode, the main mirror 103 is disposed obliquely on an optical path from the photographing optical system (hereinafter referred to as an imaging optical path), and reflects a part of the light beam from the photographing optical system to guide it to the finder optical system. In the optical viewfinder mode, the light beam that has passed through the main mirror 103 is reflected by the sub mirror 104 disposed behind the main mirror 103 and guided to the AF sensor 105. In the live view mode, the main mirror 103 and the sub mirror 104 are retracted outside the imaging optical path.
  The AF sensor 105 is a light receiving element that is provided as a separate body from an imaging device described later, and realizes AF of a sensor-specific phase difference detection method. The AF sensor 105 includes a secondary imaging lens (not shown) that divides the light beam from the photographing optical system via the sub mirror 104 into two and forms two images on the two divided light beams. The AF sensor 105 includes a pair of line sensors that photoelectrically convert the two images (hereinafter referred to as a line sensor pair). Each line sensor is configured by arranging a plurality of photoelectric conversion elements in a line. The line sensor pair photoelectrically converts two images and outputs two image signals corresponding to the two images.
  The two image signals are sent to the imaging control circuit 202, described later with reference to FIG. 8. The imaging control circuit 202 performs a correlation operation on the two image signals and calculates the phase difference (hereinafter, the AF sensor phase difference) as the deviation amount (including direction) between them. The imaging control circuit 202 then calculates a defocus amount (including direction) indicating the focus state of the imaging optical system based on the AF sensor phase difference. In practice, not one but a plurality of line sensor pairs are provided, corresponding to a plurality of focus detection areas in the imaging screen. The defocus amount is calculated from the two image signals of the line sensor pair corresponding to the one focus detection area, among the plurality, selected automatically by the imaging control circuit 202 or by the photographer. In this case, the AF sensor 105 and the imaging control circuit 202 function as third focus detection means.
  The imaging control circuit 202 calculates a driving amount (including a direction) for obtaining a focused state of the focus lens in the interchangeable lens 102 based on the defocus amount. By moving the focus lens by the driving amount, an in-focus state by AF (focus control) of the sensor-specific phase difference detection method can be obtained in the optical finder mode.
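The chain from phase difference to lens drive can be sketched as follows. Both constants are assumptions for illustration: the conversion coefficient (which in practice depends on the base length of the pupil division) and the focus sensitivity (image-plane shift per unit focus lens movement, which a real interchangeable lens reports to the body).

```python
def defocus_from_phase(shift_px, pixel_pitch_um, conversion_k):
    """Defocus amount in mm from a phase difference in pixels.
    `conversion_k` stands in for the pupil-division base-length factor;
    the sign of `shift_px` carries the defocus direction."""
    return shift_px * pixel_pitch_um * 1e-3 * conversion_k

def lens_drive(defocus_mm, sensitivity):
    """Focus lens drive amount (sign = direction) from the defocus amount.
    `sensitivity` is image-plane movement per unit lens movement."""
    return defocus_mm / sensitivity
```

Because both amount and direction come out of one calculation, the lens can be driven straight to (near) focus without the trial-and-error of hill climbing.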
  The image sensor 108 is configured by a CCD sensor, a CMOS sensor, or the like. A low-pass filter 106 and a focal plane shutter 107 are provided on the front surface of the image sensor 108.
  As described in the background art, contrast detection AF is a method of performing AF based on high-frequency component information (contrast information) of the output signal of the image sensor 108. The imaging control circuit 202 generates an evaluation value (focus signal) using information on the high-frequency component of the output signal of the image sensor 108, and sets the position of the focus lens where the evaluation value is the largest as the in-focus position. In this case, the imaging control circuit 202 functions as a second focus detection unit.
  The focus plate 109 is disposed on the planned image plane of the photographing optical system. The pentaprism 110 is a member for changing the finder optical path. The photographer can observe the subject by looking at the focus plate 109 through the eyepiece 114. The focus plate 109, the pentaprism 110, and the eyepiece 114 constitute a finder optical system.
  The AE sensor 111 performs photometry using a part of the light flux from the pentaprism 110.
  The release button 115 can be pressed halfway (first stroke) and fully (second stroke). When the release button 115 is pressed halfway, imaging preparation processing such as AE and AF is performed; when it is fully pressed, the image sensor 108 is exposed and imaging processing for acquiring a recording image is performed. In the following description, the half-press operation of the release button 115 is referred to as SW1-ON and the full-press operation as SW2-ON.
  The live view start / end button 116 is a button for switching between the optical finder mode and the live view mode each time it is operated. In the optical finder mode, as described above, AF of the sensor-specific phase difference detection method is performed using the AF sensor 105 with the main mirror 103 lowered as shown in FIG. On the other hand, in the live view mode, the light beam from the photographing optical system is directly guided to the image sensor 108 with the main mirror 103 raised as shown in FIG. A live view image (image data) generated based on an output signal from the image sensor 108 is displayed on a display element 117 such as a liquid crystal display provided on the back surface of the camera body 101. Thereby, the photographer can observe the subject without using the finder optical system.
  In the live view mode, the light beam from the photographing optical system is not guided to the AF sensor 105. However, as will be described later, AF can be performed by a phase difference detection method (hereinafter referred to as a sensor-integrated phase difference detection method) using output signals from some pixels of the image sensor 108.
  AF by the sensor-integrated phase difference detection method will now be described. In this embodiment, the image sensor 108 has an imaging pixel group that photoelectrically converts the subject image formed by the light beam from the photographing optical system and outputs image signals used to generate live view images and recording images. In addition, the image sensor 108 includes a focus detection pixel group that photoelectrically converts the two images formed from the light beam from the photographing optical system by a pupil division function described later.
  The imaging pixel group and the focus detection pixel group will be described with reference to FIG. FIG. 2A shows an imaging pixel group of 2 rows × 2 columns. In this embodiment, a two-dimensional single-plate CMOS color image sensor in which primary color filters of R (Red), G (Green), and B (Blue) are Bayer arranged is used as the image sensor 108. In the Bayer array, G pixels are arranged diagonally among 4 pixels of 2 rows × 2 columns, and R and B pixels are arranged as the other two pixels. The pixel arrangement of 2 rows × 2 columns is repeated on the entire image sensor 108.
  FIG. 2B shows a cross section AA of the imaging pixel shown in FIG. ML is a microlens arranged in front of each pixel, CFR is an R color filter, and CFG is a G color filter. PD schematically indicates a photoelectric conversion unit of the CMOS sensor. CL is a wiring layer for forming signal lines for transmitting various signals in the CMOS sensor. TL schematically shows the imaging optical system.
  The microlens ML and the photoelectric conversion unit PD of the imaging pixel are configured to capture the light flux that has passed through the imaging optical system TL as effectively as possible. That is, the exit pupil EP of the imaging optical system TL and the photoelectric conversion unit PD are placed in a conjugate relationship by the microlens ML, and the effective area of the photoelectric conversion unit PD is set large.
  Further, FIG. 2B shows the incident light beam to the R pixel, but the light beam that has passed through the imaging optical system TL also enters the G pixel and the B pixel. Therefore, the diameter of the exit pupil EP corresponding to each of the RGB imaging pixels is increased, and the light flux from the subject can be efficiently captured. Thereby, it is possible to improve the S / N of the imaging signal used for generating the live view image and the recording image.
  FIG. 3A is a plan view of a 2-row × 2-column pixel group including focus detection pixels that divide the pupil of the imaging optical system in the horizontal (lateral) direction. G pixels output the G imaging signal, which is the main component of luminance information. Because human image recognition is sensitive to luminance information, image quality degradation is easily noticed when G pixels are missing. Conversely, the R and B pixels mainly acquire color information, and because humans are insensitive to color information, degradation is hard to notice even if some of the pixels that acquire color information are missing. For this reason, in this embodiment, among the 2 × 2 imaging pixels shown in FIG. 2A, the G pixels are left as imaging pixels and some of the R and B pixels are replaced with focus detection pixels. In FIG. 3A, the focus detection pixels are denoted SA and SB.
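The substitution rule can be sketched as a pixel-type map: keep every G pixel of the repeating RGGB Bayer tile and replace selected R or B pixels with focus detection pixels. The map representation and the position format are illustrative only, not the sensor's actual addressing.

```python
def bayer_with_af_pixels(rows, cols, af_positions):
    """Build a pixel-type map: a standard RGGB Bayer pattern in which the
    R or B pixel at each (row, col) in `af_positions` is replaced by a
    focus detection pixel ('SA' or 'SB'). G pixels are never replaced,
    since missing G (luminance) pixels are the most visible."""
    bayer = [['R', 'G'], ['G', 'B']]  # 2x2 tile, repeated over the sensor
    grid = [[bayer[r % 2][c % 2] for c in range(cols)] for r in range(rows)]
    for (r, c), kind in af_positions.items():
        assert grid[r][c] in ('R', 'B'), "G pixels are kept for image quality"
        grid[r][c] = kind
    return grid
```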
  FIG. 3B shows a cross section BB of the focus detection pixel shown in FIG. The microlens ML and the photoelectric conversion unit PD in the focus detection pixel are the same as the microlens ML and the photoelectric conversion unit PD of the imaging pixel illustrated in FIG.
  In this embodiment, since the pixel signals from the focus detection pixels are not used to generate live view or recording images, a transparent film CFW (White) is disposed in place of the color filter in each focus detection pixel. Further, to give the focus detection pixels a pupil division function, the openings OPHA and OPHB of the wiring layer CL (hereinafter, aperture openings) are displaced to one side with respect to the center line of the microlens ML.
  Specifically, in the focus detection pixel SA, the aperture opening OPHA is arranged to be biased to the right side, and receives the light beam that has passed through the left exit pupil region EPHA among the exit pupils of the imaging optical system TL. On the other hand, the aperture opening OPHB of the focus detection pixel SB is arranged to be biased to the left side (that is, opposite to the aperture opening OPHA) and passes through the right exit pupil region EPHB of the exit pupil of the imaging optical system TL. The received light beam is received. Thereby, the focus detection pixels SA and SB can receive two images of the same subject and photoelectrically convert them.
  A plurality of focus detection pixels SA are regularly arranged in the horizontal direction, and one of the two images, formed on this focus detection pixel group SA, is defined as the A image. Likewise, a plurality of focus detection pixels SB are regularly arranged in the horizontal direction, and the other of the two images, formed on this focus detection pixel group SB, is defined as the B image. The A and B images are photoelectrically converted by the focus detection pixel groups SA and SB, and the imaging control circuit 202 calculates the phase difference (hereinafter, the focus detection pixel phase difference) as the shift amount (including direction) between the image signals output from the groups SA and SB. From the focus detection pixel phase difference, a defocus amount (including direction) indicating the focus state of the photographing optical system can be obtained. In this case, the imaging control circuit 202 functions as first focus detection means.
  When detecting the phase difference between the A and B images in the vertical (longitudinal) direction, it suffices to bias the aperture opening OPHA of the focus detection pixel SA toward the upper side and the aperture opening OPHB of the focus detection pixel SB toward the lower side; that is, to rotate the focus detection pixels SA and SB shown in FIG. 3A by 90 degrees.
  FIG. 4 shows an arrangement example of the imaging pixel group and the focus detection pixel group on the image sensor. In the camera of this embodiment, to raise the frame rate at which images are displayed on the display element 117 in the live view mode, the number of pixels read out in each of the H (horizontal) and V (vertical) directions is thinned to 1/3 of all pixels in that direction. The imaging pixels marked G, R, and B in the figure are the pixels whose signals are read out during thinning. The unmarked white pixels are not read out during thinning, but their signals are read out when all pixels are read (for example, when a recording image is acquired).
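The 1/3 thinning readout can be sketched as simple sub-sampling. A real sensor does this with row/column select lines rather than array slicing, but the selection pattern is the same: every third pixel in both directions, so roughly 1/9 of the data per frame and a correspondingly higher frame rate.

```python
def thinned_readout(frame, factor=3):
    """Read out every `factor`-th pixel in both the V and H directions,
    as in the 1/3 thinning used to raise the live view frame rate.
    `factor=1` corresponds to full readout (e.g. for a recording image)."""
    return [row[::factor] for row in frame[::factor]]
```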
  Further, the focus detection pixel groups SA and SB are arranged so that their pixel signals can still be read out in this thinned state. Considering that the focus detection pixels are not used for image generation, in this embodiment the groups SA and SB are arranged discretely, at intervals in the H and V directions. It is also desirable not to place focus detection pixels at G pixel positions, so that the pixel defects on the image caused by the groups SA and SB (that is, image degradation) are less noticeable.
  In the present embodiment, one pair of focus detection pixels SA and SB is arranged in each block of 4 rows × 4 columns of pixels at the time of thinning (12 rows × 12 columns when not thinned), indicated by a thick black frame in FIG. 4. BLOC_H(i, j) in the figure is the block name. An array of 4 × 4 blocks is treated as one pixel unit.
  Within a pixel unit, in blocks at the same X position but different Y positions, the focus detection pixels SA and SB are shifted in the horizontal direction by one pixel (three pixels when not thinned), as indicated by the arrows in the figure. This improves the sampling characteristics of the discretely arranged focus detection pixel groups: since the pixels SA and SB are pupil-divided in the X direction, the shift is made in one-pixel steps so that sampling is dense in the X direction. For the same reason, in blocks at the same Y position but different X positions, the focus detection pixels SA and SB are shifted in the vertical direction by one pixel (three pixels when not thinned).
  A plurality of such pixel units are arranged over the entire surface of the image sensor at positions corresponding to the plurality of focus detection areas. As described above, the pixel signals (that is, the two image signals) are read from the focus detection pixels SA and SB corresponding to the selected focus detection area. FIG. 4 shows an example in which image signals in the horizontal direction are acquired and used for AF; when image signals in the vertical direction are to be acquired, a pixel arrangement such as that in FIG. 5, in which the horizontal and vertical directions of FIG. 4 are transposed, may be used.
  Further, when the focus detection pixels are arranged in a checkered pattern combining the arrangements of FIGS. 4 and 5, the arrangement becomes as shown in FIG. 6, and image signals can be obtained in both the horizontal and vertical directions.
  In the actual AF calculation, as shown in FIG. 7, a plurality of focus detection pixels within one pixel unit are treated as one pixel, and their pixel signals are added and averaged; this improves the S/N of the resulting one-pixel output. In the example of FIG. 7, the pixel signals of two focus detection pixels are averaged to obtain the output of one pixel, but the number of focus detection pixels to be averaged is arbitrary. The image sensor 108 of this embodiment has the pixel arrangement shown in FIG. 6 and can perform AF by the sensor-integrated phase difference detection method.
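The averaging step can be sketched as follows; the flat list of samples and the grouping are illustrative, with the group size arbitrary as noted above. Averaging N pixels reduces uncorrelated noise by roughly a factor of sqrt(N), which is the S/N benefit described.

```python
def average_af_outputs(samples, group_size=2):
    """Treat each group of `group_size` focus detection pixel signals
    within a pixel unit as one pixel by averaging them, improving the
    S/N of the per-'pixel' output used in the correlation calculation."""
    return [sum(samples[i:i + group_size]) / group_size
            for i in range(0, len(samples) - group_size + 1, group_size)]
```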
  As described above, in the camera of the present embodiment, AF by the sensor-specific phase difference detection method can be performed in the optical finder mode, and AF by the sensor integrated phase difference detection method can be performed in the live view mode.
  Next, the electrical configuration of the camera of this embodiment will be described with reference to FIG. The components shown in FIG. 1 are denoted by the same reference numerals in FIG.
  In FIG. 8, the light beam incident from the photographing optical system passes through the diaphragm 113 and reaches the main mirror 103. Illustration of the sub mirror 104 is omitted in FIG. 8. The main mirror 103 switches between the optical finder mode, in which it separates the incident beam into a reflected beam and a transmitted beam and guides them to the AE sensor 111 and the AF sensor 105 respectively, and the live view mode, in which the main mirror 103 (and the sub mirror 104) are retracted out of the imaging optical path.
  In the live view mode, the shutter 107 is opened. For this reason, the incident light beam reaches the image sensor 108 as it is. The analog image signal output from the image sensor 108 is converted into a digital signal by the A / D converter 207 and input to the signal processing circuit 208 serving as an image generation unit. The signal processing circuit 208 generates a color image signal by performing various kinds of signal processing on the imaging signal. The color image signal is sent to the display circuit 209 at a predetermined cycle. Thereby, a live view image (moving image) is displayed on the display element 117 shown in FIG.
  At the time of imaging, in response to SW2-ON, a color image signal serving as a recording image (a still image, with a larger number of pixels than the live view image) is generated in the same manner as in the live view mode and sent to the recording circuit 210. As a result, the recording image is recorded on a recording medium (not shown), such as a semiconductor memory or an optical disk. The color image signal serving as the recording image is also displayed on the display element 117 via the display circuit 209 for a predetermined time.
  In the live view mode, the signal processing circuit 208 calculates a contrast evaluation value from the imaging signal and sends it to the imaging control circuit 202. The imaging control circuit 202 searches for the position at which the contrast evaluation value reaches a peak while driving the focus lens included in the interchangeable lens 102 via the lens driving circuit 206, and drives the focus lens to the lens position giving that peak value.
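The peak search performed by the imaging control circuit 202 can be illustrated with a simple hill-climbing sketch (hypothetical Python; `evaluate` stands in for acquiring a contrast evaluation value at a given lens position):

```python
def find_contrast_peak(evaluate, start, step, max_steps=100):
    """Drive the (simulated) focus lens in fixed steps and stop once the
    contrast evaluation value begins to fall, meaning a peak was passed;
    the position giving the peak value is returned."""
    pos, prev = start, evaluate(start)
    for _ in range(max_steps):
        nxt = pos + step
        val = evaluate(nxt)
        if val < prev:  # value fell: the previous position was the peak
            return pos
        pos, prev = nxt, val
    return pos
```

A real implementation would also reverse direction and refine with smaller steps near the peak; this sketch only shows the basic climb toward the peak.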
  In the live view mode, the signal processing circuit 208 sends two image signals obtained by the focus detection pixel group on the image sensor 108 to the imaging control circuit 202. The imaging control circuit 202 performs a correlation operation on the two image signals from the focus detection pixel group, and calculates a focus detection pixel phase difference between the two image signals. The imaging control circuit 202 calculates the defocus amount of the focus lens based on the focus detection pixel phase difference, and drives the focus lens via the lens drive circuit 206 based on the defocus amount.
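The correlation operation on the two image signals can be sketched as a search for the shift that minimises the sum of absolute differences (a common formulation; the patent does not specify the exact correlation formula, so this is an illustrative assumption):

```python
def phase_difference(img_a, img_b, max_shift=4):
    """Return the relative shift between two focus-detection image
    signals that minimises the sum of absolute differences (SAD) over
    the overlapping samples; this shift is the phase difference from
    which the defocus amount is derived."""
    n = len(img_a)
    best_shift, best_sad = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)
        # normalise by overlap length so short overlaps are not favoured
        sad = sum(abs(img_a[i] - img_b[i - s]) for i in range(lo, hi)) / (hi - lo)
        if sad < best_sad:
            best_shift, best_sad = s, sad
    return best_shift
```

The defocus amount would then be this phase difference multiplied by a conversion coefficient determined by the optical system.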
  In the optical finder mode, the AE sensor 111 sends a photometric signal to the imaging control circuit 202. Further, the AF sensor 105 sends two image signals obtained by the line sensor pair to the imaging control circuit 202.
  The imaging control circuit 202 determines the aperture diameter of the diaphragm 113 and the shutter speed for the imaging operation based on the photometric signal obtained from the AE sensor 111. In response to SW2-ON during imaging, the imaging control circuit 202 controls the diaphragm 113 and the shutter 107 via the diaphragm driving circuit 205 and the shutter (SH) driving circuit 203 in accordance with the determined aperture diameter and shutter speed. The aperture driving circuit 205 is provided in the interchangeable lens 102.
  Further, in the optical finder mode, in response to SW1-ON, the imaging control circuit 202 performs a correlation operation on the two image signals from the AF sensor 105 as described above and calculates the phase difference (AF sensor phase difference) between the two image signals. The imaging control circuit 202 calculates the defocus amount of the focus lens based on the AF sensor phase difference, and drives the focus lens via the lens drive circuit 206 based on the defocus amount. The lens driving circuit 206 is provided in the interchangeable lens 102.
  The imaging control circuit 202 functions as a first focus information calculation unit, a second focus information calculation unit, a correction value calculation unit, and a control unit.
  Next, the operation of the camera in the live view mode of this embodiment will be described using the flowchart shown in FIG. 9. This operation is executed in accordance with a computer program stored in the imaging control circuit 202.
  In step S901, the imaging control circuit 202 determines whether or not the focus detection pixel is included in the position of the AF frame, and if not included, the process proceeds to step S902.
  FIG. 10 shows the arrangement of the focus detection pixels, for a case where they are arranged over 50% of the width and 50% of the height of the live view imaging region. In FIG. 10A, the AF frame is at the center, so focus detection pixels are included at the position of the AF frame. On the other hand, when the AF frame is near the periphery as shown in FIG. 10B, no focus detection pixels are included.
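The determination of step S901 reduces to an overlap test between the AF frame and the focus-detection pixel region (a hypothetical sketch; the coordinates and the 50% x 50% central region are illustrative values taken from the FIG. 10 example):

```python
def af_frame_has_focus_pixels(af_frame, fd_region):
    """Each region is (left, top, right, bottom) in sensor coordinates.
    True when the AF frame overlaps the region in which the focus
    detection pixels are arranged."""
    al, at, ar, ab = af_frame
    fl, ft, fr, fb = fd_region
    return al < fr and fl < ar and at < fb and ft < ab

# Focus detection pixels over the central 50% x 50% of a 100 x 100 region:
FD_REGION = (25, 25, 75, 75)
```

A central AF frame such as (45, 45, 55, 55) overlaps FD_REGION (the FIG. 10A case), while a peripheral frame such as (0, 0, 10, 10) does not (the FIG. 10B case).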
  In step S902, the imaging control circuit 202 switches the frame rate; for example, driving at 60 fps is switched to 120 fps. When switching to the high-speed frame rate, the readout time is no longer sufficient to read the entire area, so only a partial area is read, chosen so as to include the position of the AF frame. FIG. 11 shows the read area when the frame rate is switched: in FIG. 11A the entire area (first area) is read (first mode), whereas in FIG. 11B, after switching to the high-speed frame rate, only a partial area is read (second mode). Consequently, live view display of the entire area is not possible while the high-speed frame rate is in effect; during that period the display either blacks out or only the partial area is updated. By switching to the high-speed frame rate, however, the contrast evaluation value can be acquired at a shorter cycle, so the processing time for contrast AF can be shortened.
  In step S903, the imaging control circuit 202 controls the lens driving circuit 206 to drive the lens by a predetermined amount and acquires a contrast evaluation value, and in step S904 it determines whether the contrast evaluation value has reached a peak. If not, steps S903 and S904 are repeated until it does. If a contrast peak is detected, the process proceeds to step S905.
  In step S905, the imaging control circuit 202 controls the lens driving circuit 206 so as to drive the lens to the lens position where the contrast evaluation value has a peak, and the process proceeds to step S906.
  In step S906, since AF has been completed, the imaging control circuit 202 returns the frame rate from the high-speed rate to the normal live view frame rate, and the entire area is displayed again.
  On the other hand, if it is determined in step S901 that a focus detection pixel is included at the position of the AF frame, the process proceeds to step S907, where the defocus amount and defocus direction are acquired by the correlation calculation on the two image signals from the focus detection pixels described above, and the process then proceeds to step S908.
  Here, when a focus detection pixel is included at the position of the AF frame, contrast AF is performed with the defocus amount and defocus direction (the focus detection result) already known, so high-speed AF can be performed even without switching to the high-speed frame rate. Further, since focus detection can be performed with the main mirror 103 up, the live view display neither blacks out nor is restricted to updating only a partial area.
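This hybrid sequence — a coarse move using the known defocus amount and direction, followed by a short contrast refinement — can be sketched as follows (hypothetical Python; `evaluate` again stands in for acquiring a contrast evaluation value at a lens position):

```python
def hybrid_af(current_pos, defocus, evaluate, fine_step=1):
    """Drive the (simulated) lens by the phase-difference defocus amount,
    then check neighbouring positions and keep whichever gives the
    highest contrast evaluation value."""
    pos = current_pos + defocus       # coarse move straight toward focus
    best_pos, best_val = pos, evaluate(pos)
    for cand in (pos - fine_step, pos + fine_step):
        val = evaluate(cand)
        if val > best_val:
            best_pos, best_val = cand, val
    return best_pos
```

Because the defocus direction is known in advance, no trial drive in the wrong direction is needed, which is why this path is faster than pure contrast AF.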
  In step S908, the imaging control circuit 202 controls the lens driving circuit 206 to drive the lens in the defocus direction acquired in step S907, and in step S909 it determines whether the contrast evaluation value has reached a peak. If not, steps S908 and S909 are repeated until it does. If a contrast peak is detected, the process proceeds to step S910.
  In step S910, the imaging control circuit 202 controls the lens driving circuit 206 so as to drive the lens to the lens position where the contrast evaluation value has a peak, and ends the AF operation.
  According to the present embodiment, when a focus detection pixel is included at the AF frame position, there is no switch to the high-speed frame rate, so shooting can be performed while the live view display continues; in this case AF is speeded up by using the sensor-integrated phase difference detection method. On the other hand, when no focus detection pixel is included at the AF frame position, the camera switches to the high-speed frame rate, so AF by the contrast detection method can be performed at higher speed.
  As described above, in the present embodiment, an imaging apparatus having an imaging element in which focus detection pixels are partially arranged can select the optimum AF method, and thereby shorten the AF processing time, by determining whether or not a focus detection pixel is included at the AF frame position.
  The operation of the camera in the live view mode according to the second embodiment of the present invention will be described below using the flowchart shown in FIG. 12. Since the basic configuration is the same as that of the first embodiment, only the differing parts are described below.
  In step S1201 of FIG. 12, the imaging control circuit 202 determines whether or not the focus detection pixel is included in the position of the AF frame. If not included, the process proceeds to step S1202. Since the operation in this case is the same as the operation in steps S902 to S906 of the first embodiment, description thereof is omitted.
  If it is determined in step S1201 that the focus detection pixel is included in the position of the AF frame, the process proceeds to step S1207. In step S1207, the imaging control circuit 202 determines whether the reliability of the defocus amount acquired from the focus detection pixel is higher than a certain threshold value.
  Here, various methods have been proposed for quantifying the reliability of the defocus amount; in this embodiment, the higher the degree of coincidence between the shapes of the image signals of the A image and the B image, the higher the reliability is judged to be. Let Ai and Bi (i = 1, 2, 3, ..., n) be the A-image and B-image signals obtained after AF by a line sensor composed of n (plural) photoelectric conversion elements in total. The reliability R is calculated as the accumulated difference between the two image signals, for example

  R = Σ_{i=1}^{n} |Ai − Bi|.

Since the value of R decreases as the degree of coincidence between the image signal of the A image and that of the B image increases, the actual reliability is higher as R becomes smaller.
  In step S1207, when the reliability R is smaller than the threshold value Rth (first case), the defocus amount and direction are judged to be highly reliable and the process advances to step S1208. On the other hand, when R is greater than Rth (second case), the defocus amount and direction are judged to be of low reliability and the process proceeds to step S1202.
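One common quantification consistent with the description above (R decreases as the A-image and B-image shapes coincide more closely) is the sum of absolute differences; assuming that form, the reliability test of step S1207 can be sketched as (illustrative Python; the threshold value is an assumption):

```python
def reliability(a_signal, b_signal):
    """R = sum of |Ai - Bi| over the line sensor outputs: smaller when
    the A and B image shapes coincide, i.e. smaller R means higher
    actual reliability."""
    return sum(abs(a - b) for a, b in zip(a_signal, b_signal))

def defocus_is_reliable(a_signal, b_signal, r_th):
    """First case (R < Rth): trust the defocus amount and direction."""
    return reliability(a_signal, b_signal) < r_th
```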
  When the process proceeds to step S1208, the operations in steps S1208 to S1211 are the same as the operations in steps S907 to S910 of the first embodiment, and thus the description thereof is omitted.
  As described above, in this embodiment, when a focus detection pixel is included at the AF frame position, AF by the sensor-integrated phase difference detection method is performed only when the reliability of the defocus amount is high (R is smaller than the threshold value). This prevents the lens from being driven in the wrong direction when the reliability is low (R is larger than the threshold value). Moreover, when the reliability is low, the AF processing time can still be shortened by switching to the high-speed frame rate and performing AF by the contrast detection method.
  The operation of the camera according to the third embodiment of the present invention will be described below. The basic operation is the same as in the first and second embodiments, but in this embodiment, it is determined whether to switch the frame rate according to the brightness of the image in the AF frame area.
  A flowchart of this embodiment is shown in FIG. 13. In step S1301, it is determined whether or not the brightness of the image in the AF frame area is higher than a threshold value. If it is, the process proceeds to step S1302 and the camera switches to the high-speed frame rate; if it is not, the frame rate is not switched.
  Switching to the high-speed frame rate shortens the maximum accumulation time of the image sensor, which results in an underexposed image when the luminance is low. The contrast evaluation value of the image then becomes low, making accurate focusing unlikely. In the present embodiment, a decrease in focusing accuracy is prevented by switching the frame rate only when the luminance is at or above a certain level.
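The luminance gate of this embodiment amounts to a simple threshold decision (illustrative Python; the 60/120 fps values are taken from the first embodiment's example, and the threshold is an assumption):

```python
def select_frame_rate(mean_luminance, luminance_th,
                      normal_fps=60, fast_fps=120):
    """Switch to the high-speed frame rate only when the AF-frame area
    is bright enough that the shortened maximum accumulation time will
    not underexpose the image and depress the contrast evaluation
    value."""
    return fast_fps if mean_luminance > luminance_th else normal_fps
```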
  When the frame rate switching control described above is applied to the first embodiment, if it is determined in step S901 of FIG. 9 that no focus detection pixel is included at the position of the AF frame (NO in step S901), the process proceeds to step S1301 of FIG. 13. If it is determined there that the brightness of the image in the AF frame area is higher than the threshold value, the processing of steps S902 to S906 is performed. Otherwise, the processing of steps S903 to S905 is performed without switching the frame rate in step S902; in this case, since no switch to the high-speed frame rate was made, the frame rate is not switched back in step S906.
  When the frame rate switching control is applied to the second embodiment, if it is determined in step S1201 of FIG. 12 that no focus detection pixel is included at the position of the AF frame (NO in step S1201), the process proceeds to step S1301 of FIG. 13. If it is determined there that the brightness of the image in the AF frame area is higher than the threshold value, the processing of steps S1202 to S1206 is performed. Otherwise, the processing of steps S1203 to S1205 is performed without switching the frame rate in step S1202; in this case, since no switch to the high-speed frame rate was made, the frame rate is not switched back in step S1206.
  The operation of the camera according to the fourth embodiment of the present invention will be described below. The basic operation is the same as in the first and second embodiments, but in this embodiment, it is determined whether or not to switch the frame rate according to the interchangeable lens mounted.
  A flowchart of this embodiment is shown in FIG. 14. In step S1401, it is determined, based on information acquired from the interchangeable lens 102, whether the mounted interchangeable lens can be driven at high speed. If it can, the process proceeds to step S1402 and the frame rate is switched; if it cannot, the frame rate is not switched.
  While the high-speed frame rate is in effect, the through image display blacks out or only a partial area is updated. With a lens that can be driven at high speed, the time until focusing is short, so this period without full display is also short. With a lens that cannot be driven at high speed, however, the time until focusing remains long even at the high frame rate; the period during which no image is displayed becomes long, and the user cannot follow changes in the subject during the AF operation. In this embodiment, therefore, when the attached lens cannot be driven at high speed, the frame rate is not switched and the through image display continues to be updated.
  When the frame rate switching control described above is applied to the first embodiment, if it is determined in step S901 of FIG. 9 that no focus detection pixel is included at the position of the AF frame (NO in step S901), the process proceeds to step S1401 of FIG. 14. If it is determined there that the lens can be driven at high speed, the processing of steps S902 to S906 is performed. Otherwise, the processing of steps S903 to S905 is performed without switching the frame rate in step S902; in this case, since no switch to the high-speed frame rate was made, the frame rate is not switched back in step S906.
  When the frame rate switching control is applied to the second embodiment, if it is determined in step S1201 of FIG. 12 that no focus detection pixel is included at the position of the AF frame (NO in step S1201), the process proceeds to step S1401 of FIG. 14. If it is determined there that the lens can be driven at high speed, the processing of steps S1202 to S1206 is performed. Otherwise, the processing of steps S1203 to S1205 is performed without switching the frame rate in step S1202; in this case, since no switch to the high-speed frame rate was made, the frame rate is not switched back in step S1206.
  While preferred embodiments of the present invention have been described above, the invention is not limited to these embodiments, and various modifications and changes are possible within the scope of the gist of the invention.
DESCRIPTION OF SYMBOLS 101 Camera main body, 102 Interchangeable lens, 105 AF sensor, 108 Image pickup element, 117 Display element, 119 Imaging optical system, 202 Imaging control circuit, 208 Signal processing circuit, SA Focus detection pixel, SB Focus detection pixel

Claims (6)

  1. An image pickup apparatus in which an interchangeable lens provided with an image pickup optical system is detachable,
    An imaging unit including an imaging pixel that photoelectrically converts light from the imaging optical system to generate a subject image, and a focus detection pixel that receives light passing through a partial region of the exit pupil of the imaging optical system;
    First focus detection means for performing focus detection of the imaging optical system based on a signal from the focus detection pixel;
    A second focus detection unit that calculates a focus signal based on a high-frequency component of a signal from the imaging unit, and performs focus detection of the imaging optical system based on the focus signal;
    control means for switching between a first mode for reading out a first area of the imaging means and a second mode for reading out an area narrower than the first area at a higher speed than in the first mode, wherein:
    When the focus detection pixel is not included in the focus detection area set in the imaging screen, the control unit sets the second mode and performs focus detection by the second focus detection unit,
    when the focus detection pixel is included in the focus detection area, the control means sets the first mode, performs focus detection by the first focus detection means, and, based on the focus detection result, performs focus detection by the second focus detection means.
  2.   The imaging apparatus according to claim 1, wherein, in a second case where the reliability of the focus detection result by the first focus detection means is lower than in a first case, the control means sets the second mode and performs focus detection by the second focus detection means, and in the first case, the control means sets the first mode, performs focus detection by the first focus detection means, and performs focus detection by the second focus detection means based on the focus detection result.
  3.   The imaging apparatus according to claim 1, wherein the control unit controls switching between the first mode and the second mode based on luminance of image data corresponding to the focus detection area. .
  4.   The imaging apparatus according to claim 1 or 2, wherein the control means controls switching between the first mode and the second mode based on information on whether or not the mounted interchangeable lens is a lens that can be driven at high speed.
  5.   The image pickup apparatus according to any one of claims 1 to 4, further comprising display means for displaying image data based on a signal from the imaging means, wherein, when shooting is performed in a state where image data is displayed on the display means, the control means sets the second mode.
  6. A method for controlling an imaging apparatus to which an interchangeable lens including an imaging optical system is detachably attached, the imaging apparatus having imaging means including an imaging pixel that photoelectrically converts light from the imaging optical system to generate a subject image and a focus detection pixel that receives light passing through a partial region of the exit pupil of the imaging optical system, the method comprising:
    A first focus detection step of performing focus detection of the imaging optical system based on a signal from the focus detection pixel;
    A second focus detection step of calculating a focus signal based on a high-frequency component of a signal from the imaging means, and performing focus detection of the imaging optical system based on the focus signal;
    A control step for switching between a first mode for reading out the first area of the imaging means and a second mode for reading out the area narrower than the first area at a higher speed than in the first mode. And
    When the focus detection pixel is not included in the focus detection area set in the imaging screen, the control step sets the second mode, and the focus detection by the second focus detection step is performed.
    when the focus detection pixel is included in the focus detection area, the control step sets the first mode, focus detection is performed by the first focus detection step, and, based on the focus detection result, focus detection is performed by the second focus detection step.
JP2011256745A 2011-11-24 2011-11-24 Imaging device, and control method therefor Pending JP2013113857A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011256745A JP2013113857A (en) 2011-11-24 2011-11-24 Imaging device, and control method therefor


Publications (1)

Publication Number Publication Date
JP2013113857A true JP2013113857A (en) 2013-06-10

Family

ID=48709491

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011256745A Pending JP2013113857A (en) 2011-11-24 2011-11-24 Imaging device, and control method therefor

Country Status (1)

Country Link
JP (1) JP2013113857A (en)


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015011216A (en) * 2013-06-28 2015-01-19 キヤノン株式会社 Imaging device, and control method and control program of the same
JP2015087558A (en) * 2013-10-31 2015-05-07 キヤノン株式会社 Imaging apparatus, imaging system, method of controlling imaging apparatus, program, and recording medium
JP2015108778A (en) * 2013-12-05 2015-06-11 キヤノン株式会社 Image capturing device and control method therefor
JP2015114416A (en) * 2013-12-10 2015-06-22 キヤノン株式会社 Imaging device, imaging device control method, and control program
WO2016103745A1 (en) * 2014-12-25 2016-06-30 オリンパス株式会社 Imaging element, focal point detecting device, and focal point detecting method
CN107113382A (en) * 2014-12-25 2017-08-29 奥林巴斯株式会社 Photographing element, focus detection device and focus detecting method
US10491803B2 (en) 2014-12-25 2019-11-26 Olympus Corporation Imaging element, focus detection apparatus, and focus detection method
