WO2018128098A1 - Imaging device and focus adjusting method - Google Patents

Imaging device and focus adjusting method

Info

Publication number
WO2018128098A1
WO2018128098A1 (application PCT/JP2017/046203)
Authority
WO
WIPO (PCT)
Prior art keywords
focus
subject
unit
distance
focus lens
Prior art date
Application number
PCT/JP2017/046203
Other languages
French (fr)
Japanese (ja)
Inventor
円嘉 堀内
Original Assignee
オリンパス株式会社
Priority date
Filing date
Publication date
Application filed by オリンパス株式会社 filed Critical オリンパス株式会社
Publication of WO2018128098A1 publication Critical patent/WO2018128098A1/en

Links

Images

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G02B 7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32 Means for focusing
    • G03B 13/34 Power focusing
    • G03B 13/36 Autofocus systems
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B 7/08 Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B 7/091 Digital circuits
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B 7/28 Circuitry to measure or to take account of the object contrast
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules

Definitions

  • The present invention relates to an imaging device having a focus adjustment function, and in particular to an imaging apparatus and a focus adjustment method capable of performing focus adjustment during an exposure operation for image recording.
  • Imaging apparatuses equipped with an automatic focus adjustment device that automatically adjusts the position of the focus lens are commercially available.
  • A conventional imaging apparatus cannot detect the focus state of the focus lens while performing an exposure operation for image recording. That is, when the subject distance is short, the subject distance fluctuates due to camera shake while the subject is being photographed, yet the resulting out-of-focus condition (a shift away from the in-focus state) that occurs during still image exposure cannot be corrected.
  • To address this, a camera with an automatic focusing function has been proposed that carries a displacement sensor, such as an acceleration sensor, which detects the amount of camera displacement occurring during the exposure operation, and that sequentially corrects the focus shift of the focus lens based on the detected displacement amount (see Patent Document 1).
  • However, a displacement sensor such as an acceleration sensor gradually accumulates errors in the detected displacement amount, such as an offset deviation. This displacement error adversely affects the accuracy of the focus adjustment.
  • The present invention has been made in view of these circumstances in the background art. It is an object of the present invention to provide an imaging apparatus and a focus adjustment method that can perform focus adjustment with high accuracy during shooting.
  • An imaging apparatus according to the present invention includes: an optical lens including a focus lens that changes the focus state of a subject to be imaged; a focus driving unit that moves the focus lens in the optical axis direction; an image sensor that exposes the subject image formed by the optical lens and converts it into a video signal; a distance measuring unit that detects the subject distance between the imaging device and the subject; and a focus control unit that controls the movement amount of the focus lens via the focus driving unit based on the subject distance detected by the distance measuring unit and the focus position of the focus lens. While the image sensor is exposing the subject image in order to record the video signal, the focus control unit performs control to move the focus lens via the focus driving unit in a direction that makes the subject distance and the focus position coincide, based on the amount of deviation between the subject distance detected by the distance measuring unit and the focus position of the focus lens.
  • A focus adjustment method according to the present invention comprises: while an image sensor exposes a subject image formed by an optical lens including a focus lens in order to record a video signal, detecting the subject distance between the imaging device and the subject; calculating, based on the amount of deviation between the detected subject distance and the focus position of the focus lens, a movement amount for the focus lens in a direction that makes the subject distance and the focus position coincide; and moving the focus lens based on the calculated movement amount.
  • According to the present invention, it is possible to provide an imaging apparatus and a focus adjustment method capable of performing focus adjustment with high accuracy during shooting.
  • FIG. 3 is a block diagram illustrating functions of a focus correction amount calculation unit 12 in the first embodiment of the present invention.
  • FIG. 4 is a flowchart illustrating the operation for correcting out-of-focus blur during still image exposure of the camera according to the first embodiment of the present invention, and FIG. 5 is a timing chart showing that operation.
  • FIG. 8 is a block diagram illustrating the function of a focus correction amount calculation unit 18 in the second embodiment of the present invention, and FIG. 9 is a timing chart showing the operation of the second embodiment.
  • A further block diagram illustrates the functions of a focus correction amount calculation unit 20 in the third embodiment of the present invention, together with a timing chart showing the operation of the third embodiment.
  • This camera includes an imaging unit that converts a subject image into image data, and a display unit that is disposed on the back of the main body.
  • This display unit displays a subject image in a live view based on the image data converted by the imaging unit.
  • the user observes the live view display and determines the composition of the image for shooting and the time point (shutter timing) for starting shooting.
  • still image data acquired by photographing is recorded on a recording medium.
  • the image data recorded on the recording medium can be reproduced and displayed on the display unit when the reproduction mode is selected.
  • The camera uses an AF sensor 5; in a modification of the present embodiment, an image sensor 7 in which ranging pixels are provided in a part of the pixels is used instead.
  • the defocus amount of the focus lens is calculated by so-called phase difference AF at predetermined time intervals during exposure for acquiring still image data.
  • the focus lens is subjected to focus adjustment based on the calculation result so as to maintain the in-focus state.
  • FIG. 1 is a block diagram mainly showing an electrical configuration in the present embodiment.
  • This camera includes a camera body 1, an optical system 2, an optical system drive unit 3, a half mirror 4, an AF sensor 5, a shutter 6, an image sensor 7, and a system control CPU 10.
  • The optical system 2 includes a focus lens and forms a subject image on the image sensor 7.
  • the focus lens can move in the optical axis direction.
  • the optical system drive unit 3 moves the focus lens to the in-focus position.
  • the optical system 2 functions as an optical lens including a focus lens that can change the focus state of a subject to be imaged.
  • Half mirror 4 is a translucent mirror such as a pellicle mirror.
  • the half mirror 4 is on the optical axis of the optical system 2 and is fixed by being inclined 45 degrees with respect to the optical axis.
  • the half mirror 4 transmits a part of the subject light beam transmitted through the optical system 2 and reflects the remaining light beam to the AF sensor 5.
  • the half mirror 4 is fixed.
  • the half mirror 4 is not limited to a fixed one.
  • the half mirror 4 may be configured to move so as to move on the optical axis of the optical system 2 when the release button is fully pressed.
  • The half mirror 4 functions as a spectroscopic unit (optical path splitter) that causes part of the subject light incident through the optical system to reach the imaging surface of the image sensor, splits another part of the subject light in a direction different from the direction toward the imaging surface, and guides that part to the distance measuring unit (ranging sensor).
  • the distance measuring unit detects the subject distance by a distance measuring sensor different from the image sensor.
  • the AF sensor 5 is a detection element that receives a subject light beam split from the half mirror 4 and outputs an electrical signal for generating distance measurement data.
  • the AF sensor 5 may employ a phase difference detection method having a polarizing element and two detection units.
  • In this phase difference detection method, the light split off by the half mirror 4 is further separated by a polarizing element into two light beams having different polarization directions.
  • The defocus amount (the amount and direction of the focus deviation) is then detected from the positional relationship at which the two separated light beams form images.
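The phase difference principle described above, in which the defocus is inferred from the lateral shift between the two separated images, can be sketched as follows. This is a hedged illustration, not the patent's implementation: the SAD correlation search and the `k_conversion` factor are hypothetical stand-ins for whatever sensor-specific processing the AF sensor 5 actually performs.

```python
import numpy as np

def estimate_defocus(signal_a, signal_b, k_conversion=1.0, max_shift=20):
    """Estimate defocus from two phase-difference line signals.

    signal_a / signal_b: 1-D intensity profiles of the two separated
    light beams; their lateral shift is proportional to the defocus.
    k_conversion: hypothetical sensor-dependent factor converting
    pixel shift into a defocus distance.
    """
    best_shift, best_score = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        # overlap the two signals at trial shift s
        a = signal_a[max(0, s):len(signal_a) + min(0, s)]
        b = signal_b[max(0, -s):len(signal_b) + min(0, -s)]
        score = np.sum(np.abs(a - b))  # sum of absolute differences
        if score < best_score:
            best_score, best_shift = score, s
    # signed result: the sign indicates front focus vs. back focus
    return k_conversion * best_shift
```

The sign of the returned value corresponds to the defocus direction, which is why the text above speaks of both the amount and the direction of the focus deviation.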
  • the shutter 6 is disposed between the optical system 2 and the image sensor 7 and on the optical axis of the optical system 2.
  • the shutter 6 performs an opening / closing operation based on an instruction from the shutter control unit 13.
  • When the shutter 6 is open, the subject image is formed on the image sensor 7, which is thus in an exposed state.
  • When the shutter 6 is closed, no subject image reaches the image sensor 7, which is thus in a light-shielded state. That is, the exposure time of the image sensor 7 at the time of still image shooting is controlled by controlling the time during which the shutter 6 is open.
  • the imaging element 7 has a plurality of pixels arranged on the imaging surface, and each pixel converts a subject image formed on the imaging surface into an electrical signal (pixel signal). A pixel signal is read out from each of these pixels by the imaging pixel control unit 14. Further, video data is formed from these pixel signals.
  • the image sensor 7 functions as an image sensor that exposes a subject image formed by an optical lens and converts it into a video signal.
  • The system control CPU 10 is a controller having a CPU (Central Processing Unit), its peripheral circuits, and volatile and nonvolatile memory; it controls each part in the camera body 1 according to a program stored in the nonvolatile memory, thereby controlling the entire camera.
  • The system control CPU 10 includes a distance measurement data detection unit 11, a focus correction amount calculation unit 12, a shutter control unit 13, an imaging pixel control unit 14, and a focus lens drive amount calculation unit 15. These units are realized by the peripheral circuits of the system control CPU 10 and by execution of a control program.
  • the system control CPU 10 has various functions. However, in this specification, the focus adjustment control during still image exposure will be mainly described, and description of other functions will be omitted.
  • The system control CPU 10 functions as a focus control unit that controls the amount of movement of the focus lens by the focus driving unit based on the subject distance detected by the distance measuring unit and the focus position of the focus lens (see, for example, S11 to S17 in FIG. 4, S35 to S43 in FIG. 10, S51 to S59 in FIG. 15A, and S61 to S71 in FIG. 15B).
  • When shooting a still image for recording, the focus control unit, while the image sensor is exposing the subject image, detects the amount of deviation between the subject distance detected by the distance measuring unit and the focus position of the focus lens, and based on that amount gives a drive signal to the focus drive unit in a direction that makes the subject distance and the focus position coincide, thereby moving the focus lens.
  • The shutter control unit 13 includes a shutter control circuit; it calculates the subject luminance based on the pixel signal from the image sensor 7, and calculates the shutter speed value that gives proper exposure (exposure amount) based on that subject luminance.
  • the shutter 6 is opened and closed based on the calculated shutter speed value (or the shutter speed value manually set by the user) to control the exposure time.
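The relationship between subject luminance and shutter speed described above can be illustrated with the APEX exposure relations. This is a general sketch of how such a shutter speed value could be derived, not the patent's specific circuit; the function name and parameters are illustrative.

```python
def shutter_time(bv, sv, av):
    """Return the shutter time in seconds from APEX values.

    bv: brightness value (from the measured subject luminance)
    sv: film/sensor speed value (e.g. Sv = 5 for ISO 100)
    av: aperture value (e.g. Av = 4 for f/4)
    """
    tv = bv + sv - av        # APEX relation: Av + Tv = Bv + Sv
    return 2.0 ** (-tv)      # Tv = log2(1 / t)  ->  t = 2 ** (-Tv)
```

For example, Bv = 5, Sv = 5 (ISO 100), Av = 4 (f/4) gives Tv = 6, i.e. a shutter time of 1/64 s; one step more brightness halves the time.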
  • the imaging pixel control unit 14 includes an imaging control circuit for reading out pixel signals from each imaging pixel provided in the imaging element 7.
  • the imaging pixel control unit 14 reads pixel signals for live view display until the release button is fully pressed. Further, when the release button is fully pressed and the opening / closing operation of the shutter 6 is completed, still image data for recording is read out.
  • The ranging data detection unit 11 may have a ranging data detection circuit, and detects the ranging data based on the electrical signal acquired from the AF sensor 5.
  • The focus correction amount calculation unit 12 may include a focus correction amount calculation circuit; based on the distance measurement data detected by the distance measurement data detection unit 11, it calculates the amount of deviation from the focus position of the focus lens (the defocus amount), that is, the focus correction amount in the optical axis direction.
  • The distance measurement data detection unit 11 functions as a distance measuring unit that detects the subject distance between the imaging apparatus and the subject. The distance measuring unit operates to detect the subject distance in parallel with the exposure operation while the image sensor is exposing the subject image (see, for example, FIGS. 5, 9, and 14).
  • the distance measuring unit detects the defocus amount of the optical system 2 as the subject distance.
  • the present invention is not limited to this.
  • the distance from the camera body 1 to the subject may be directly detected using the subject light flux that has passed through the optical system other than the optical system 2, and the subject distance may be used.
  • The focus lens drive amount calculation unit 15 may have a focus lens drive amount calculation circuit; it converts the focus correction amount calculated by the focus correction amount calculation unit 12 into focus lens drive units and calculates the drive position or drive amount.
  • the system control CPU 10 realizes the functions of the distance measurement data detection unit 11, the focus correction amount calculation unit 12, the shutter control unit 13, the imaging pixel control unit 14, and the focus lens drive amount calculation unit 15 by software using a program. Alternatively, it may be realized by a peripheral circuit, or may be realized by a combination of a program and a peripheral circuit.
  • the optical system drive unit 3 includes a drive actuator (for example, a stepping motor) and a drive circuit, and moves the focus lens of the optical system 2 in the optical axis direction.
  • the optical system drive unit 3 performs focus control of the focus lens of the optical system 2 based on the drive position or drive amount obtained from the focus lens drive amount calculation unit 15.
  • the optical system drive unit 3 functions as a focus drive unit that moves the focus lens in the optical axis direction.
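The unit conversion from a focus correction amount to a drive command for the stepping motor can be sketched as below. This is a hedged illustration of the role of the focus lens drive amount calculation unit 15: the `sensitivity` and `pulses_per_mm` constants are hypothetical and would in practice depend on the optical design and the drive actuator.

```python
def defocus_to_drive_pulses(defocus_mm, sensitivity=0.5, pulses_per_mm=100):
    """Convert an image-plane defocus amount into stepping-motor pulses.

    sensitivity: hypothetical ratio between image-plane defocus and the
    required focus lens displacement (depends on the optical design).
    pulses_per_mm: hypothetical drive resolution of the stepping motor.
    """
    lens_shift_mm = defocus_mm / sensitivity
    # the sign of the result gives the drive direction along the axis
    return round(lens_shift_mm * pulses_per_mm)
```

With the assumed constants, a defocus of 0.05 mm maps to a lens shift of 0.1 mm, i.e. 10 drive pulses toward the in-focus position.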
  • FIG. 2 is a block diagram showing a modification of the first embodiment. Since most of the blocks in FIG. 2 are the same as those in FIG. 1, differences will be mainly described.
  • The distance measuring unit (ranging sensor) in the modification of the first embodiment takes over the function of the AF sensor 5 described above.
  • Like the image sensor 7 shown in FIG. 1, the image sensor 7 in FIG. 2 has a plurality of pixels on the imaging surface. In FIG. 1, however, the image sensor 7 has only imaging pixels.
  • the image sensor 7 includes an image plane phase difference AF pixel (ranging pixel) in addition to the image pickup pixel.
  • The image sensor 7 in FIG. 2 therefore also has a function of converting received light into an electrical signal from which distance measurement data corresponding to the distance to the subject can be obtained. That is, the image sensor 7 includes, on the imaging plane, distance measuring pixels as well as imaging pixels (non-ranging pixels) that output image signals from the subject image.
  • the pixel signal from the imaging pixel is output to the imaging pixel control unit 14, while the electrical signal from the ranging pixel is output to the ranging pixel control unit 16. Accordingly, the subject image information is processed by the imaging pixel control unit 14, and the distance information to the subject is processed by the ranging pixel control unit 16.
  • The imaging pixels and the distance measuring pixels can start and end exposure independently of each other, and a pixel output circuit is formed that reads out the pixel signal of each type of pixel independently.
  • the imaging pixel control unit 14 and the ranging pixel control unit 16 can operate in parallel during exposure to read out the output of the imaging pixel and the output of the ranging pixel. That is, it is a configuration that can obtain the output of the ranging pixel during exposure.
  • The ranging pixel control unit 16 may include a readout control circuit that controls readout of the ranging pixels, and generates the ranging data up to the subject based on the electrical signals obtained from the ranging pixels of the image sensor 7. Since the distance measurement information can be acquired from the ranging pixels of the image sensor 7, neither the AF sensor 5 nor the half mirror 4 is required. For this reason, the light beam that has passed through the optical system 2 forms an image directly on the image sensor 7 without being split.
  • the distance measuring unit can independently read out pixel signals output from the distance measuring pixels while the non-range pixels are performing the exposure operation, detect the subject distance, and acquire the result.
  • the distance measuring unit includes a distance measuring pixel of the image sensor 7 and a distance measuring pixel control unit 16.
  • The difference between FIG. 1 and FIG. 2 is the method for acquiring the distance measurement data up to the subject.
  • The distance measurement data detection unit 11 and the ranging pixel control unit 16 of the system control CPU 10 output the same type of distance measurement data.
  • the half mirror 4 and the AF sensor 5 can be omitted by providing a distance measuring pixel in the image sensor 7.
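The parallel operation described above, where the imaging pixels keep accumulating charge while the ranging pixels are read out through their own independent circuit, can be illustrated with a deliberately simple toy model. The class and its attributes are hypothetical; a real sensor would of course implement this in hardware.

```python
class SensorSim:
    """Toy model of an image sensor whose ranging pixels can be read
    out during exposure without disturbing the imaging pixels."""

    def __init__(self):
        self.imaging_charge = 0.0    # charge accumulated by imaging pixels
        self.ranging_samples = []    # readouts taken from ranging pixels

    def expose_step(self, light, dt):
        # imaging pixels integrate incoming light over the exposure
        self.imaging_charge += light * dt

    def read_ranging(self, light):
        # ranging pixels have an independent readout circuit, so a
        # readout here does not interrupt the imaging-pixel exposure
        self.ranging_samples.append(light)
        return light
```

Running an exposure of ten steps while sampling the ranging pixels every second step leaves the imaging charge fully accumulated and yields five independent distance samples, mirroring the behavior attributed to the imaging pixel control unit 14 and the ranging pixel control unit 16.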
  • FIG. 3 is a block diagram showing details of the function of the focus correction amount calculation unit 12, and is composed of distance measurement data 121 and a defocus amount calculation unit 122.
  • Each block may be realized by software by the CPU of the system control CPU 10 according to a program, may be realized by a peripheral circuit, or may be realized by a combination of a program and a peripheral circuit.
  • the distance measurement data 121 is output from the distance measurement data detection unit 11 that processes an electrical signal output from the AF sensor 5.
  • the distance measurement data 121 corresponds to data output from the distance measurement pixel control unit 16 that processes an electrical signal from the distance measurement pixel of the image sensor 7.
  • the defocus amount calculation unit 122 calculates the defocus amount based on the distance measurement data 121 during still image exposure.
  • the focus lens driving amount calculation unit 15 performs unit conversion for driving the focus lens based on the calculated defocus amount, and calculates the driving position or driving amount of the focus lens.
  • the lens-side optical system drive unit 3 can use the calculated drive position / drive amount to drive the focus lens of the optical system 2 and perform focus correction in the optical axis direction during still image exposure.
  • This control flow is realized by the system control CPU 10 controlling each part in the camera body 1 according to a program stored in the nonvolatile memory.
  • If the result of determination in step S1 is that 1R (1st release; the release button half-pressed) has been depressed, an autofocus operation (AF operation) is started, and distance measurement data is first acquired (S3).
  • the ranging data detection unit 11 or the ranging pixel control unit 16 acquires the ranging data. While the 1R depression is detected, the distance measurement data is continuously acquired at regular intervals.
  • the operation of continuously acquiring the distance measurement data at regular intervals while detecting the 1R depression is an example of the operation. The operation procedure is not limited to this, and during live view shooting, the distance measurement data may be continuously acquired at regular intervals regardless of whether or not the 1R is depressed.
  • the defocus amount is calculated (S5).
  • the focus correction amount calculation unit 12 calculates the defocus amount using the distance measurement data acquired in step S3.
  • the focus lens is driven (S7).
  • the focus lens drive amount calculation unit 15 calculates the drive position or drive amount of the focus lens, and the optical system drive unit 3 moves the focus lens to the focus position.
  • When the focus lens has been driven, it is next determined whether or not 2R (2nd release) has been depressed (S9). When the user intends to shoot a still image, the user fully presses the release button; this step determines whether or not the release button has been fully pressed. If there is no 2R depression, the process returns to step S3, the above operation is repeated, and the focus lens is kept at the in-focus position.
  • If the result of determination in step S9 is that 2R has been pressed, the operation moves to still image shooting. That is, exposure control is performed with the aperture value and shutter speed value for proper exposure based on the subject brightness measured before 2R was pressed; after the exposure time has elapsed, the pixel signals of the imaging pixels are read out from the imaging device 7, image processing for still image recording is performed, and the image data is recorded on the recording medium.
  • In parallel with the still image exposure, a focus adjustment operation of the focus lens (a focus blur correction operation) is performed.
  • the imaging pixels of the imaging device 7 photoelectrically convert the subject luminous flux and continue the charge accumulation.
  • The AF sensor 5 or the ranging pixels of the image sensor 7 output ranging data at predetermined time intervals.
  • ranging data is acquired (S11).
  • As in step S3, distance measurement data is continuously acquired at regular intervals.
  • the defocus amount is calculated (S13).
  • As in step S5, the focus correction amount calculation unit 12 periodically calculates the defocus amount.
  • Next, focus lens driving is performed (S15).
  • As in step S7, the focus lens is periodically moved to the in-focus position by the focus lens drive amount calculation unit 15 and the optical system drive unit 3.
  • It is next determined whether or not the still image exposure has been completed (S17).
  • When the exposure time has elapsed and the shutter 6 closes, the still image exposure is completed. If the result of this determination is that the still image exposure has not been completed, the process returns to step S11, and steps S11 to S15 are repeated at predetermined time intervals to maintain the focus lens at the in-focus position.
  • steps S3 to S7 and steps S11 to S15 are the same process.
  • distance measurement data can be acquired using the distance measurement pixels of the AF sensor 5 or the image sensor 7 even during still image exposure. For this reason, it is possible to perform out-of-focus correction even during still image exposure.
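The repetition of steps S11 to S15 until step S17 detects the end of exposure can be sketched as the following loop. The callables are hypothetical hooks standing in for the camera units described above (ranging data acquisition, the focus correction amount calculation unit, and the optical system drive unit); they are not part of the patent.

```python
import time

def focus_correction_during_exposure(get_ranging_data, compute_defocus,
                                     drive_lens, exposure_done,
                                     interval_s=0.01):
    """While the still image is being exposed, periodically measure,
    compute the defocus, and drive the focus lens."""
    while not exposure_done():             # S17: exposure completed?
        data = get_ranging_data()          # S11: acquire ranging data
        defocus = compute_defocus(data)    # S13: calculate defocus amount
        drive_lens(defocus)                # S15: drive the focus lens
        time.sleep(interval_s)             # repeat at predetermined intervals
```

Because the ranging data keeps arriving during the exposure, the lens is corrected repeatedly rather than only once before the shutter opens, which is the core of the out-of-focus correction described here.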
  • In the timing chart, the horizontal axis indicates the passage of time, and the vertical axis shows the operation of the camera, the output of the image sensor 7 or the AF sensor 5, the processing performed by the system control CPU 10, and the operation of the focus lens, respectively.
  • the time point “1R” indicates a time point when the release button is half-pressed, and corresponds to a time point when it is determined that the 1R button is pressed in step S1 of FIG.
  • the system control CPU 10 periodically calculates the defocus amount by using the distance measurement data from the AF sensor 5 or the imaging sensor 7. Then, the focus lens is periodically driven.
  • the downward arrow from the distance measurement data to the defocus amount calculation shown in FIG. 5 indicates the time point at which the defocus amount is calculated periodically using the distance measurement data.
  • “2R” indicates a time point when the release button is fully pressed, and corresponds to a time point when it is determined that 2R is pressed in step S9 of FIG. If 2R depression is determined, an exposure operation is performed for still image shooting, and distance measurement data is periodically acquired in parallel with this. Then, the focus lens is driven using the defocus amount calculated from the distance measurement data.
  • The downward arrows from the distance measurement data to the defocus amount calculation shown in FIG. 5 indicate the timing at which the defocus amount is periodically calculated using the distance measurement data even after the start of still image shooting. While the image sensor exposes the subject image, the distance measurement data is detected independently and in parallel with this exposure operation.
  • As described above, the camera includes a defocus detection unit capable of detecting the defocus amount during exposure (for example, the AF sensor 5 and the distance measurement data detection unit 11, or the ranging pixels of the image sensor 7 and the ranging pixel control unit 16), and a focus control unit (for example, the focus lens drive amount calculation unit 15 and the optical system drive unit 3) that performs defocus detection during exposure and corrects the focus lens position at predetermined time intervals based on the detection result.
  • The above-described defocus detection unit includes an image separation unit (the half mirror 4) and a focus sensor, or uses the ranging pixels (phase difference sensors) of the image sensor 7.
  • In the latter case, an imaging element having both imaging pixels and ranging pixels (using pupil eccentricity, pupil division, or the like) on the imaging surface is configured.
  • The pixel readout circuit of this imaging element has a readout circuit for the imaging pixels and a separate, independent readout circuit for the distance measuring pixels.
  • A pixel output is therefore obtained independently under each readout control. This makes it possible to detect changes in the defocus state (the defocus amount and defocus direction) over time during still image shooting exposure.
  • As a result, the positional relationship between the subject and the camera established before the exposure is maintained, so that the focus shift caused by shake during the exposure is corrected and the focus accuracy is improved. Furthermore, since not only camera shake in the optical axis direction but also movement of the subject can be followed, out-of-focus blur in still image shooting is further reduced.
  • In the second embodiment, the detection result of the acceleration sensor is also used for the focus adjustment of the focus lens.
  • FIG. 6 is a block diagram mainly showing an electrical configuration of the second embodiment. Compared to the block diagram in the first embodiment shown in FIG. 1, an acceleration sensor 8 and an acceleration data detection unit 17 are added in the second embodiment. Furthermore, the difference is that the focus correction amount calculation unit 12 in the first embodiment is replaced with a focus correction amount calculation unit 18.
  • the acceleration sensor 8 detects the translational motion applied to the camera as acceleration and outputs it to the system control CPU 10.
  • the acceleration information includes X-axis information in the horizontal direction of the camera, Y-axis information in the vertical direction, and Z-axis information in the optical axis direction. In the present embodiment, only the Z axis (optical axis) is described, and thus the description of the X axis and the Y axis is omitted.
  • the acceleration data detection unit 17 may have an acceleration data detection circuit, and outputs acceleration information in the optical axis direction based on an electrical signal obtained from the acceleration sensor 8.
  • When the acceleration information is integrated once over time, velocity information is obtained; when that velocity information is integrated once more over time, distance information is obtained. That is, the change over time in the distance between the camera and the subject can be detected by integrating the acceleration information twice.
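The double integration described above can be sketched with a simple discrete accumulation. This is an illustrative numerical scheme, not the patent's circuit; the initial velocity `v0` is an integration constant that, as explained in this embodiment, must come from somewhere else (here, the ranging data at the start of exposure).

```python
def displacement_from_acceleration(accel_samples, dt, v0=0.0):
    """Integrate optical-axis acceleration samples twice to obtain the
    change in camera-subject distance.

    accel_samples: acceleration readings at a fixed interval dt.
    v0: initial velocity at the start of the window (in the second
    embodiment, derived from the ranging data at exposure start).
    """
    v = v0
    x = 0.0
    for a in accel_samples:
        v += a * dt   # first integration: acceleration -> velocity
        x += v * dt   # second integration: velocity -> displacement
    return x
```

Note that any constant offset error in the acceleration samples is integrated twice and therefore grows quadratically with time, which is exactly the accumulation problem the background section attributes to displacement sensors.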
  • The function of the acceleration data detection unit 17 is realized by the system control CPU 10 mainly by a program.
  • Alternatively, it may be realized by providing an acceleration data detection circuit as a peripheral circuit of the system control CPU 10 (controller).
  • the acceleration data detection unit 17 functions as a movement amount detection unit that detects the movement amount of the focus lens in the optical axis direction.
  • The system control CPU 10 functions as a focus control unit. The focus control unit calculates and outputs the movement amount for the focus driving unit based on the subject distance detected by the distance measurement unit, the movement amount in the optical axis direction detected by the movement amount detection unit, and the focus position of the focus lens (see, for example, S35 to S43 in FIG. 10).
  • the movement amount detection unit described above includes the acceleration sensor 8, which detects the movement acceleration in the optical axis direction, and an acceleration correction unit, which calculates a correction value from the acceleration output of the acceleration sensor based on the subject distance detected by the distance measurement unit and applies the correction value by adding it to or subtracting it from the acceleration output (see, for example, S27 and S37 in FIG. 10).
  • the focus correction amount calculation unit 18 calculates the focus correction amount in the optical axis direction based on the acceleration information in the optical axis direction detected by the acceleration data detection unit 17. As described above, the focus correction amount is obtained by integrating the acceleration information twice over time. Specifically, the distance measurement data is used to calculate the absolute speed at the start of exposure and the absolute acceleration (for reference point correction), and these are used in calculating the focus correction amount. The calculation of the focus correction amount will be described later with reference to FIGS. 8 and 9.
  • FIG. 7 is a block diagram showing a modification of the second embodiment. Many of the blocks shown in FIG. 7 are the same as those in the block diagram of the modification of the first embodiment. The only differences are that the acceleration sensor 8 and the acceleration data detection unit 17 are added and that the focus correction amount calculation unit 12 is replaced with the focus correction amount calculation unit 18. Since these differences are the same as in the configuration shown in FIG. 6, detailed description thereof is omitted.
  • FIG. 8 is a block diagram showing details of the function of the focus correction amount calculation unit 18, and includes acceleration data 181, distance measurement data 182, speed conversion unit 183, acceleration conversion unit 184, acceleration data (after reference point correction) 185, And a defocus amount calculation unit 186.
  • each block is realized by software, with the system control CPU 10 operating according to a program; however, the realization is not limited to this and may be by a peripheral circuit, or by a combination of a program and peripheral circuits.
  • the acceleration data 181 is acceleration information in the optical axis direction of the optical system 2 and is obtained from the acceleration sensor 8 shown in FIGS.
  • the distance measurement data 182 corresponds to data output from the distance measurement data detection unit 11, which processes the electrical signal output from the AF sensor 5.
  • in the modification, the distance measurement data 182 corresponds to data output from the distance measurement pixel control unit 16, which processes the electrical signal from the distance measuring pixels of the image sensor 7.
  • the speed conversion unit 183 receives the distance measurement data 182 and converts it into speed data by differentiating it with respect to time. For example, the difference between the distance measurement data at a certain time and the distance measurement data at a time a predetermined interval later, divided by that interval, gives the speed data.
  • the acceleration conversion unit 184 converts the speed data produced by the speed conversion unit 183 into acceleration data by differentiating it with respect to time. For example, the difference between the speed data at a certain time and the speed data at a subsequent time, divided by the predetermined interval, gives the acceleration data before reference point correction.
  • the acceleration data calculation unit (after reference point correction) 185 calculates the acceleration data after the reference point correction using two sets of acceleration data: the acceleration data 181 generated from the acceleration sensor 8, and the acceleration data generated from the ranging pixels of the AF sensor 5 or the image sensor 7 and converted by the acceleration conversion unit 184.
  • the defocus amount calculation unit 186 calculates the defocus amount using the acceleration data after the reference point correction calculated by the acceleration data calculation unit 185 and the absolute speed immediately before exposure converted by the speed conversion unit 183.
  • the distance measurement data acquired at a certain distance measurement interval T is defined as X1, X2, and X3 (see A in FIG. 9). From these values, speed data V1 and V2 are calculated by (Expression 1) and (Expression 2).
  • V1 = (X2 - X1) / T (Equation 1)
  • V2 = (X3 - X2) / T (Equation 2)
  • V2 is calculated as the speed data immediately before the still image exposure.
  • the speed data V2 is used as an absolute speed at the start of exposure.
  • the acceleration data A_ave is calculated from (Equation 3) using the speed data V1 and V2.
  • A_ave = (V2 - V1) / T (Equation 3)
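A minimal numeric sketch of (Equation 1) to (Equation 3); the ranging interval and subject distances are assumed values for illustration:

```python
T = 0.05                           # ranging interval [s] (assumed)
X1, X2, X3 = 0.300, 0.302, 0.305   # subject distances [m] (assumed)

V1 = (X2 - X1) / T                 # (Equation 1)
V2 = (X3 - X2) / T                 # (Equation 2): absolute speed just before exposure
A_ave = (V2 - V1) / T              # (Equation 3): acceleration from ranging data
```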
  • an acceleration average value a_ave before the still image exposure is obtained using the acceleration data 181 (see B in FIG. 9).
  • the acceleration average value a_ave is the average of the acceleration data a1 to a13 acquired during the data acquisition period of X1 to X3 used for the distance measurement data (Equation 4). In the example shown in FIG. 9, 13 pieces of acceleration data, up to a13, are used.
  • a_ave = Σ(an) / n (Equation 4)
  • the reference point error can be calculated from (Equation 5).
  • this reference point error is the difference between the acceleration calculated from the distance measurement data and the acceleration calculated from the acceleration data. Ideally the two coincide; however, since an offset is superimposed on the output of the acceleration sensor 8, a difference arises between the acceleration values. In calculating the defocus amount based on the acceleration data, this offset deviation is corrected using the reference point error.
  • Reference point error = a_ave - A_ave (Equation 5)
  • the defocus amount calculation unit 186 can calculate a highly accurate defocus amount from (Equation 7), using the acceleration data after the reference point correction and the absolute velocity V2 immediately before the exposure obtained in (Equation 2).
  • Defocus amount = ∫(V2 + ∫an' dt) dt (Equation 7)
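Putting (Equation 4) through (Equation 7) together, the calculation can be sketched as below. The explicit form of Equation 6 (subtracting the reference point error from each raw sample) is an assumption consistent with the offset-correction description above, discrete sums stand in for the integrals, and all numeric values are illustrative:

```python
T = 0.05        # ranging interval [s] (assumed)
dt = 0.01       # acceleration sampling interval [s] (assumed)

# Speeds obtained from the ranging data by (Equation 1) and (Equation 2)
V1, V2 = 0.04, 0.06
A_ave = (V2 - V1) / T              # (Equation 3)

# Raw acceleration sensor samples over the same period; a constant offset
# of +0.1 m/s^2 is assumed to be superimposed on a true 0.4 m/s^2
raw = [0.5] * 11
a_ave = sum(raw) / len(raw)        # (Equation 4)

ref_error = a_ave - A_ave          # (Equation 5): estimated sensor offset

# During exposure, each new sample is corrected and integrated twice
exposure_samples = [0.5] * 20
v = V2                             # absolute speed at the start of exposure
defocus = 0.0
for a in exposure_samples:
    a_corr = a - ref_error         # assumed form of (Equation 6)
    v += a_corr * dt               # inner integral of (Equation 7)
    defocus += v * dt              # outer integral of (Equation 7)
```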
  • the focus lens drive amount calculation unit 15 converts the calculated defocus amount into the units needed for driving the focus lens and calculates the drive position or drive amount.
  • this control flow is realized by the system control CPU 10 controlling each part in the camera body 1 according to a program stored in the nonvolatile memory.
  • in step S21, it is determined whether or not 1R is depressed. If the result of determination in step S21 is that 1R has been depressed, an autofocus operation (AF operation) is started, and acceleration data (an) is first acquired (S23).
  • the acceleration data detection unit 17 acquires acceleration data (an) from the acceleration sensor 8 at regular intervals (every distance measurement interval T).
  • ranging data (Xn) is acquired (S25).
  • the ranging data detection unit 11 or the ranging pixel control unit 16 acquires the ranging data.
  • distance measurement data is continuously acquired at regular intervals.
  • as shown in A and B of FIG. 9, the timing for acquiring the distance measurement data and the acceleration data differs; each is acquired according to its own preset period.
  • a reference point error is calculated (S27).
  • the reference point error is calculated using Equation 5 as described with reference to FIG.
  • the reference point error is sequentially updated using the most recently acquired data.
  • the defocus amount is then calculated (S29). Here, the focus correction amount calculation unit 18 calculates the defocus amount using the distance measurement data (Xn) acquired in step S25.
  • the focus lens is driven (S31).
  • the focus lens drive amount calculation unit 15 calculates the drive position or drive amount of the focus lens, and the optical system drive unit 3 moves the focus lens to the focus position.
  • step S33 it is next determined whether or not the 2R has been depressed (S33), as in step S9.
  • when the user intends to shoot a still image, the user fully presses the release button; in this step, it is determined whether or not the release button has been fully pressed. If the result of determination in step S33 is that there has been no 2R depression, processing returns to step S23, and the above operation is repeated to maintain the focus lens in the in-focus position.
  • the operation of continuously acquiring acceleration data and distance measurement data at regular intervals is an example. Without being limited to this operation procedure, during live view shooting, it may be an operation of continuously acquiring acceleration data and distance measurement data at regular intervals regardless of whether or not the 1R button is pressed.
  • step S33 If the result of determination in step S33 is that there has been 2R depression, the operation proceeds to a still image shooting operation, as in the first embodiment.
  • a focus lens focus adjustment operation (focus blur correction operation) is performed in steps S35 to S41.
  • the latest reference point error (from immediately before the still image exposure) calculated in S27 and the absolute velocity V2 (calculated by Equation 2) are held while the 1R button is pressed, for use in calculating the defocus amount during still image exposure.
  • Acceleration data (an) is acquired (S35). During the still image exposure period after the 2R button is pressed, acceleration data (an) is continuously acquired at regular intervals. Further, acceleration data (an ') is calculated (S37). Here, using the reference point error, acceleration data (an ′) after the reference point correction is calculated using Equation 6.
  • the defocus amount is calculated (S39).
  • the focus correction amount calculation unit 18 periodically calculates the defocus amount according to Equation 7, using the acceleration data (an') calculated in step S37 and the held absolute velocity V2.
  • the focus lens is then driven (S41).
  • the focus lens drive amount calculation unit 15 and the optical system drive unit 3 regularly drive the focus lens of the optical system 2 using the defocus amount obtained in step S39.
  • when the focus lens is driven, it is next determined whether or not the still image exposure is completed (S43). In this determination, when the exposure time has elapsed and the shutter 6 is closed, it is determined that the still image exposure has been completed. If the result of this determination is that the still image exposure has not been completed, the process returns to step S35, and steps S35 to S41 are repeated at predetermined time intervals to maintain the focus lens at the in-focus position even during still image exposure. When the still image exposure is completed as a result of the determination in step S43, this flow is terminated.
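The loop of steps S35 to S43 can be summarized structurally as follows. The callback names (`get_accel`, `exposure_done`, `drive_lens`) are hypothetical stand-ins for the acceleration data detection unit 17, the shutter state, and the focus lens drive amount calculation unit 15 with the optical system drive unit 3; the discrete integration mirrors Equation 7:

```python
def focus_blur_correction_loop(get_accel, exposure_done, drive_lens,
                               ref_error, v2, dt):
    """Run the focus blur correction loop during still image exposure."""
    v = v2              # absolute speed held at the start of exposure
    defocus = 0.0
    while not exposure_done():     # S43: exposure finished?
        a = get_accel()            # S35: acquire acceleration data (an)
        a_corr = a - ref_error     # S37: reference point correction (an')
        v += a_corr * dt           # S39: defocus amount by double
        defocus += v * dt          #      integration (Equation 7)
        drive_lens(defocus)        # S41: drive the focus lens
    return defocus
```

In a test harness, `exposure_done` can simply count iterations to stand in for the shutter closing.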
  • as described above, in the second embodiment, reference point correction of the acceleration data is performed using the distance measurement data for focus correction in the optical axis direction, improving the reference point accuracy of the acceleration sensor and realizing highly accurate focus correction. This solves the problem of the low positional accuracy of the acceleration sensor in the prior art.
  • the acceleration sensor 8 is used in addition to the distance measuring function (sensor) as the defocus detection unit, and a correction unit that corrects (calibrates) the detection value in a complementary manner is provided.
  • defocus amount detection during exposure is calculated mainly from the output of the acceleration sensor; however, since an offset arises in the acceleration sensor output with time, information from the distance measurement sensor obtained in the same detection period is used to calibrate the offset deviation of the acceleration sensor output.
  • that is, detection of the defocus amount based on the acceleration sensor output is used as the main detection unit, and the distance measuring sensor is used as a sub defocus amount detection unit.
  • the defocus amount is mainly detected by the output of the acceleration sensor, and the offset deviation that occurs during long-time use is calibrated using the output of the distance measuring sensor.
  • the output of either the distance measurement sensor or the acceleration sensor is selected and detected as a defocus amount detection unit according to the subject brightness and the exposure time.
  • FIG. 11 is a block diagram mainly showing an electrical configuration of the third embodiment. Compared to the block diagram of the second embodiment shown in FIG. 6, the third embodiment differs in that a photometric sensor 9 and an exposure control unit 19 are added and the focus correction amount calculation unit 18 is replaced with a focus correction amount calculation unit 20.
  • the photometric sensor 9 is a sensor that acquires the luminance of the subject, and outputs the acquired luminance information to the exposure control unit 19.
  • alternatively, a pixel signal from the imaging pixels in the image sensor 7 may be used in place of the output of the photometric sensor 9. Luminance information of the subject can be acquired based on the pixel signal, and the photometric sensor 9 can then be omitted.
  • the photometric sensor 9 functions as a photometric unit that detects subject luminance.
  • the exposure control unit 19 calculates a shutter speed value, an aperture value, and an ISO sensitivity value for still image shooting based on the luminance information of the subject input from the photometric sensor 9, and controls the shutter 6 and the like based on these values.
  • the role of the exposure control unit 19 in this embodiment is to provide the shutter speed value at the time of still image shooting, based on which it is determined whether the focus correction uses the focus correction amount obtained from the acceleration sensor 8 or from the AF sensor 5 (described later).
  • the function of the exposure control unit 19 is realized mainly by the system control CPU 10 according to a program. However, it is not limited to this; an exposure control circuit may be provided as a peripheral circuit of the system control CPU 10 (controller), and the function may be realized by this circuit.
  • the system control CPU 10 functioning as a focus control unit further has a selection unit that selects, according to the subject luminance, whether one or both of the detection values of the distance measurement unit and the movement amount detection unit are used for detection of the subject distance (for example, see S51 in FIG. 15A). Further, the system control CPU 10 functioning as a focus control unit performs control for driving the focus lens via the focus drive unit in the direction in which the subject distance and the focus position are made to coincide, based on the subject distance detected after the selection by the selection unit (for example, see S53 to S59 in FIG. 15A and S63 to S71 in FIG. 15B).
  • FIG. 12 is a block diagram showing a modification of the third embodiment. Most of the blocks in FIG. 12 are the same as those in FIG. 7; the only differences are that a photometric sensor 9 and an exposure control unit 19 are added and that the focus correction amount calculation unit 18 is replaced with a focus correction amount calculation unit 20. Since these differences are the same as in the configuration shown in FIG. 11, detailed description thereof is omitted.
  • FIG. 13 is a block diagram illustrating details of the function of the focus correction amount calculation unit 20.
  • the focus correction amount calculation unit 20 includes distance measurement data 201, a speed conversion unit 202, an acceleration conversion unit 203, acceleration data 204, an acceleration data calculation unit (after reference point correction) 205, a defocus amount calculation unit (distance measurement) 206, a defocus amount calculation unit (acceleration) 207, and a focus correction unit switching unit 208.
  • the block diagram shown in FIG. 13 combines FIG. 3 of the first embodiment and FIG. 8 of the second embodiment, and adds a focus correction unit switching unit 208. That is, the distance measurement data 121 and the defocus amount calculation unit 122 in FIG. 3 correspond to the distance measurement data 201 and the defocus amount calculation unit 206 in FIG. 13. In addition, the acceleration data 181, distance measurement data 182, speed conversion unit 183, acceleration conversion unit 184, acceleration data calculation unit 185, and defocus amount calculation unit 186 in FIG. 8 correspond to the acceleration data 204, distance measurement data 201, speed conversion unit 202, acceleration conversion unit 203, acceleration data calculation unit 205, and defocus amount calculation unit 207 in FIG. 13. Therefore, only the added focus correction unit switching unit 208 will be described with reference to FIG. 13.
  • the focus correction unit switching unit 208 switches, based on the shutter speed information acquired from the exposure control unit 19 during still image shooting, whether to output the defocus amount obtained from the distance measurement data or the defocus amount obtained from the acceleration data. In some cases neither is output and focus correction is not performed. Details of this determination will be described later with reference to FIGS. 15A and 15B.
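This determination can be sketched as a simple selector; the threshold parameters and return labels are assumptions, since the text specifies only the comparison criteria (subject brightness in S51, exposure time in S61):

```python
def select_defocus_source(subject_brightness, exposure_time,
                          brightness_threshold, max_exposure_time):
    """Choose which defocus amount the switching unit 208 outputs."""
    if subject_brightness >= brightness_threshold:
        return "ranging"       # bright subject: defocus from distance measurement data
    if exposure_time < max_exposure_time:
        return "acceleration"  # dark but short exposure: defocus from acceleration data
    return "none"              # dark and long exposure: no focus correction
```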
  • the defocus amount selected and output by the focus correction unit switching unit 208 is converted into a unit for driving the focus lens by the focus lens driving amount calculation unit 15 to calculate a driving position or a driving amount.
  • the lens-side optical system drive unit 3 uses the calculated drive position and drive amount to drive the focus of the optical system 2 to correct the focus in the optical axis direction during still image exposure.
  • the defocus amount calculation method in the third embodiment is a combination of the defocus amount calculation methods in the first embodiment and the second embodiment.
  • the block diagram shown in FIG. 14 differs from the block diagram shown in FIG. 9 of the second embodiment in that the distance measurement data is acquired even during still image exposure. Therefore, the defocus amount calculation part during still image exposure will be described.
  • the focus correction unit switching unit 208 switches between outputting the defocus amount (Xn) obtained from the distance measurement data 201 and outputting the defocus amount (xn) obtained from the acceleration data 204 (see the shaded portion in FIG. 14). Details of this determination processing will be described later with reference to FIGS. 15A and 15B.
  • the focus lens driving amount calculation unit 15 performs unit conversion for driving the focus lens based on the defocus amount calculated from the distance measurement data 201 or the acceleration data 204, and calculates a driving position or a driving amount.
  • this control flow is realized by the system control CPU 10 controlling each part in the camera body 1 in accordance with a program stored in the nonvolatile memory.
  • immediately after it is determined in step S33 that 2R has been pressed, the latest reference point error (from just before the still image exposure) calculated in step S27 and the absolute velocity V2 (see Equation 2), which were held while 1R was pressed, are retained.
  • the predetermined luminance may be a luminance that allows the distance measurement data to be acquired by the AF sensor 5 or the image sensor 7 in a relatively short time.
  • the predetermined luminance is a subject luminance value that enables the distance measurement data obtained in a relatively short time to be sufficiently distinguished from the noise component.
  • the determination here is made based on the subject brightness; however, the present invention is not limited to this, and the determination may be made based on the exposure amount Ev or on other exposure control values.
  • if the result of determination in step S51 is that the subject is bright, distance measurement data (Xn) is acquired (S53).
  • the distance measurement data is acquired from the AF sensor 5 (or, in the modification of the present embodiment, from the distance measuring pixels of the image sensor 7).
  • distance measurement data is continuously acquired at regular intervals.
  • the focus correction amount calculation unit 20 periodically calculates the defocus amount using the distance measurement data (Xn) acquired in step S53.
  • the focus lens is then driven in the same manner as in step S15 (see FIG. 4) (S57).
  • the focus lens drive amount calculation unit 15 and the optical system drive unit 3 periodically move the focus lens to the in-focus position.
  • when the focus lens is driven, it is next determined, as in step S17 (see FIG. 4), whether or not the still image exposure has been completed (S59). When the exposure time elapses and the shutter 6 is closed, the still image exposure is completed. If the result of this determination is that the still image exposure has not ended, the process returns to step S53, and steps S53 to S59 are repeated at predetermined time intervals to maintain the focus lens at the in-focus position. When the still image exposure is finished, this flow is finished.
  • if the result of determination in step S51 is that the subject is not brighter than the predetermined luminance, it is determined whether or not the exposure time is shorter than a predetermined time (S61).
  • the determination is made based on the shutter speed value calculated by the exposure control unit 19 based on the detection result of the photometric sensor 9.
  • the predetermined time may be a time over which the reliability of the acceleration data can be maintained by the reference point correction. If the result of this determination is that the exposure time is not shorter than the predetermined time, the focus blur correction flow during still image exposure is terminated without calculating the defocus amount or driving the focus lens.
  • step S61 If the result of determination in step S61 is that the exposure time is shorter than the predetermined time, acceleration data (an) is acquired as in step S35 (see FIG. 10) (S63).
  • the acceleration data (an) is continuously acquired from the acceleration sensor 8 and the acceleration data detection unit 17 at regular intervals during still image exposure.
  • once the acceleration data (an) is acquired, the acceleration data (an') after the reference point correction is next calculated (S65), as in step S37 (see FIG. 10).
  • the acceleration data (an ′) after the reference point correction is calculated according to Equation 6 using the reference point error stored immediately before 2R is pressed.
  • the defocus amount is calculated (S67).
  • the focus correction amount calculation unit 20 periodically calculates the defocus amount according to Equation 7 using the acceleration data (an ′) and the absolute velocity V2.
  • the focus lens is then driven in the same manner as in step S41 (see FIG. 10) (S69).
  • the focus lens drive amount calculation unit 15 and the optical system drive unit 3 periodically drive the focus lens of the optical system 2 using the defocus amount calculated in step S67.
  • when the focus lens is driven, it is next determined whether or not the still image exposure is completed (S71). When the exposure time elapses and the shutter 6 is closed, the still image exposure is completed. If the result of this determination is that the still image exposure has not ended, the process returns to step S63, and steps S63 to S71 are repeated at predetermined time intervals to maintain the focus lens in the in-focus position even during still image exposure. When the still image exposure is finished, this flow is finished.
  • as described above, in the third embodiment, when the subject is brighter than the predetermined luminance, focus adjustment control of the focus lens is performed using the distance measurement data (see S53 to S59); when the subject is dark but the exposure time is shorter than the predetermined time, focus adjustment control of the focus lens is performed using the acceleration data (see S63 to S71). If neither condition is satisfied, neither the distance measurement data nor the acceleration data is used, and no focus correction is performed.
  • the reason for determining whether or not the distance measurement data can be adopted based on the luminance of the subject is that, for a low-luminance subject, the distance measurement sampling time must be lengthened to obtain the required accuracy. In that case the sampling interval needed to detect the subject distance change caused by camera shake cannot be obtained, and the accuracy of the distance measurement data drops remarkably.
  • the reason for determining whether or not to adopt the acceleration data based on the exposure time is that, if the exposure time is long, the time-dependent change of the reference point of the acceleration sensor greatly affects the reliability of the acceleration data, and the accuracy of the focus correction decreases.
  • since the acceleration sensor data is used during the exposure, focus correction is possible even under dark conditions.
  • as described above, the third embodiment includes a detection unit that detects subject brightness (for example, see the photometric sensor 9), an exposure condition calculation (exposure time calculation) unit (for example, see the exposure control unit 19), and a change unit that changes which defocus detection output is used according to these conditions (for example, the focus correction amount calculation unit 20; see S61 in FIG. 15B).
  • the acceleration sensor 8 can detect a posture change (defocus amount) regardless of the exposure amount.
  • on the other hand, the acceleration sensor suffers reduced accuracy in detecting posture changes because of the offset deviation in its output. In this embodiment, the two defocus detection units are therefore switched according to the subject brightness and exposure time so as to exploit their respective advantages.
  • if it is determined in step S61 (see FIG. 15B) that the exposure time is longer than the predetermined time, the defocus amount is not corrected by the acceleration data in step S63 and subsequent steps. However, the operation is not limited to this; as an example, the defocus amount may be corrected in step S63 and subsequent steps until a predetermined time elapses after the exposure is started, and once that predetermined time has elapsed, the correction in step S63 and subsequent steps may be stopped.
  • as described above, in each embodiment of the present invention, an imaging apparatus having a focus driving unit (for example, the optical system driving unit 3) that drives the focus lens is provided with a defocus detection unit (for example, the AF sensor 5 and the distance measuring pixels of the image sensor 7) that detects the defocus amount at a plurality of time points within the exposure period, and a focus control unit (for example, the focus lens drive amount calculation unit 15) that performs focus control using the defocus amount obtained during the exposure period. For this reason, the change of the focus state can be detected sequentially and accurately during the exposure, and the focus shift can be corrected sequentially and accurately.
  • the focus shift can be corrected during exposure, and the in-focus state can be maintained even during shooting.
  • in-focus shooting can be performed even in situations where the aperture value is small, the depth of field is shallow, and the image is susceptible to camera shake.
  • in each embodiment of the present invention, a video signal is generated by an image sensor from a subject image formed by an optical lens including a focus lens; while the video signal is being exposed for recording (for example, see S9 Yes in FIG. 4), the subject distance between the imaging device and the subject is detected (for example, see S11 in FIG. 4), the movement amount of the focus lens is calculated in the direction in which the subject distance and the focus position are made to coincide, based on the deviation amount between the detected subject distance and the focus lens focus position (for example, S13 in FIG. 4), and the focus lens is moved based on the calculated movement amount (for example, S15 in FIG. 4).
  • further, in each embodiment of the present invention, a movement amount detection unit (for example, the acceleration data detection unit 17) that detects the movement amount of the focus lens in the optical axis direction is provided, and the output of this movement amount detection unit is corrected (for example, by the acceleration data calculation unit 185 in FIG. 8; see S37 in FIG. 10). For this reason, when correcting the position of the focus lens based on the amount of movement in the optical axis direction, focus adjustment can be performed accurately even when an offset is superimposed on the output of the movement amount detection unit.
  • the subject light flux that has passed through the optical system 2 is guided to the AF sensor 5 using the half mirror 4.
  • an optical system different from the optical system 2 may be provided to guide the subject light flux to the AF sensor 5. That is, a distance measuring optical system having an optical system different from the optical lens may be provided, and distance measurement may be performed based on subject light from a different path from the optical lens.
  • the acceleration sensor 8 is provided to detect the movement amount of the focus lens in the optical axis direction.
  • the present invention is not limited to this, and any sensor that can detect the amount of movement of the camera body 1 such as an angular velocity sensor or a gyro may be used.
  • in each embodiment of the present invention, the system control CPU 10 as a controller realizes each of its parts by a CPU (Central Processing Unit), peripheral circuits, and program code. However, the controller is not limited to this; a part or all of its functions may be realized by a circuit executed by program code, such as a DSP (Digital Signal Processor), by a hardware configuration such as gate circuits generated from a program language described in Verilog, or by other hardware circuits.
  • further, in each embodiment of the present invention, a digital camera is used as the apparatus for photographing. The digital camera may be a digital single-lens reflex camera, a mirrorless camera, or a compact digital camera; a video camera such as a camcorder or movie camera; or a camera built into a mobile phone, smartphone, personal digital assistant, personal computer (PC), tablet computer, or game machine. It may also be a medical camera, a camera for a scientific instrument such as a microscope, an in-vehicle camera, or a monitoring camera.
  • the present invention can be applied to any device capable of photographing.
  • of the techniques described in this specification, the control mainly described in the flowcharts can often be set by a program, and such a program may be stored in a recording medium or a recording unit.
  • the program may be recorded in the recording medium or recording unit at the time of product shipment, may be distributed on a recording medium, or may be downloaded via the Internet.
  • the present invention is not limited to the above-described embodiments as they are; in the implementation stage, the constituent elements can be modified and embodied without departing from the scope of the invention.
  • various inventions can be formed by appropriately combining a plurality of the components disclosed in the embodiments. For example, some of the components shown in an embodiment may be deleted.
  • constituent elements over different embodiments may be appropriately combined.
  • DESCRIPTION OF SYMBOLS: 1 ... Camera body, 2 ... Optical system, 3 ... Optical system drive unit, 4 ... Half mirror, 5 ... AF sensor, 6 ... Shutter, 7 ... Image sensor, 8 ... Acceleration sensor, 9 ... Photometric sensor, 10 ... System control CPU, 11 ... Distance measurement data detection unit, 12 ... Focus correction amount calculation unit, 13 ... Shutter control unit, 14 ... Imaging pixel control unit, 15 ... Focus lens drive amount calculation unit, 16 ... Distance measuring pixel control unit, 17 ... Acceleration data detection unit, 18 ... Focus correction amount calculation unit, 19 ... Exposure control unit, 20 ... Focus correction amount calculation unit

Abstract

Provided are an imaging device and a focus adjustment method that enable focus to be adjusted with good precision. The imaging device comprises: an optical system 2 including a focus lens for changing the focus state of an imaged subject; an optical system drive unit 3 for moving the focus lens in the optical axis direction; an imaging element 7 for exposing a subject image and converting the exposed subject image into an image signal; a distance measuring pixel control unit 16 for detecting a subject distance by means of distance measuring pixels included in the imaging element; and a system control CPU 10 for controlling how far the focus lens is moved by the optical system drive unit 3, on the basis of the detected subject distance and the focal point position of the focus lens. While the imaging element 7 exposes the subject image to record it as an image signal, the system control CPU 10 controls the optical system drive unit 3 to move the focus lens in a direction that makes the subject distance and the focal point position coincide, on the basis of the amount of deviation between the focal point position of the focus lens and the subject distance detected by a distance measurement data detection unit 11.

Description

Imaging device and focus adjustment method
The present invention relates to an imaging device having a focus adjustment function, and in particular to an imaging device and a focus adjustment method capable of performing focus adjustment while an exposure operation for image recording is in progress.
Imaging devices equipped with an automatic focus adjustment device that automatically adjusts the focus lens have long been commercially available. However, a conventional imaging device cannot detect the focus state of the focus lens while it is performing an exposure operation for image recording. In close-up shooting, where the subject distance is short, the subject distance fluctuates due to camera shake during shooting, yet the resulting focus blur (deviation of the focus state from the in-focus state) occurring during still image exposure cannot be corrected. To address this, a camera with an automatic focusing function has been proposed that includes a displacement sensor, such as an acceleration sensor, for detecting the amount of camera displacement occurring during the exposure operation, and that sequentially corrects the focus shift of the focus lens based on the detected displacement amount (see Patent Document 1).
Japanese Patent Laid-Open No. 10-312006
The camera with an automatic focusing function disclosed in Patent Document 1 performs focus adjustment using a displacement sensor such as an acceleration sensor. It is known that when a displacement sensor is used for a long time, errors such as offset drift gradually accumulate in the detected displacement amount. Such errors in the displacement amount adversely affect the accuracy of the focus adjustment.
The present invention has been made in view of these circumstances in the background art, and an object thereof is to provide an imaging device and a focus adjustment method capable of performing focus adjustment with high accuracy during shooting.
An imaging device according to a first aspect of the present invention comprises: an optical lens including a focus lens that changes the focus state of a subject to be imaged; a focus drive unit that moves the focus lens in the optical axis direction; an imaging element that exposes a subject image formed by the optical lens and converts it into a video signal; a distance measuring unit that detects the subject distance between the imaging device and the subject; and a focus control unit that controls the amount of movement of the focus lens by the focus drive unit based on the subject distance detected by the distance measuring unit and the focus position of the focus lens. While the imaging element exposes the subject image to record the video signal, the focus control unit performs control to move the focus lens by means of the focus drive unit in a direction that makes the subject distance and the focus position coincide, based on the amount of deviation between the subject distance detected by the distance measuring unit and the focus position of the focus lens.
A focus adjustment method according to a second aspect of the present invention comprises: detecting, with an imaging element, a video signal of a subject image formed by an optical lens including a focus lens; detecting the subject distance between the imaging device and the subject while the video signal is being detected for recording; calculating a movement amount for the focus lens, in a direction that makes the subject distance and the focus position coincide, based on the detected subject distance and the amount of deviation between the subject distance and the focus position of the focus lens; and moving the focus lens based on the calculated movement amount.
According to the present invention, it is possible to provide an imaging device and a focus adjustment method capable of performing focus adjustment with high accuracy during shooting.
FIG. 1 is a block diagram mainly showing the electrical configuration of a camera according to a first embodiment of the present invention.
FIG. 2 is a block diagram mainly showing the electrical configuration of a camera according to a modification of the first embodiment of the present invention.
FIG. 3 is a block diagram illustrating the functions of the focus correction amount calculation unit 12 in the first embodiment of the present invention.
FIG. 4 is a flowchart showing the operation of focus blur correction during still image exposure of the camera according to the first embodiment of the present invention.
FIG. 5 is a timing chart showing the operation of the camera according to the first embodiment of the present invention.
FIG. 6 is a block diagram mainly showing the electrical configuration of a camera according to a second embodiment of the present invention.
FIG. 7 is a block diagram mainly showing the electrical configuration of a camera according to a modification of the second embodiment of the present invention.
FIG. 8 is a block diagram illustrating the functions of the focus correction amount calculation unit 18 in the second embodiment of the present invention.
FIG. 9 is a timing chart showing the operation of focus blur correction during still image exposure of the camera according to the second embodiment of the present invention.
FIG. 10 is a flowchart showing the operation of the camera according to the second embodiment of the present invention.
FIG. 11 is a block diagram mainly showing the electrical configuration of a camera according to a third embodiment of the present invention.
FIG. 12 is a block diagram mainly showing the electrical configuration of a camera according to a modification of the third embodiment of the present invention.
FIG. 13 is a block diagram illustrating the functions of the focus correction amount calculation unit 20 in the third embodiment of the present invention.
FIG. 14 is a timing chart showing the operation of the camera according to the third embodiment of the present invention.
FIG. 15A is a flowchart showing the operation of focus blur correction during still image exposure of the camera according to the third embodiment of the present invention.
FIG. 15B is a flowchart showing the operation of focus blur correction during still image exposure of the camera according to the third embodiment of the present invention.
Hereinafter, an example in which the present invention is applied to a digital camera (hereinafter referred to as a "camera") will be described as a preferred embodiment. This camera includes an imaging unit that converts a subject image into image data, and a display unit arranged on the back of the camera body. The display unit performs live-view display of the subject image based on the image data converted by the imaging unit. By observing the live-view display, the user determines the composition of the image to be shot and the moment at which to start shooting (the shutter timing). When the release operation is performed, still image data acquired by shooting is recorded on a recording medium. When the playback mode is selected, the image data recorded on the recording medium can be played back and displayed on the display unit.
In the present embodiment, an AF sensor 5 is provided separately from the imaging element 7 (in a modification of the present embodiment, distance measuring pixels are instead provided in part of the pixel array of the imaging element 7). When the user fully presses the release button, the defocus amount of the focus lens is calculated by so-called phase difference AF at predetermined time intervals during the exposure for acquiring still image data. Focus adjustment is then performed on the focus lens based on this calculation result so as to maintain the in-focus state.
FIG. 1 is a block diagram mainly showing the electrical configuration of the present embodiment. The camera comprises a camera body 1, an optical system 2, an optical system drive unit 3, a half mirror 4, an AF sensor 5, a shutter 6, an imaging element 7, and a system control CPU 10.
The optical system 2 includes a focus lens and forms a subject image on the imaging element 7. The focus lens can move in the optical axis direction, and the optical system drive unit 3 moves it to the in-focus position. The optical system 2 functions as an optical lens including a focus lens that can change the focus state of the subject being imaged.
The half mirror 4 is a semi-transparent mirror such as a pellicle mirror. It lies on the optical axis of the optical system 2 and is fixed at an inclination of 45 degrees with respect to the optical axis. The half mirror 4 transmits part of the subject light flux that has passed through the optical system 2 and reflects the remaining light flux toward the AF sensor 5. In the present embodiment the half mirror 4 is fixed, but it need not be; for example, it may be configured to be movable, moving relative to the optical axis of the optical system 2 when the release button is fully pressed. The half mirror 4 functions as a light splitting unit (optical path splitter) that causes part of the subject image formed by the optical system to be incident on the imaging surface of the imaging element, splits off another part of the subject image in a direction different from the direction of incidence on the imaging surface, and guides that part of the subject image to the distance measuring unit (distance measuring sensor). This distance measuring unit detects the subject distance with a distance measuring sensor separate from the imaging element.
The AF sensor 5 is a detection element that receives the subject light flux split off by the half mirror 4 and outputs an electrical signal for generating distance measurement data. As one example, the AF sensor 5 may employ a phase difference detection method using a polarizing element and two detection units. In this method, the light split off by the half mirror 4 is further separated by the polarizing element into two light beams with different polarization directions, and the defocus amount (the amount of deviation from the in-focus point) is detected from the positional relationship at which the separated light beams form their images.
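The patent does not give a numerical procedure for the phase difference calculation, but the idea of detecting defocus from the positional relationship of the two separated images can be sketched as follows. In this hypothetical Python sketch, `estimate_phase_shift` finds the lag between two line-sensor signals by minimizing the mean absolute difference (a common correlation measure in phase difference AF), and `defocus_from_shift` converts that lag into a defocus amount through an assumed sensitivity constant of the AF optics; the signal shapes, function names, and constants are illustrative only.

```python
import numpy as np

def estimate_phase_shift(signal_a: np.ndarray, signal_b: np.ndarray) -> int:
    """Estimate the lateral shift (in pixels) between two line-sensor
    signals by finding the lag that minimizes the mean absolute
    difference over the overlapping region."""
    assert signal_a.shape == signal_b.shape
    n = len(signal_a)
    max_lag = n // 4
    best_lag, best_score = 0, float("inf")
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = signal_a[lag:], signal_b[:n - lag]
        else:
            a, b = signal_a[:n + lag], signal_b[-lag:]
        score = np.abs(a - b).mean()
        if score < best_score:
            best_score, best_lag = score, lag
    return best_lag

def defocus_from_shift(shift_px: int, pixel_pitch_mm: float,
                       k_sensitivity: float) -> float:
    """Convert the pixel shift into a defocus amount (mm).  The
    proportionality constant k_sensitivity is a hypothetical placeholder
    determined by the AF optics (baseline length, F-number)."""
    return shift_px * pixel_pitch_mm * k_sensitivity
```

For instance, two Gaussian-shaped signals whose peaks are 5 pixels apart yield a shift of -5 or +5 depending on their order, from which the signed defocus amount follows directly.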
The shutter 6 is arranged between the optical system 2 and the imaging element 7, on the optical axis of the optical system 2. The shutter 6 opens and closes based on instructions from the shutter control unit 13. When the shutter 6 is open, a subject image is formed on the imaging element 7 and the element is in the exposed state; when the shutter 6 is closed, no subject image is formed on the imaging element 7 and the element is shielded from light. That is, the exposure time of the imaging element 7 during still image shooting is controlled by controlling the length of time the shutter 6 remains open.
The imaging element 7 has a plurality of pixels arranged on its imaging surface, and each pixel converts the subject image formed on the imaging surface into an electrical signal (pixel signal). The pixel signals are read out from these pixels by the imaging pixel control unit 14, and video data is formed from them. The imaging element 7 functions as an imaging element that exposes the subject image formed by the optical lens and converts it into a video signal.
The system control CPU 10 is a controller having a CPU (Central Processing Unit), its peripheral circuits, and volatile and nonvolatile memory; it controls the entire camera by controlling each part in the camera body 1 according to a program stored in the nonvolatile memory. Within the system control CPU 10 are provided a distance measurement data detection unit 11, a focus correction amount calculation unit 12, a shutter control unit 13, an imaging pixel control unit 14, and a focus lens drive amount calculation unit 15. These units are realized by the peripheral circuits within the system control CPU 10 and by execution of the control program. The system control CPU 10 has various other functions, but this description concentrates on focus adjustment control during still image exposure and omits the other functions.
The system control CPU 10 also functions as a focus control unit that controls the amount of movement of the focus lens by the focus drive unit, based on the subject distance detected by the distance measuring unit and the focus position of the focus lens (see, for example, S11 to S17 in FIG. 4, S35 to S43 in FIG. 10, S51 to S59 in FIG. 15A, and S61 to S71 in FIG. 15B). When a still image is shot for recording, this focus control unit performs control that, while the imaging element is exposing the subject image, gives a drive signal to the focus drive unit to move the focus lens in a direction that makes the subject distance and the focus position coincide, based on the amount of deviation between the subject distance detected by the distance measuring unit and the focus position of the focus lens.
The shutter control unit 13 includes a shutter control circuit. It calculates the subject brightness based on the pixel signals from the imaging element 7 and, from this subject brightness, calculates a shutter speed value that yields the proper exposure (exposure amount). When the user fully presses the release button, the shutter control unit 13 opens and closes the shutter 6 based on the calculated shutter speed value (or a shutter speed value set manually by the user) to control the exposure time.
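The patent does not specify how the shutter speed value is derived from the subject brightness. One common formulation, shown here purely as an illustrative sketch and not taken from the patent, is the APEX system, in which the brightness value Bv, speed value Sv, and aperture value Av determine the time value Tv via Ev = Bv + Sv = Av + Tv.

```python
import math

def shutter_time_s(bv: float, iso: float, f_number: float) -> float:
    """Derive an exposure time (seconds) from subject brightness using the
    APEX relation Ev = Bv + Sv = Av + Tv.  All input values are examples."""
    sv = math.log2(iso / 3.125)      # speed value (Sv = 5 at ISO 100)
    av = 2.0 * math.log2(f_number)   # aperture value (Av = 4 at f/4)
    tv = bv + sv - av                # time value
    return 2.0 ** (-tv)              # exposure time, since Tv = -log2(t)
```

For example, at Bv = 5, ISO 100, and f/4, Tv = 6 and the exposure time is 1/64 s; halving the brightness doubles the exposure time.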
The imaging pixel control unit 14 includes an imaging control circuit for reading out the pixel signals from the imaging pixels provided in the imaging element 7. Until the release button is fully pressed, the imaging pixel control unit 14 reads out pixel signals for live-view display. When the release button is fully pressed and the opening/closing operation of the shutter 6 has finished, it reads out the still image data for recording.
The distance measurement data detection unit 11 may include a distance measurement data detection circuit, and detects distance measurement data based on the electrical signal acquired from the AF sensor 5. The focus correction amount calculation unit 12 may include a focus correction amount calculation circuit and, based on the distance measurement data detected by the distance measurement data detection unit 11, calculates the amount of deviation of the focus lens from its in-focus position (the defocus amount), that is, the focus correction amount in the optical axis direction. The distance measurement data detection unit 11 functions as a distance measuring unit that detects the subject distance between the imaging device and the subject. This distance measuring unit detects the subject distance in parallel with, and independently of, the exposure operation while the imaging element is exposing the subject image (see, for example, FIGS. 5, 9, and 14). In the present embodiment, the distance measuring unit detects the defocus amount of the optical system 2 as the subject distance. However, this is not limiting; for example, the distance from the camera body 1 to the subject may be detected directly, using a subject light flux that has passed through an optical system other than the optical system 2, and used as the subject distance.
The focus lens drive amount calculation unit 15 may include a focus lens drive amount calculation circuit; it converts the focus correction amount calculated by the focus correction amount calculation unit 12 into the units used for driving the focus lens, and calculates the drive position or drive amount.
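The unit conversion performed by the focus lens drive amount calculation unit 15 is not spelled out in the patent. As a rough illustration, an image-plane defocus amount can be converted into actuator drive pulses through a lens-dependent focus sensitivity and the drive resolution of the stepping motor; both constants below are hypothetical placeholders, not values from the patent.

```python
def lens_drive_pulses(defocus_mm: float,
                      focus_sensitivity: float,
                      pulses_per_mm: float) -> int:
    """Convert an image-plane defocus amount (mm) into stepping-motor pulses.

    focus_sensitivity relates image-plane defocus to the lens travel needed
    to cancel it (it varies with the lens design and zoom/focus state);
    pulses_per_mm is the actuator's drive resolution.  Both are assumed
    constants for this sketch.
    """
    lens_travel_mm = defocus_mm / focus_sensitivity
    return round(lens_travel_mm * pulses_per_mm)
```

With a sensitivity of 0.5 and a resolution of 100 pulses/mm, a defocus of 0.10 mm maps to 20 drive pulses, and the sign of the result gives the drive direction.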
Note that the system control CPU 10 may realize the functions of the distance measurement data detection unit 11, the focus correction amount calculation unit 12, the shutter control unit 13, the imaging pixel control unit 14, and the focus lens drive amount calculation unit 15 in software by a program, in peripheral circuits, or in a combination of a program and peripheral circuits.
The optical system drive unit 3 has a drive actuator (for example, a stepping motor) and a drive circuit, and moves the focus lens of the optical system 2 in the optical axis direction. Based on the drive position or drive amount obtained from the focus lens drive amount calculation unit 15, the optical system drive unit 3 performs focus control of the focus lens of the optical system 2. The optical system drive unit 3 functions as a focus drive unit that moves the focus lens in the optical axis direction.
FIG. 2 is a block diagram showing a modification of the first embodiment. Since most of the blocks in FIG. 2 are the same as in FIG. 1, the description focuses on the differences.
In the distance measuring unit (distance measuring sensor) of this modification of the first embodiment, the imaging element 7 takes over the function that the AF sensor 5 performed in FIG. 1. Like the imaging element 7 shown in FIG. 1, this imaging element 7 has a plurality of pixels on its imaging surface. In FIG. 1, the imaging element 7 had only imaging pixels; in FIG. 2, by contrast, the imaging element 7 includes image plane phase difference AF pixels (distance measuring pixels) in addition to the imaging pixels.
The imaging element 7 in FIG. 1 only had the function of converting the subject image into an electrical signal. The imaging element 7 in FIG. 2, however, also has the function of producing an electrical signal from which distance measurement data corresponding to the distance to the subject can be obtained. That is, the imaging element 7 includes, on its imaging plane, both distance measuring pixels and imaging pixels (non-distance-measuring pixels) that output image signals from the subject image. Of the electrical signals obtained by the imaging element 7, the pixel signals from the imaging pixels are output to the imaging pixel control unit 14, while the electrical signals from the distance measuring pixels are output to the distance measuring pixel control unit 16. The subject image information is therefore processed by the imaging pixel control unit 14, and the distance information to the subject is processed by the distance measuring pixel control unit 16.
Pixel output circuits are formed so that the start and end of exposure can be controlled independently for the imaging pixels and the distance measuring pixels, and so that the pixel signals of each type can be read out independently. The imaging pixel control unit 14 and the distance measuring pixel control unit 16 can operate in parallel during exposure, reading out the outputs of the imaging pixels and of the distance measuring pixels respectively. In other words, this configuration makes it possible to obtain the output of the distance measuring pixels during exposure.
The distance measuring pixel control unit 16 may include a readout control circuit that controls readout of the distance measuring pixels, and produces distance measurement data to the subject based on the electrical signals obtained from the distance measuring pixels of the imaging element 7. Since the distance measurement information can be acquired from the distance measuring pixels of the imaging element 7, the AF sensor 5 becomes unnecessary, and so does the half mirror 4. The light flux that has passed through the optical system 2 is therefore imaged directly on the imaging element 7 without being split. The distance measuring unit can independently read out the pixel signals output from the distance measuring pixels while the non-distance-measuring pixels are performing the exposure operation, detect the subject distance, and obtain the result. In this modification of the first embodiment, the distance measuring unit is composed of the distance measuring pixels of the imaging element 7 and the distance measuring pixel control unit 16.
The difference between FIG. 1 and FIG. 2 lies in how the distance measurement data for the subject is acquired; the distance measurement data detection unit 11 of the system control CPU 10 and the distance measuring pixel control unit 16 output the same kind of distance measurement data. In the modification shown in FIG. 2, providing distance measuring pixels in the imaging element 7 makes it possible to omit the half mirror 4 and the AF sensor 5.
FIG. 3 is a block diagram showing the details of the function of the focus correction amount calculation unit 12, which consists of the distance measurement data 121 and a defocus amount calculation unit 122. Each block may be realized in software by the CPU of the system control CPU 10 according to a program, by peripheral circuits, or by a combination of a program and peripheral circuits.
In the first embodiment shown in FIG. 1, the distance measurement data 121 is output from the distance measurement data detection unit 11, which processes the electrical signal output from the AF sensor 5. In the modification shown in FIG. 2, the distance measurement data 121 corresponds to the data output from the distance measuring pixel control unit 16, which processes the electrical signals from the distance measuring pixels of the imaging element 7.
The defocus amount calculation unit 122 calculates the defocus amount based on the distance measurement data 121 during still image exposure.
The focus lens drive amount calculation unit 15 converts the calculated defocus amount into the units used for driving the focus lens, and calculates the drive position or drive amount of the focus lens. The optical system drive unit 3 on the lens side uses the calculated drive position/drive amount to drive the focus lens of the optical system 2, so that focus correction in the optical axis direction can be performed during still image exposure.
Next, the process of focus blur correction during still image exposure in the present embodiment will be described using the flowchart shown in FIG. 4. This control flow is realized by the system control CPU 10 controlling each part in the camera body 1 according to a program stored in the nonvolatile memory.
When the flow shown in FIG. 4 starts, it is first determined whether or not 1R (1st release) has been pressed (S1). Here, it is determined whether or not the user has pressed the release button halfway. If a 1R press is not detected, the camera remains in a standby state.
If the result of the determination in step S1 is that 1R has been pressed, an autofocus operation (AF operation) is started and distance measurement data is first acquired (S3). Here, the distance measurement data detection unit 11 or the distance measuring pixel control unit 16 acquires the distance measurement data. While the 1R press is being detected, distance measurement data continues to be acquired at regular intervals. Note that this is only one example of the operation; instead, during live-view shooting, distance measurement data may continue to be acquired at regular intervals regardless of whether or not 1R is pressed.
 Once the ranging data is acquired, the defocus amount is calculated (S5). Here, the focus correction amount calculation unit 12 calculates the defocus amount using the ranging data acquired in step S3.
 After the defocus amount is calculated, the focus lens is driven (S7). Here, the focus lens drive amount calculation unit 15 calculates the drive position or drive amount of the focus lens, and the optical system drive unit 3 moves the focus lens to the in-focus position.
 After the focus lens is driven, it is next determined whether 2R (2nd Release) has been pressed (S9). To shoot a still image, the user fully presses the release button; this step determines whether the release button has been fully pressed. If 2R has not been pressed, the flow returns to step S3 and repeats the operations above, keeping the focus lens at the in-focus position.
 If the result of the determination in step S9 is that 2R has been pressed, the flow moves to the still-image shooting operation. That is, exposure control is performed with the aperture value and shutter speed that give proper exposure based on the subject brightness before the 2R press; after the exposure time elapses, the pixel signals of the imaging pixels are read out from the image sensor 7, image processing for still-image recording is performed, and the image data is recorded on the recording medium. In parallel with this still-image shooting operation, the focus adjustment operation of the focus lens (out-of-focus blur correction) is performed in steps S11 to S17 during the exposure operation of the image sensor 7. Accordingly, throughout the still-image shooting operation, that is, from the 2R press (Yes in S9) to the end of still-image exposure (Yes in S17), the imaging pixels of the image sensor 7 photoelectrically convert the subject light flux and continue accumulating charge. Meanwhile, during the still-image shooting operation, the AF sensor 5 or the ranging pixels of the image sensor 7 output ranging data at predetermined time intervals.
 First, ranging data is acquired (S11). As in step S3, ranging data continues to be acquired at a fixed period. Once the ranging data is acquired, the defocus amount is calculated (S13); as in step S5, the focus correction amount calculation unit 12 calculates the defocus amount periodically. Subsequently, the focus lens is driven (S15); as in step S7, the focus lens drive amount calculation unit 15 and the optical system drive unit 3 periodically move the focus lens to the in-focus position.
 After the focus lens is driven, it is next determined whether the still-image exposure has ended (S17). When the exposure time elapses and the shutter 6 closes, the still-image exposure ends. If the exposure has not yet ended, the flow returns to step S11 and repeats steps S11 to S15 at predetermined time intervals, keeping the focus lens at the in-focus position. When the still-image exposure ends, this flow ends.
 In the flow shown in FIG. 4, steps S3 to S7 and steps S11 to S15 are the same processing. In this embodiment, however, as shown in FIGS. 1 and 2, ranging data can be acquired using the AF sensor 5 or the ranging pixels of the image sensor 7 even during still-image exposure. Out-of-focus blur correction can therefore be performed even during still-image exposure.
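 The control flow of FIG. 4 can be sketched as follows. All names here (`SimulatedCamera`, its methods, and the trivial defocus model) are hypothetical stand-ins for the units described in the text; the point is only the loop structure of steps S1 to S17, in which ranging and lens driving continue even after exposure begins.

```python
class SimulatedCamera:
    """Minimal, hypothetical stand-in for the camera units in the text."""
    def __init__(self, ranging_samples):
        self.ranging_samples = list(ranging_samples)  # subject distances over time
        self.ticks = 0
        self.lens_positions = []
        self.exposure_ticks = 0

    def first_release_pressed(self):
        return True  # 1R: assumed already half-pressed in this sketch

    def second_release_pressed(self):
        return self.ticks >= 2  # 2R: full press after two pre-exposure AF cycles

    def acquire_ranging_data(self):
        i = min(self.ticks, len(self.ranging_samples) - 1)
        self.ticks += 1
        return self.ranging_samples[i]

    def compute_defocus(self, distance):
        lens = self.lens_positions[-1] if self.lens_positions else 0.0
        return distance - lens  # toy defocus model: distance minus lens position

    def drive_focus_lens(self, defocus):
        lens = self.lens_positions[-1] if self.lens_positions else 0.0
        self.lens_positions.append(lens + defocus)

    def start_exposure(self):
        self.exposure_ticks = 3  # exposure lasts three ranging periods here

    def exposure_finished(self):
        self.exposure_ticks -= 1
        return self.exposure_ticks < 0


def still_image_sequence(camera):
    while not camera.first_release_pressed():   # S1: wait for 1R
        pass
    while not camera.second_release_pressed():  # S9: repeat until 2R
        data = camera.acquire_ranging_data()    # S3
        defocus = camera.compute_defocus(data)  # S5
        camera.drive_focus_lens(defocus)        # S7
    camera.start_exposure()                     # 2R: exposure begins
    while not camera.exposure_finished():       # S17: repeat until exposure ends
        data = camera.acquire_ranging_data()    # S11
        defocus = camera.compute_defocus(data)  # S13
        camera.drive_focus_lens(defocus)        # S15
```

Because the S11 to S15 loop keeps running during exposure, the lens tracks the last ranging sample even after shooting starts, which is the behavior the flowchart describes.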
 Next, the out-of-focus blur correction operation during the still-image exposure of FIG. 4 will be described with reference to the timing chart shown in FIG. 5. In FIG. 5, the horizontal axis indicates the passage of time, and the vertical axis shows the camera operation, the output of the image sensor 7 or the AF sensor 5, the processing of the system control CPU 10, and the operation of the focus lens.
 In FIG. 5, the point labeled "1R" indicates when the release button is half-pressed, corresponding to the determination of a 1R press in step S1 of FIG. 4. When a 1R press is determined, the system control CPU 10 periodically computes the defocus amount using ranging data from the AF sensor 5 or the image sensor 7, and periodically drives the focus lens. The downward arrows from the ranging data to the defocus amount computation in FIG. 5 indicate the times at which the defocus amount is periodically computed from the ranging data.
 In FIG. 5, "2R" indicates when the release button is fully pressed, corresponding to the determination of a 2R press in step S9 of FIG. 4. When a 2R press is determined, the exposure operation for still-image shooting is performed and, in parallel, ranging data continues to be acquired periodically. The focus lens is then driven using the defocus amount calculated from the ranging data. The downward arrows from the ranging data to the defocus amount computation in FIG. 5 indicate that the defocus amount continues to be computed periodically from the ranging data even after still-image shooting starts. While the image sensor exposes the subject image, ranging data is detected independently of, and in parallel with, this exposure operation.
 As described above, this embodiment and its modification have a defocus detection unit capable of detecting the defocus amount during exposure (see, for example, the AF sensor 5 and the ranging data detection unit 11, or the image sensor 7 with the ranging pixel control unit 16, together with the focus correction amount calculation unit 12), and a focus control unit that performs defocus detection during exposure and, based on the detection result, drives the focus lens position correctively at predetermined time intervals (see, for example, the focus lens drive amount calculation unit 15 and the optical system drive unit 3).
 Specifically, the defocus detection unit described above includes an image separation unit (the half mirror 4) and a focus sensor (the ranging pixels (phase difference sensors) of the image sensor 7). As another example, the defocus detection unit is constituted by an image sensor having imaging pixels and ranging pixels (pupil eccentricity, pupil division, etc.) on its imaging surface. The device also has a pixel readout circuit for reading pixel signals from the image sensor; this comprises a readout circuit for the imaging pixels and a readout circuit for the ranging pixels, provided independently of each other. Through their respective readout controls, pixel outputs are obtained independently. This makes it possible to detect changes over time in the defocus state (the defocus amount and direction) during still-image shooting exposure.
 In this way, in this embodiment and its modification, the positional relationship between the subject and the camera before exposure is maintained, so that focus shift due to shake during exposure is corrected and focus accuracy improves. Furthermore, because the system can follow not only camera shake along the optical axis but also movement of the subject, out-of-focus blur in still-image shooting is further reduced.
 Next, a second embodiment of the present invention and its modification will be described with reference to FIGS. 6 to 10. In this embodiment and its modification, the focus adjustment of the focus lens uses the detection result of an acceleration sensor in addition to the detection result of the ranging sensor (the AF sensor 5 or the ranging pixels of the image sensor 7).
 FIG. 6 is a block diagram mainly showing the electrical configuration of the second embodiment. Compared with the block diagram of the first embodiment shown in FIG. 1, the second embodiment adds an acceleration sensor 8 and an acceleration data detection unit 17, and further differs in that the focus correction amount calculation unit 12 of the first embodiment is replaced with a focus correction amount calculation unit 18.
 The acceleration sensor 8 detects translational motion applied to the camera as acceleration and outputs it to the system control CPU 10. The acceleration information comprises X-axis information in the camera's left-right direction, Y-axis information in the up-down direction, and Z-axis information along the optical axis. Because this embodiment concerns only the Z axis (the optical axis), descriptions of the X and Y axes are omitted.
 The acceleration data detection unit 17, which may include an acceleration data detection circuit, outputs acceleration information along the optical axis based on the electrical signal obtained from the acceleration sensor 8. Integrating the acceleration information once over time yields velocity information, and integrating the velocity information once more over time yields distance information. That is, by integrating the acceleration information twice over time, the change over time in the distance between the camera and the subject can be detected. In this embodiment, the function of the acceleration data detection unit 17 is realized mainly by the system control CPU 10 according to a program; however, it may instead be realized by an acceleration data detection circuit provided as a peripheral circuit of the system control CPU 10 (controller).
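 The double time integration described above can be sketched numerically; the rectangle-rule scheme and the sampling period `dt` are illustrative assumptions, not the patented implementation.

```python
def integrate_twice(accel_samples, dt, v0=0.0, x0=0.0):
    """Integrate acceleration samples twice over time (rectangle rule):
    acceleration -> velocity -> change in camera-to-subject distance."""
    v, x = v0, x0
    for a in accel_samples:
        v += a * dt  # first time integration: velocity
        x += v * dt  # second time integration: distance
    return x
```

For a constant acceleration the result approaches the familiar x0 + v0*t + a*t^2/2 as `dt` shrinks.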
 The acceleration data detection unit 17 functions as a movement amount detection unit that detects the amount of movement of the focus lens along the optical axis. The system control CPU 10 functions as a focus control unit. The focus control unit outputs a movement amount to the focus drive unit based on the subject distance detected by the ranging unit, the movement amount of the focus lens along the optical axis detected by the movement amount detection unit, and the focus position of the focus lens (see, for example, S35 to S43 in FIG. 10). The movement amount detection unit comprises the acceleration sensor 8, which detects movement acceleration along the optical axis, and an acceleration correction unit that calculates a correction value for the output of the acceleration sensor based on the subject distance detected by the ranging unit and adds the correction value to, or subtracts it from, that output (see, for example, S27 and S37 in FIG. 10).
 The focus correction amount calculation unit 18 calculates the focus correction amount along the optical axis based on the acceleration information in the optical axis direction detected by the acceleration data detection unit 17. As described above, the focus correction amount is calculated by integrating the acceleration information twice over time. Specifically, the ranging data is used to calculate the absolute velocity at the start of exposure and the absolute acceleration (reference point correction), which are then used in calculating the focus correction amount. This calculation will be described later with reference to FIGS. 8 and 9.
 FIG. 7 is a block diagram showing a modification of the second embodiment. Most of the blocks in FIG. 7 are the same as in the block diagram of FIG. 2; the only differences are that the acceleration sensor 8 and the acceleration data detection unit 17 are added and that the focus correction amount calculation unit is replaced with the focus correction amount calculation unit 18. Since these differences are the same as in the configuration of FIG. 6, detailed description is omitted.
 Next, the function of the focus correction amount calculation unit 18 will be described in detail with reference to FIG. 8. FIG. 8 is a block diagram showing the details of the focus correction amount calculation unit 18, which comprises acceleration data 181, ranging data 182, a velocity conversion unit 183, an acceleration conversion unit 184, acceleration data (after reference point correction) 185, and a defocus amount calculation unit 186. In this embodiment, each block is realized in software by the system control CPU 10 according to a program; however, each may instead be realized by a peripheral circuit, or by a combination of a program and peripheral circuits.
 The acceleration data 181 is acceleration information along the optical axis of the optical system 2 and is obtained from the acceleration sensor 8 shown in FIGS. 6 and 7.
 In the second embodiment shown in FIG. 6, the ranging data 182 corresponds to the data output from the ranging data detection unit 11, which processes the electrical signal output from the AF sensor 5. In the modification shown in FIG. 7, the ranging data 182 corresponds to the data output from the ranging pixel control unit 16, which processes the electrical signals from the ranging pixels of the image sensor 7.
 The velocity conversion unit 183 receives the ranging data 182 and converts it into velocity data by differentiating it over a fixed time. For example, dividing the difference between the ranging data at one time and at a time a predetermined interval later by that interval yields velocity data.
 The acceleration conversion unit 184 converts the velocity data into acceleration data by differentiating it over a fixed period. For example, dividing the difference between the velocity data at one time and at a later time by the predetermined interval yields acceleration data before reference point correction.
 The acceleration data calculation unit 185 calculates acceleration data after reference point correction using two sets of acceleration data: the acceleration data 181 generated from the acceleration sensor 8, and the pre-correction acceleration data generated from the AF sensor 5 or the ranging pixels of the image sensor 7 and then converted by the acceleration conversion unit 184.
 The defocus amount calculation unit 186 calculates the defocus amount using the acceleration data after reference point correction calculated by the acceleration data calculation unit 185 and the absolute velocity, converted by the velocity conversion unit 183, at the point immediately before exposure.
 Next, the method of calculating the acceleration data after reference point correction in the acceleration data calculation unit 185 of FIG. 8, and the method of calculating the defocus amount in the defocus amount calculation unit 186, will be described with reference to the timing chart shown in FIG. 9.
 First, the calculation of the acceleration sensor's reference point error and the absolute velocity, performed before still-image exposure using the acceleration data 181 and the ranging data 182, will be described. Let X1, X2, and X3 be ranging data acquired at a fixed ranging interval T (see A in FIG. 9). From these values, velocity data V1 and V2 are calculated by Equations 1 and 2:

 V1 = (X2 - X1) / T    (Equation 1)
 V2 = (X3 - X2) / T    (Equation 2)
 Equation 2 yields V2, the velocity data immediately before still-image exposure; V2 is used as the absolute velocity at the start of exposure. Using the velocity data V1 and V2, the acceleration data A_ave is calculated by Equation 3:

 A_ave = (V2 - V1) / T    (Equation 3)
 Next, the average acceleration a_ave before still-image exposure is obtained from the acceleration data 181 (see B in FIG. 9). The average a_ave is the mean of the acceleration data a1 to a13 acquired during the period in which the ranging data X1 to X3 were obtained (Equation 4). In the example shown in FIG. 9, the 13 acceleration samples up to a13 are used; in practice the number differs depending on the sampling of the ranging data and the acceleration data. In B of FIG. 9, an (n = 1 to 20) denotes the acceleration data before reference point correction.

 a_ave = Σ(an) / n    (Equation 4)
 Using A_ave and a_ave calculated in Equations 3 and 4, the reference point error can be calculated from Equation 5. This reference point error is the difference between the acceleration calculated from the ranging data and the acceleration calculated from the acceleration data. Ideally the two would agree; however, because an offset is superimposed on the output of the acceleration sensor 8, the two acceleration values diverge. When calculating the defocus amount based on the acceleration data, the reference point error is used to correct this offset.

 Reference point error = a_ave - A_ave    (Equation 5)
 The acceleration data an' after the start of still-image exposure can be calculated from Equation 6: the value obtained by subtracting the reference point error of Equation 5 from the acceleration data 181 (an in FIG. 9) is the acceleration data after reference point correction. That is, the reference point error is used to apply the acceleration sensor's reference point correction to the acceleration data. In B of FIG. 9, an' (n = 13 to 20) denotes the acceleration data after reference point correction.

 an' = an - reference point error    (Equation 6)
 By obtaining the acceleration data after reference point correction, the defocus amount calculation unit 186 can calculate a highly accurate defocus amount by Equation 7, using the corrected acceleration data and the absolute velocity V2 immediately before exposure obtained from Equation 2:

 Defocus amount = ∫(V2 + ∫an' dt) dt    (Equation 7)
 After the defocus amount of Equation 7 is calculated, the focus lens drive amount calculation unit 15 performs unit conversion of the calculated defocus amount into the quantity needed to drive the focus lens, and calculates the drive position or drive amount.
 Next, the processing that corrects out-of-focus blur during still-image exposure in this embodiment will be described with reference to the flowchart shown in FIG. 10. As in the first embodiment, this control flow is realized by the system control CPU 10 controlling each unit in the camera body 1 according to a program stored in nonvolatile memory.
 When the flow shown in FIG. 10 starts, it is first determined, as in step S1, whether 1R has been pressed (S21), that is, whether the user has pressed the release button halfway. If no 1R press is detected, the camera remains in a standby state.
 If the result of the determination in step S21 is that 1R has been pressed, an autofocus operation (AF operation) starts, and acceleration data (an) is first acquired (S23). Here, the acceleration data detection unit 17 acquires acceleration data (an) from the acceleration sensor 8 at a fixed period (every ranging interval T).
 Subsequently, ranging data (Xn) is acquired (S25). Here, the ranging data detection unit 11 or the ranging pixel control unit 16 acquires the ranging data. While the 1R button is held down, ranging data continues to be acquired at a fixed period. As can be seen from A and B in FIG. 9, the ranging data and the acceleration data are acquired at different timings; each is acquired according to its own preset period.
 Once the acceleration data (an) and ranging data (Xn) are acquired, the reference point error is calculated (S27). Here, as described with reference to FIG. 9, the reference point error is calculated using Equation 5 and is updated successively using the most recently acquired data.
 Subsequently, the defocus amount is calculated (S29). Here, the focus correction amount calculation unit 18 calculates the defocus amount using the ranging data (Xn) acquired in step S25.
 After the defocus amount is calculated, the focus lens is driven (S31). Here, the focus lens drive amount calculation unit 15 calculates the drive position or drive amount of the focus lens, and the optical system drive unit 3 moves the focus lens to the in-focus position.
 After the focus lens is driven, it is next determined, as in step S9, whether 2R has been pressed (S33). To shoot a still image, the user fully presses the release button; this step determines whether the release button has been fully pressed. If the result of the determination in step S33 is that 2R has not been pressed, the flow returns to step S23, repeats the operations above, and keeps the focus lens at the in-focus position. As in the first embodiment, continuously acquiring acceleration data and ranging data at a fixed period while the 1R button is held down is only one example of the operation; instead, during live view shooting, acceleration data and ranging data may be acquired continuously at a fixed period regardless of whether the 1R button is pressed.
 If the result of the determination in step S33 is that 2R has been pressed, the flow moves to the still-image shooting operation, as in the first embodiment. In parallel with this still-image shooting operation, the focus adjustment operation of the focus lens (out-of-focus blur correction) is performed in steps S35 to S41 during the exposure operation of the image sensor 7. Immediately after the 2R button is pressed, the most recent reference point error calculated in S27 while the 1R button was held (that is, immediately before still-image exposure) and the absolute velocity V2 (calculated by Equation 2) are retained, for use in calculating the defocus amount during still-image exposure.
 Acceleration data (an) is acquired (S35). During the still-image exposure period after the 2R button is pressed, acceleration data (an) continues to be acquired at a fixed period. Then the corrected acceleration data (an') is calculated (S37); here, the retained reference point error and Equation 6 are used to calculate the acceleration data (an') after reference point correction.
 Next, the defocus amount is calculated (S39). Here, the focus correction amount calculation unit 18 periodically calculates the defocus amount according to Equation 7, using the corrected acceleration data (an') calculated in step S37 and the retained absolute velocity V2.
 Once the defocus amount is calculated, the focus lens is driven (S41). Here, the focus lens drive amount calculation unit 15 and the optical system drive unit 3 periodically drive the focus lens of the optical system 2 using the defocus amount obtained in step S39.
 After the focus lens is driven, it is next determined whether the still-image exposure has ended (S43). When the exposure time elapses and the shutter 6 closes, it is determined that the still-image exposure has ended. If the exposure has not yet ended, the flow returns to step S35 and repeats steps S35 to S41 at predetermined time intervals, keeping the focus lens at the in-focus position even during still-image exposure. If the determination in step S43 is that the still-image exposure has ended, this flow ends.
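 The during-exposure loop of steps S35 to S43 can be sketched as follows. The reference point error and V2 are the values retained at the 2R press; the returned per-period amounts stand in for the lens drive commands of S41, and all names are illustrative.

```python
def exposure_focus_loop(accel_stream, ref_error, v2, dt):
    """Steps S35-S43: for each acceleration sample read during exposure,
    apply the reference point correction (S37), update the velocity estimate
    toward the defocus amount (S39), and record the per-period focus lens
    drive amount (S41)."""
    v = v2
    drives = []
    for a in accel_stream:      # S35: acquire acceleration data (an)
        a_corr = a - ref_error  # S37: an' = an - reference point error
        v += a_corr * dt        # S39: defocus change over this period
        drives.append(v * dt)   # S41: drive the lens by this amount
    return drives               # S43: the loop ends with the exposure
```

Summing the returned drive amounts reproduces the total defocus of Equation 7, so issuing them one period at a time keeps the lens at the in-focus position throughout the exposure.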
As described above, in the second embodiment and its modification, the distance-measurement data used for focus correction in the optical-axis direction also serves to correct the reference point of the acceleration data. This improves the reference-point accuracy of the acceleration sensor, realizes highly accurate focus correction, and solves the prior-art problem of low positional accuracy of the acceleration sensor.
In this embodiment and its modification, the acceleration sensor 8 is used as a defocus detection unit in addition to the distance-measuring function (sensor), and a correction unit is provided that corrects (calibrates) the detected values in a complementary manner. The defocus amount during exposure is mainly calculated from the acceleration sensor output; however, because the acceleration sensor output develops an offset shift over time, information from the distance-measuring sensor obtained in the same detection period is used to calibrate that offset.
In the second embodiment, defocus detection based on the acceleration sensor output serves as the primary detection means, and defocus detection based on the distance-measuring sensor serves as the secondary means. This division of roles follows from viewpoints (i) and (ii) below.
(i) When the distance-measuring sensor is used under shooting conditions with a low exposure amount (a short exposure time, i.e. a large Tv value in APEX terms, or a low subject luminance B, i.e. a small Bv value in APEX terms), the sensor signal output of the distance-measuring sensor becomes weak and detection of the defocus amount becomes difficult. The sampling time (accumulation time) must therefore be increased.
(ii) When the acceleration sensor is used, posture changes (the defocus amount) can be detected regardless of the shooting conditions. However, as the detection time grows longer, the offset drift of the acceleration sensor degrades the detection accuracy.
For this reason, in the second embodiment, defocus detection based on the acceleration sensor output is primary, and the offset drift that arises over long detection periods is calibrated using the output of the distance-measuring sensor.
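The complementary calibration described above can be sketched as follows. The patent's speed conversion and acceleration conversion units presumably differentiate the distance-measurement data; the exact equations are not in this excerpt, so the sketch below assumes simple finite differences and estimates the offset as the mean discrepancy between the raw accelerometer samples and the ranging-derived acceleration over the same period. All names are illustrative.

```python
def estimate_offset(distances, accel_raw, dt):
    """Estimate the accelerometer reference-point (offset) error by
    differentiating ranging distances twice and comparing the result
    with raw accelerometer samples taken over the same period.
    accel_raw must be aligned with the second-difference samples."""
    # speed conversion: finite difference of distance samples
    v = [(d1 - d0) / dt for d0, d1 in zip(distances, distances[1:])]
    # acceleration conversion: finite difference of the speeds
    a_rng = [(v1 - v0) / dt for v0, v1 in zip(v, v[1:])]
    # offset = mean discrepancy between sensor and ranging-derived values
    diffs = [am - ar for am, ar in zip(accel_raw, a_rng)]
    return sum(diffs) / len(diffs)
```

For example, distances sampled from uniform acceleration 2 (d = t^2 at dt = 1) give a ranging-derived acceleration of 2; raw accelerometer readings of 2.3 then yield an estimated offset of 0.3, which would be subtracted from subsequent samples as in Equation 6.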
Next, a third embodiment of the present invention and a modification thereof will be described with reference to FIGS. 11 to 15B. In this embodiment and its modification, the output of either the distance-measuring sensor or the acceleration sensor is selected as the source for defocus-amount detection according to the subject luminance and the exposure time.
FIG. 11 is a block diagram mainly showing the electrical configuration of the third embodiment. Compared with the block diagram of the second embodiment shown in FIG. 6, the third embodiment differs in that a photometric sensor 9 and an exposure control unit 19 are added and the focus correction amount calculation unit 18 is replaced with a focus correction amount calculation unit 20.
The photometric sensor 9 acquires the luminance of the subject and outputs the acquired luminance information to the exposure control unit 19. Pixel signals from the imaging pixels of the image sensor 7 may be used in place of the output of the photometric sensor 9; in that case, the luminance information of the subject can be acquired from the pixel signals and the photometric sensor 9 can be omitted. The photometric sensor 9 functions as a photometric unit that detects the subject luminance.
The exposure control unit 19 calculates the shutter speed value, aperture value, and ISO sensitivity value for still-image shooting on the basis of the subject luminance information input from the photometric sensor 9, and controls the shutter 6 and other components according to these values. In this embodiment, the role of the exposure control unit 19 is to let the focus correction unit determine, on the basis of the shutter speed value for still-image shooting, whether to use the focus correction amount obtained from the acceleration sensor 8 or the focus correction amount obtained from the AF sensor 5 (in the modification of the third embodiment described later, from the distance-measuring elements in the image sensor 7). This determination is performed by the focus correction amount calculation unit 20 and is described in detail later with reference to FIGS. 13 to 15. In this embodiment, the function of the exposure control unit 19 is realized mainly by the system control CPU 10 executing a program, but it may instead be realized by a dedicated circuit provided as a peripheral circuit of the system control CPU 10 (controller).
The system control CPU 10, functioning as the focus control unit, further includes a selection unit that selects, according to the subject luminance, whether the subject distance is detected with the distance-measuring unit, with the movement-amount detection unit, or with both (see, for example, S51 in FIG. 15A). Functioning as the focus control unit, the system control CPU 10 also performs control that moves the focus drive unit in the direction that brings the subject distance and the focus position into agreement, on the basis of the subject distance detected after the selection by the selection unit (see, for example, S53 to S59 in FIG. 15A and S63 to S71 in FIG. 15B).
FIG. 12 is a block diagram showing a modification of the third embodiment. Most of the blocks in FIG. 12 are the same as those in FIG. 7; the only differences are that the photometric sensor 9 and the exposure control unit 19 are added and the focus correction amount calculation unit is replaced with the focus correction amount calculation unit 20. Since these differences are the same as in the configuration shown in FIG. 11, detailed description is omitted.
Next, the function of the focus correction amount calculation unit 20 will be described in detail with reference to FIG. 13. FIG. 13 is a block diagram showing the details of the focus correction amount calculation unit 20, which comprises distance-measurement data 201, a speed conversion unit 202, an acceleration conversion unit 203, acceleration data 204, an acceleration data calculation unit (after reference-point correction) 205, a defocus amount calculation unit (distance measurement) 206, a defocus amount calculation unit (acceleration) 207, and a focus correction switching unit 208.
The block diagram of FIG. 13 combines FIG. 3 of the first embodiment and FIG. 8 of the second embodiment, with the focus correction switching unit 208 added. That is, the distance-measurement data 121 and the defocus amount calculation unit 122 of FIG. 3 correspond to the distance-measurement data 201 and the defocus amount calculation unit 206 of FIG. 13, and the acceleration data 181, distance-measurement data 182, speed conversion unit 183, acceleration conversion unit 184, acceleration data calculation unit 185, and defocus amount calculation unit 186 of FIG. 8 correspond to the acceleration data 204, distance-measurement data 201, speed conversion unit 202, acceleration conversion unit 203, acceleration data calculation unit 205, and defocus amount calculation unit 207 of FIG. 13. Accordingly, only the newly added focus correction switching unit 208 is described here.
On the basis of the shutter speed information acquired from the exposure control unit 19 at the time of still-image shooting, the focus correction switching unit 208 switches between outputting the defocus amount obtained from the distance-measurement data and outputting the defocus amount obtained from the acceleration data. In some cases neither is selected and no focus correction is performed. Details of this determination are described later with reference to FIG. 15.
The defocus amount selected and output by the focus correction switching unit 208 is converted by the focus lens drive amount calculation unit 15 into the units used for focus lens drive, yielding a drive position or drive amount. The lens-side optical system drive unit 3 uses the calculated drive position and drive amount to drive the focus of the optical system 2, thereby correcting the focus in the optical-axis direction during still-image exposure.
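The unit conversion performed by the focus lens drive amount calculation unit 15 is not specified in this excerpt. A common approach, sketched below under assumptions, divides the image-plane defocus by the lens's focus sensitivity (image-plane shift per unit of focus-lens travel) and then by the drive resolution to obtain motor pulses; both parameters are illustrative, not taken from the patent.

```python
def defocus_to_drive_pulses(defocus_mm, sensitivity, mm_per_pulse):
    """Convert an image-plane defocus amount into focus-motor drive pulses.
    sensitivity: image-plane shift per unit of focus-lens travel (assumed
    to be supplied by the interchangeable lens); mm_per_pulse: lens travel
    per drive pulse. Both parameters are hypothetical."""
    lens_travel_mm = defocus_mm / sensitivity  # required lens travel
    return round(lens_travel_mm / mm_per_pulse)  # quantize to motor pulses
```

For example, a 0.10 mm defocus with a sensitivity of 2.0 requires 0.05 mm of lens travel; at 0.005 mm per pulse that is a 10-pulse drive command.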
Next, the selection between the defocus amounts calculated by the defocus amount calculation unit (distance measurement) 206 and the defocus amount calculation unit (acceleration) 207 shown in FIG. 13 will be described using the timing chart shown in FIG. 14. The defocus-amount calculation method of the third embodiment combines the calculation methods of the first and second embodiments, so the timing chart of FIG. 14 differs from that of FIG. 9 of the second embodiment only in that distance-measurement data is acquired even during still-image exposure. The description therefore starts from the defocus-amount calculation during still-image exposure.
In FIG. 14, immediately after the 2R button is pressed (immediately after the start of still-image exposure), the focus correction switching unit 208 switches between outputting the defocus amount (Xn) obtained from the distance-measurement data 201 and outputting the defocus amount (xn) obtained from the acceleration data 204 (see the shaded portion in FIG. 14). Details of this determination are described later with reference to FIG. 15.
On the basis of the defocus amount calculated from the distance-measurement data 201 or the acceleration data 204, the focus lens drive amount calculation unit 15 performs the unit conversion for focus lens drive and calculates a drive position or drive amount.
Next, the processing for correcting focus blur during still-image exposure in this embodiment will be described with reference to the flowcharts shown in FIGS. 15A and 15B. As in the first and second embodiments, this control flow is realized by the system control CPU 10 controlling each part of the camera body 1 in accordance with a program stored in the nonvolatile memory.
When the flow shown in FIG. 15A starts, the processing from the determination of whether the 1R button has been pressed up to the determination of whether the 2R button has been pressed is the same as steps S21 to S33 of the flow shown in FIG. 10. Steps performing the same processing are therefore given the same step numbers, and detailed description is omitted.
Immediately after it is determined in step S33 that the 2R button has been pressed, the latest reference-point error (from just before the still-image exposure) calculated in step S27 while the 1R button was held, and the absolute velocity V2 (see Equation 2), are retained.
Next, it is determined whether the subject luminance is higher than a predetermined luminance (S51). This determination is made on the basis of the subject luminance calculated by the exposure control unit 19 from the detection result of the photometric sensor 9. The predetermined luminance need only be bright enough that the distance-measurement data can be acquired by the AF sensor 5 or the image sensor 7 in a relatively short time. In other words, since lower luminance makes the measurement more susceptible to noise, the predetermined luminance is the subject luminance value at which distance-measurement data obtained in a relatively short time remains clearly distinguishable from the noise component. Although the determination is made on the basis of luminance in this embodiment, it may instead be made on the basis of the exposure amount Ev or other exposure control values.
If the determination in step S51 indicates that the subject luminance is high, distance-measurement data (Xn) is acquired (S53). Here, the distance-measurement data is taken from the AF sensor 5 (in the modification of this embodiment, from the distance-measuring pixels of the image sensor 7). During the still-image exposure, the distance-measurement data continues to be acquired at regular intervals.
Once the distance-measurement data (Xn) has been acquired, the defocus amount is calculated (S55) in the same manner as in step S13 (see FIG. 4). Here, the focus correction amount calculation unit 20 periodically calculates the defocus amount using the distance-measurement data (Xn) acquired in step S53.
Once the defocus amount has been calculated, the focus lens is driven (S57) in the same manner as in step S15 (see FIG. 4). Here, the focus lens drive amount calculation unit 15 and the optical system drive unit 3 periodically move the focus lens to the in-focus position.
After the focus lens has been driven, it is determined, as in step S17 (see FIG. 4), whether the still-image exposure has ended (S59). When the exposure time has elapsed and the shutter 6 has closed, the still-image exposure ends. If the exposure has not ended, the process returns to step S53, and steps S53 to S59 are repeated at predetermined intervals to keep the focus lens at the in-focus position. When the still-image exposure ends, this flow ends.
Returning to step S51: if the determination in this step indicates that the subject luminance is not high, it is determined whether the exposure time is shorter than a predetermined time (S61). This determination is made on the basis of the shutter speed value calculated by the exposure control unit 19 from the detection result of the photometric sensor 9. The predetermined time need only be short enough that the reference-point correction keeps the acceleration data reliable. If the exposure time is not shorter than the predetermined time, the focus-blur correction flow during still-image exposure ends without calculating the defocus amount or driving the focus lens.
If the determination in step S61 indicates that the exposure time is shorter than the predetermined time, acceleration data (an) is acquired (S63) in the same manner as in step S35 (see FIG. 10). Here, the acceleration data (an) continues to be acquired at regular intervals from the acceleration sensor 8 and the acceleration data detection unit 17 during the still-image exposure.
Once the acceleration data (an) has been acquired, the reference-point-corrected acceleration data (an') is calculated (S65) in the same manner as in step S37 (see FIG. 10). Here, the reference-point-corrected acceleration data (an') is calculated according to Equation 6 using the reference-point error stored immediately before the 2R button was pressed.
Subsequently, the defocus amount is calculated (S67) in the same manner as in step S39 (see FIG. 10). Here, the focus correction amount calculation unit 20 periodically calculates the defocus amount according to Equation 7 using the acceleration data (an') and the absolute velocity V2.
Once the defocus amount has been calculated, the focus lens is driven (S69) in the same manner as in step S41 (see FIG. 10). Here, the focus lens drive amount calculation unit 15 and the optical system drive unit 3 periodically drive the focus lens of the optical system 2 using the defocus amount calculated in step S67.
After the focus lens has been driven, it is determined whether the still-image exposure has ended (S71). When the exposure time has elapsed and the shutter 6 has closed, the still-image exposure ends. If the exposure has not ended, the process returns to step S63, and steps S63 to S71 are repeated at predetermined intervals to keep the focus lens at the in-focus position even during the still-image exposure. When the still-image exposure ends, this flow ends.
As described above, in the flow shown in FIGS. 15A and 15B, when the subject luminance is high (Yes in S51), focus adjustment of the focus lens is controlled using the distance-measurement data (see S53 to S59). When the subject luminance is not high but the exposure time for still-image shooting is short (No in S51, then Yes in S61), focus adjustment of the focus lens is controlled using the acceleration data (see S63 to S71). When neither condition holds, neither the distance-measurement data nor the acceleration data is used, and no further correction processing is performed.
The subject luminance decides whether the distance-measurement data can be used for the following reason: when the subject luminance is low, the distance-measurement sampling time must be lengthened to obtain the required accuracy, so the sampling time needed to track subject-distance changes caused by camera shake can no longer be achieved and the accuracy of the distance-measurement data drops markedly. The exposure time decides whether the acceleration data can be used because, when the exposure time is long, drift of the acceleration sensor's reference point over time has a large effect, lowering the reliability of the acceleration data and hence the accuracy of the focus correction.
Thus, in the third embodiment and its modification, when the subject luminance is low and the distance-measurement sampling time would become long, switching to the acceleration sensor data makes focus correction during exposure possible even under dark conditions.
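The two-stage decision of steps S51 and S61 can be sketched directly. The threshold values are assumptions (the patent only requires that ranging data be distinguishable from noise within a short sampling time, and that the acceleration data stay reliable within the exposure); the function names are illustrative.

```python
def select_correction_source(subject_luminance, exposure_time,
                             luminance_threshold, time_limit):
    """Decision of steps S51/S61: a bright subject selects the ranging
    data; otherwise a short exposure selects the acceleration data;
    otherwise no focus correction is performed during exposure.
    Thresholds are hypothetical tuning parameters."""
    if subject_luminance > luminance_threshold:   # S51: Yes
        return "ranging"
    if exposure_time < time_limit:                # S61: Yes
        return "acceleration"
    return "none"                                 # neither branch taken
```

For example, with a luminance threshold of 5 and a time limit of 1.0 s, a bright subject always uses ranging data regardless of exposure time, while a dim subject falls back to acceleration data only for exposures shorter than 1.0 s.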
The third embodiment and its modification also include a detection unit that detects the subject luminance (see, for example, the photometric sensor 9), an exposure condition calculation (exposure time calculation) unit (see, for example, the exposure control unit 19), and a changing unit that changes, according to the exposure time, whether the defocus amount is calculated on the basis of the acceleration sensor output (see, for example, the focus correction amount calculation unit 20 and S61 in FIG. 15B). With this configuration, even when an arbitrary exposure time is set, for example in a program exposure mode, the defocus amount is calculated from whichever of the acceleration sensor output and the distance-measurement data yields the better result for the subject luminance detected by the detection unit, so focus correction during exposure can be performed without depending on the exposure conditions.
With the distance-measuring sensor (the AF sensor 5 or the distance-measuring pixels of the image sensor 7), the defocus amount is difficult to detect at low exposure amounts (short exposure time, low subject luminance), so the detection time (integration time) must be extended. The acceleration sensor 8, on the other hand, can detect posture changes (the defocus amount) regardless of the exposure amount; however, as the exposure time grows longer, the accuracy of posture-change detection degrades because of the output offset drift. This embodiment exploits the strengths and weaknesses of the two defocus detection units by switching between them according to the subject luminance and the exposure time.
Note that when it is determined in step S61 (see FIG. 15B) that the exposure time is longer than the predetermined time, the defocus correction based on the acceleration data in step S63 and subsequent steps is not performed at all. However, the behavior is not limited to this. As one example, the defocus correction of step S63 and subsequent steps may be performed until a predetermined time has elapsed after the start of the exposure, and stopped once that predetermined time has elapsed.
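The variant just described, in which acceleration-based correction runs only for the first part of a long exposure rather than being disabled outright, can be sketched as a simple gating predicate. The names and the per-sample loop are illustrative, not the patent's implementation.

```python
def acceleration_correction_active(t_since_exposure_start, time_limit):
    """Variant of S61: keep acceleration-based correction enabled only
    until the predetermined time has elapsed after exposure start."""
    return t_since_exposure_start < time_limit

def corrections_applied(sample_times, time_limit):
    """Return the sampling instants at which correction would still run,
    assuming one correction step per acceleration sample."""
    return [t for t in sample_times
            if acceleration_correction_active(t, time_limit)]
```

For a 2 s exposure sampled every 0.5 s with a 1.0 s limit, only the samples at 0.0 s and 0.5 s would drive the focus lens; drift-prone later samples are ignored.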
As described above, in each embodiment and modification of the present invention, an imaging apparatus having a focus drive unit (for example, the optical system drive unit 3) that drives the focus lens includes a defocus detection unit (for example, the AF sensor 5 or the distance-measuring pixels of the image sensor 7) that detects the defocus amount at a plurality of points in time during the exposure period, and a focus control unit (for example, the focus lens drive amount calculation unit 15) that performs focus control using the defocus amounts obtained during the exposure period. Changes in the focus state can therefore be detected accurately and continuously during the exposure, and focus deviations corrected accurately and continuously. Focus deviation can be corrected during the exposure without using the acceleration sensor 8, and the in-focus state can be maintained even while shooting. In particular, in close-up shooting, in-focus images can be obtained even in situations where the aperture value is small, the depth of field is shallow, and the shot is susceptible to camera shake.
In each embodiment and modification of the present invention, when the image sensor detects the video signal of the subject image formed by the optical lens including the focus lens and the video signal is detected for recording (see, for example, Yes in S9 of FIG. 4), the subject distance between the imaging apparatus and the subject is detected (see, for example, S11 in FIG. 4); from the subject distance thus acquired and the deviation between the subject distance and the focus position of the focus lens, the movement amount of the focus lens is calculated in the direction that brings the subject distance and the focus position into agreement (for example, S13 in FIG. 4); and the focus lens is moved on the basis of the calculated movement amount (for example, S15 in FIG. 4).
In the second and third embodiments of the present invention and their modifications, a movement-amount detection unit (for example, the acceleration data detection unit 17) that detects the movement amount of the focus lens in the optical-axis direction is provided, and the output of this movement-amount detection unit is corrected (for example, the acceleration data calculation unit 185 in FIG. 8 and S37 in FIG. 10). Consequently, when the position of the focus lens is corrected on the basis of its movement amount in the optical-axis direction, the focus can be adjusted accurately even if an offset arises in the output of the movement-amount detection unit.
 なお、本発明の各実施形態においては、光学系2を透過した被写体光束を、ハーフミラー4を用いてAFセンサ5に導いていた。しかし、これに限らず、例えば、光学系2とは別の光学系を設けてAFセンサ5に被写体光束を導くようにしても構わない。すなわち、光学レンズとは異なる光学系を有する測距光学系を設け、光学レンズとは異なる経路からの被写体光に基づいて測距するようにしてもよい。 In each embodiment of the present invention, the subject light flux that has passed through the optical system 2 is guided to the AF sensor 5 using the half mirror 4. However, the present invention is not limited to this. For example, an optical system different from the optical system 2 may be provided to guide the subject light flux to the AF sensor 5. That is, a distance measuring optical system having an optical system different from the optical lens may be provided, and distance measurement may be performed based on subject light from a different path from the optical lens.
 また、本発明の第2及び第3実施形態や変形例において、加速度センサ8を設けて、フォーカスレンズの光軸方向の移動量を検出していた。しかし、これに限らず、角速度センサ、ジャイロ等、カメラ本体1の移動量を検出できるセンサであれば勿論かまわない。 In the second and third embodiments and modifications of the present invention, the acceleration sensor 8 is provided to detect the movement amount of the focus lens in the optical axis direction. However, the present invention is not limited to this, and any sensor that can detect the amount of movement of the camera body 1 such as an angular velocity sensor or a gyro may be used.
 また、本発明の各実施形態や変形例においては、コントローラとしてのシステム制御CPU10は、システム制御CPU10内の各部を周辺回路とCPU(Central Processing Unit)とプログラムコードによって実現していた。しかし、これに限らず、コントローラは、DSP(Digital Signal Processor)等のプログラムコードで実行される回路で実現するようにしてもよく、ヴェリログ(Verilog)によって記述されたプログラム言語に基づいて生成されたゲート回路等のハードウエア構成でもよい。また、コントローラは、ハードウエア回路によって実行するようにしても勿論かまわない。また、システム制御CPU10の機能の一部を、DSP等のプログラムコードで実行される回路で実現するようにしてもよく、ヴェリログによって記述されたプログラム言語に基づいて生成されたゲート回路等のハードウエア構成でもよく、またハードウエア回路によって実現するようにしてもよい。 In each embodiment and modification of the present invention, the system control CPU 10 as a controller realizes each part in the system control CPU 10 by a peripheral circuit, a CPU (Central Processing Unit), and a program code. However, the present invention is not limited to this, and the controller may be realized by a circuit executed by a program code such as DSP (Digital Signal Processor) or the like, and is generated based on a program language described by Verilog. A hardware configuration such as a gate circuit may be used. Of course, the controller may be executed by a hardware circuit. Further, a part of the function of the system control CPU 10 may be realized by a circuit executed by a program code such as a DSP, or hardware such as a gate circuit generated based on a program language described by Verilog. It may be configured, or may be realized by a hardware circuit.
 In each embodiment and modification of the present invention, a digital camera has been used as the example of photographing equipment. The digital camera may be a digital single-lens reflex camera, a mirrorless camera, or a compact digital camera; it may be a camera for moving images, such as a video camera or a movie camera; and it may equally be a camera built into a mobile phone, a smartphone, a personal digital assistant, a personal computer (PC), a tablet computer, or a game console, a medical camera, a camera for scientific instruments such as a microscope, an automobile-mounted camera, or a surveillance camera. In any case, the present invention can be applied to any device capable of photographing.
 Among the techniques described in this specification, the control described mainly with the flowcharts can often be implemented as a program, and such a program may be stored in a recording medium or a recording unit. The program may be written to the recording medium or recording unit at the time of product shipment, may be supplied on a distributed recording medium, or may be downloaded via the Internet.
 Regarding the operation flows in the claims, the specification, and the drawings, even where the description uses words expressing order, such as "first" and "next", for convenience, this does not mean that execution in that order is essential in places where no such requirement is explicitly stated.
 The present invention is not limited to the above embodiments as they stand; at the implementation stage, the constituent elements can be modified and embodied without departing from the gist of the invention. Various inventions can also be formed by appropriately combining the constituent elements disclosed in the above embodiments. For example, some of the constituent elements shown in an embodiment may be deleted, and constituent elements from different embodiments may be combined as appropriate.
DESCRIPTION OF SYMBOLS: 1 ... camera body, 2 ... optical system, 3 ... optical system drive unit, 4 ... half mirror, 5 ... AF sensor, 6 ... shutter, 7 ... image sensor, 8 ... acceleration sensor, 9 ... photometric sensor, 10 ... system control CPU, 11 ... ranging data detection unit, 12 ... focus correction amount calculation unit, 13 ... shutter control unit, 14 ... imaging pixel control unit, 15 ... focus lens drive amount calculation unit, 16 ... ranging pixel control unit, 17 ... acceleration data detection unit, 18 ... focus correction amount calculation unit, 19 ... exposure control unit, 20 ... focus correction amount calculation unit

Claims (14)

  1.  An imaging device comprising:
     an optical lens including a focus lens that varies a focus state of a subject image to be formed;
     a focus drive unit that moves the focus lens in an optical axis direction;
     an image sensor that exposes the subject image formed by the optical lens and converts it into a video signal;
     a distance measuring unit that detects a subject distance between the imaging device and a subject; and
     a focus control unit that controls an amount of movement of the focus lens by the focus drive unit based on the subject distance detected by the distance measuring unit and a focus position of the focus lens,
     wherein, while the image sensor exposes the subject image to record the video signal, the focus control unit causes the focus drive unit to move the focus lens in a direction that brings the focus position into agreement with the subject distance, based on an amount of deviation between the subject distance detected by the distance measuring unit and the focus position of the focus lens.
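The control recited in claim 1 amounts to a feedback step repeated during exposure: measure the deviation between the detected subject distance and the current focus position, then drive the focus lens to reduce it. The following is a minimal illustrative sketch, not the patent's implementation; the function names, the proportional model, and the callable interfaces are all assumptions:

```python
def focus_drive_amount(subject_distance, focus_position, gain=1.0):
    """Signed drive amount that moves the focus position toward the
    measured subject distance (positive = toward the subject)."""
    return gain * (subject_distance - focus_position)


def correct_during_exposure(read_distance, read_focus_position,
                            move_lens, exposure_active):
    """Repeatedly measure and correct while the image sensor is still
    exposing; the four callables are hypothetical stand-ins for the
    distance measuring unit, the lens position encoder, the focus
    drive unit, and the shutter state."""
    while exposure_active():
        move_lens(focus_drive_amount(read_distance(),
                                     read_focus_position()))
```

With `gain=1.0` a single step closes the deviation completely; a real focus drive unit would additionally bound the step size and update rate.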
  2.  The imaging device according to claim 1, wherein the distance measuring unit performs the operation of detecting the subject distance while the image sensor exposes the subject image, in parallel with and independently of the exposure operation.
  3.  The imaging device according to claim 1 or 2, wherein:
     the image sensor includes, on its imaging surface, ranging pixels and non-ranging pixels that output an image signal derived from the subject image;
     the imaging device includes a pixel output circuit that reads out the detection signals of the ranging pixels and the detection signals of the non-ranging pixels independently of each other; and
     the distance measuring unit can read out the pixel signals output from the ranging pixels independently while the non-ranging pixels are performing an exposure operation, and detect the subject distance to obtain a result.
  4.  The imaging device according to claim 1 or 2, further comprising a light splitting unit that causes a part of the subject image incident through the optical system to strike the imaging surface of the image sensor, splits off a different part of the subject image in a direction different from the direction of incidence on the imaging surface, and guides that part of the subject image to the distance measuring unit,
     wherein the distance measuring unit detects the subject distance with a ranging sensor separate from the image sensor.
  5.  The imaging device according to claim 1 or 2, wherein the distance measuring unit further includes a ranging optical system separate from the optical lens, and measures the subject distance based on subject light arriving via a path different from that of the optical lens.
  6.  The imaging device according to any one of claims 1 to 5, further comprising a movement amount detection unit that detects an amount of movement of the focus lens in the optical axis direction,
     wherein the focus control unit outputs a movement amount to the focus drive unit based on the subject distance detected by the distance measuring unit, the amount of movement of the focus lens in the optical axis direction detected by the movement amount detection unit, and the focus position of the focus lens.
  7.  The imaging device according to claim 6, wherein the movement amount detection unit further includes:
     an acceleration sensor that detects a movement acceleration in the optical axis direction; and
     an acceleration correction unit that calculates a correction value for the acceleration output of the acceleration sensor based on the subject distance detected by the distance measuring unit, and applies the correction by adding the correction value to, or subtracting it from, the acceleration output.
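Claim 7 states only that a correction value is computed from the subject distance and added to or subtracted from the acceleration output. The structure can be sketched as below; the calibration curve `correction_fn` and the double-integration step are illustrative assumptions, since the patent does not specify the form of the correction:

```python
def corrected_acceleration(accel_output, subject_distance, correction_fn):
    """Subtract a subject-distance-dependent correction from the raw
    acceleration output; correction_fn is a hypothetical calibration
    curve mapping distance to a correction value."""
    return accel_output - correction_fn(subject_distance)


def displacement_along_axis(accel_samples, dt):
    """Double-integrate (corrected) acceleration samples to estimate the
    camera-body displacement along the optical axis."""
    velocity = 0.0
    displacement = 0.0
    for a in accel_samples:
        velocity += a * dt
        displacement += velocity * dt
    return displacement
```

The displacement estimate is what the movement amount detection unit of claim 6 would feed back to the focus control unit.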
  8.  The imaging device according to claim 6 or 7, further comprising a photometry unit that detects a subject brightness,
     wherein the focus control unit further includes a selection unit that selects, according to the subject brightness, whether the subject distance is detected from the value of the distance measuring unit, from the value of the movement amount detection unit, or from both, and
     the focus control unit controls the focus drive unit to move the focus lens in a direction that brings the focus position into agreement with the subject distance, based on the subject distance detected after the selection by the selection unit.
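The selection in claim 8 can be sketched as a brightness-gated policy. The thresholds and the specific three-way rule below are illustrative assumptions; the patent only states that one source, the other, or both may be selected according to subject brightness:

```python
def select_distance_sources(subject_brightness_ev,
                            low_light_threshold=5.0,
                            high_light_threshold=8.0):
    """Choose which detector(s) supply the subject distance for focus
    control, based on scene brightness (in EV). Hypothetical policy:
    rely on body-movement detection when ranging would be noisy in the
    dark, on direct ranging when the scene is bright, and on both in
    between."""
    if subject_brightness_ev < low_light_threshold:
        return ("movement_detector",)
    if subject_brightness_ev > high_light_threshold:
        return ("ranging_unit",)
    return ("ranging_unit", "movement_detector")
```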
  9.  A focus adjusting method for an imaging device, comprising:
     detecting, with an image sensor, a video signal of a subject image formed by an optical lens that includes a focus lens;
     detecting, while the video signal is being detected for recording, a subject distance between the imaging device and a subject;
     calculating, based on the detected subject distance and an amount of deviation from a focus position of the focus lens, a movement amount for the focus lens in a direction that brings the focus position into agreement with the subject distance; and
     moving the focus lens based on the calculated movement amount.
  10.  The focus adjusting method according to claim 9, wherein the operation of detecting the subject distance is performed while the image sensor exposes the subject image, in parallel with and independently of the exposure operation.
  11.  The focus adjusting method according to claim 9 or 10, wherein:
     the image sensor includes, on its imaging surface, ranging pixels and non-ranging pixels that output an image signal derived from the subject image;
     the detection signals of the ranging pixels and the detection signals of the non-ranging pixels are read out independently of each other; and
     the pixel signals output from the ranging pixels are read out independently while the non-ranging pixels are performing an exposure operation, and the subject distance is detected to obtain a result.
  12.  The focus adjusting method according to claim 9, further comprising detecting an amount of movement of the focus lens in the optical axis direction,
     wherein a movement amount is output to the focus drive unit based on the detected subject distance, the detected amount of movement of the focus lens in the optical axis direction, and the focus position of the focus lens.
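Claim 12 combines three inputs into one drive amount: the measured subject distance, the detected lens movement along the optical axis, and the current focus position. One way to combine them is sketched below; treating the detected body/lens displacement as a direct change in the effective subject distance is an illustrative simplification, not the patent's stated formula:

```python
def drive_amount_with_body_motion(subject_distance, focus_position,
                                  body_displacement):
    """Drive amount for the focus drive unit, combining the measured
    subject distance, the current focus position, and the detected
    displacement along the optical axis (hypothetical linear model:
    displacement toward the subject shortens the effective distance)."""
    effective_distance = subject_distance - body_displacement
    return effective_distance - focus_position
```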
  13.  The focus adjusting method according to claim 12, wherein an acceleration sensor detects a movement acceleration in the optical axis direction, a correction value for the acceleration output of the acceleration sensor is calculated based on the detected subject distance, and the correction value is added to or subtracted from the acceleration output.
  14.  The focus adjusting method according to claim 13, wherein a photometry unit detects a subject brightness,
     whether the subject distance is detected based on the video signal, on the detection value of the acceleration sensor, or on both is selected according to the subject brightness, and
     the focus lens is controlled to move in a direction that brings the focus position into agreement with the subject distance, based on the subject distance detected after the selection.
PCT/JP2017/046203 2017-01-07 2017-12-22 Imaging device and focus adjusting method WO2018128098A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-001481 2017-01-07
JP2017001481A JP6739357B2 (en) 2017-01-07 2017-01-07 Imaging device and focus adjustment method

Publications (1)

Publication Number Publication Date
WO2018128098A1 true WO2018128098A1 (en) 2018-07-12

Family

ID=62790870

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/046203 WO2018128098A1 (en) 2017-01-07 2017-12-22 Imaging device and focus adjusting method

Country Status (2)

Country Link
JP (1) JP6739357B2 (en)
WO (1) WO2018128098A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115242939A * 2021-03-24 2022-10-25 Victor Hasselblad Co., Ltd. Distance detection device and imaging device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04147108A (en) * 1990-10-09 1992-05-20 Olympus Optical Co Ltd Image pickup device
JP2009003208A (en) * 2007-06-22 2009-01-08 Casio Comput Co Ltd Camera device, focus control method and focus control program
JP2010145493A (en) * 2008-12-16 2010-07-01 Canon Inc Camera system
JP2015154409A * 2014-02-18 2015-08-24 Canon Inc Imaging apparatus, control method of imaging apparatus, program, and storage medium
JP2016014805A * 2014-07-03 2016-01-28 Canon Inc Imaging device



Also Published As

Publication number Publication date
JP6739357B2 (en) 2020-08-12
JP2018112592A (en) 2018-07-19

Similar Documents

Publication Publication Date Title
US10175451B2 (en) Imaging apparatus and focus adjustment method
JP6749791B2 (en) Imaging device and automatic focusing method
US10244157B2 (en) Interchangeable lens apparatus and image capturing apparatus capable of acquiring in-focus state at different image heights, and storage medium storing focusing program
US9781330B2 (en) Focus detection apparatus and control method for focus detection apparatus
WO2016035643A1 (en) Imaging device, imaging device body, and lens barrel
JP5393300B2 (en) Imaging device
JP2001042207A (en) Electronic camera
JP2011013645A5 (en)
JP2008152150A (en) Imaging apparatus, its control method and program, and storage medium
US8013896B2 (en) Imaging apparatus including shaking correction for a second imaging sensor
JP2006254413A (en) Imaging apparatus and camera body
US10498967B2 (en) Image pickup apparatus that performs photometric control by using image sensor, control method therefor, and storage medium
WO2018128098A1 (en) Imaging device and focus adjusting method
US9692979B2 (en) Image pickup apparatus
US11330179B2 (en) Imaging device and control method thereof
JP5170266B2 (en) Imaging device and camera body
JP4847352B2 (en) Imaging apparatus and control method thereof
JP2006251033A (en) Single-lens reflex electronic camera
JP2017021177A (en) Range-finding point upon lens vignetting, range-finding area transition method
JP2016006940A (en) Camera with contrast af function
JP2005140851A (en) Autofocus camera
JP2014035505A (en) Lens device, image pickup device, and control method for these devices
JP5454546B2 (en) Autofocus device
JP2011112731A (en) Image pickup device
JP2007033997A (en) Focal point detecting apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17889989

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17889989

Country of ref document: EP

Kind code of ref document: A1