WO2018180909A1 - Interchangeable lens and camera body - Google Patents

Interchangeable lens and camera body

Info

Publication number
WO2018180909A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
blur
image blur
correction
interchangeable lens
Prior art date
Application number
PCT/JP2018/011477
Other languages
French (fr)
Japanese (ja)
Inventor
英志 三家本
豪 松本
大樹 中島
Original Assignee
Nikon Corporation (株式会社ニコン)
Priority date
Filing date
Publication date
Application filed by Nikon Corporation (株式会社ニコン)
Publication of WO2018180909A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • G03B17/12Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets
    • G03B17/14Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets interchangeably
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B5/00Adjustment of optical system relative to image or object surface other than for focusing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present invention relates to an interchangeable lens and a camera body.
  • A technique for suppressing image blur due to camera shake is known (see Patent Document 1). However, only the image blur at the center of the screen is corrected.
  • An interchangeable lens that can be attached to and detached from a camera body including an image sensor that captures a subject image includes: an imaging optical system that forms the subject image on an image plane; an input unit to which a blur amount detected in at least one of the interchangeable lens and the camera body is input; a receiving unit that receives information used to calculate an off-axis correction amount for correcting off-axis blur in the image plane; and a drive unit that drives at least a movable part of the imaging optical system in a plane orthogonal to the optical axis, based on at least the information and the blur amount.
  • the camera body to which the imaging optical system can be attached and detached includes an imaging element that captures a subject image formed on the image plane by the imaging optical system, and an optical axis outside the image plane.
  • the camera body to which the imaging optical system can be attached and detached includes an imaging element that captures a subject image formed on an image plane by the imaging optical system, and the imaging optical system or the imaging element.
  • Hereinafter, an imaging apparatus equipped with an image blur correction apparatus will be described with reference to the drawings.
  • The imaging apparatus in this example is an interchangeable-lens digital camera (hereinafter referred to as the camera 1).
  • The camera 1 may be a single lens reflex type having a mirror 24 in the camera body 2, or may be a mirrorless type that is not provided with the mirror 24.
  • the camera 1 may be configured as a lens integrated type in which the interchangeable lens 3 and the camera body 2 are integrated.
  • the imaging apparatus is not limited to the camera 1 and may be a lens barrel provided with an imaging sensor, a smartphone provided with an imaging function, or the like.
  • FIG. 1 is a diagram illustrating a main configuration of the camera 1.
  • the camera 1 includes a camera body 2 and an interchangeable lens 3.
  • the interchangeable lens 3 is attached to the camera body 2 via a mount unit (not shown).
  • the camera body 2 and the interchangeable lens 3 are electrically connected, and communication between the camera body 2 and the interchangeable lens 3 becomes possible. Communication between the camera body 2 and the interchangeable lens 3 may be performed by wireless communication.
  • the light from the subject enters in the negative direction of the Z axis.
  • the front direction perpendicular to the Z axis is defined as the X axis plus direction
  • the upward direction perpendicular to the Z axis and the X axis is defined as the Y axis plus direction.
  • In the other figures, the coordinate axes are displayed so that the orientation of each figure can be understood with reference to the coordinate axes of FIG. 1.
  • The interchangeable lens 3 has an imaging optical system and forms a subject image on the imaging surface of the imaging element 22 provided in the camera body 2.
  • the imaging optical system includes a zoom optical system 31, a focus (focus adjustment) optical system 32, a shake correction optical system 33, and a diaphragm 34.
  • the interchangeable lens 3 further includes a zoom drive mechanism 35, a focus drive mechanism 36, a shake correction drive mechanism 37, a diaphragm drive mechanism 38, and a shake sensor (motion detection unit, shake detection unit) 39.
  • the zoom drive mechanism 35 adjusts the magnification of the imaging optical system by moving the zoom optical system 31 forward and backward in the direction of the optical axis L1 based on a signal output from the CPU 21 of the camera body 2.
  • the signal output from the CPU 21 includes information indicating the moving direction, moving amount, moving speed, and the like of the zoom optical system 31.
  • the focus drive mechanism 36 adjusts the focus of the imaging optical system by moving the focus optical system 32 forward and backward in the direction of the optical axis L1 based on a signal output from the CPU 21 of the camera body 2.
  • the signal output from the CPU 21 at the time of focus adjustment includes information indicating the moving direction, moving amount, moving speed, and the like of the focus optical system 32.
  • the diaphragm driving mechanism 38 controls the aperture diameter of the diaphragm 34 based on a signal output from the CPU 21 of the camera body 2.
  • Based on a signal output from the CPU 21 of the camera body 2, the blur correction drive mechanism 37 moves the blur correction optical system 33 back and forth, within a plane that intersects the optical axis L1, in a direction that cancels blurring of the subject image on the imaging surface of the imaging element 22 (referred to as image blur), thereby suppressing the image blur.
  • the signal output from the CPU 21 includes information indicating the moving direction, moving amount, moving speed, and the like of the blur correction optical system 33.
  • the shake sensor 39 detects the shake of the camera 1 when the camera 1 swings due to hand shake or the like.
  • the shake sensor 39 includes an angular velocity sensor 39a and an acceleration sensor 39b. It is assumed that image blur is caused by camera shake.
  • the angular velocity sensor 39a detects an angular velocity generated by the rotational movement of the camera 1.
  • the angular velocity sensor 39a detects, for example, rotation around each axis of an axis parallel to the X axis, an axis parallel to the Y axis, and an axis parallel to the Z axis, and sends a detection signal to the CPU 21 of the camera body 2.
  • the angular velocity sensor 39a is also referred to as a gyro sensor.
  • the acceleration sensor 39b detects acceleration generated by the translational motion of the camera 1.
  • the acceleration sensor 39b detects, for example, accelerations in an axis direction parallel to the X axis, an axis parallel to the Y axis, and an axis direction parallel to the Z axis, and sends a detection signal to the CPU 21 of the camera body 2.
  • the acceleration sensor 39b is also referred to as a G sensor.
  • In this example, the case where the shake sensor 39 is provided in the interchangeable lens 3 is illustrated, but the shake sensor 39 may instead be provided in the camera body 2. Further, the shake sensor 39 may be provided in both the camera body 2 and the interchangeable lens 3.
  • The camera body 2 includes a CPU 21, an image sensor 22, a shutter 23, a mirror 24, an AF sensor 25, a shake correction drive mechanism 26, a signal processing circuit 27, a memory 28, an operation member 29, and a liquid crystal display unit 30.
  • the CPU 21 includes a CPU, a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and controls each unit of the camera 1 based on a control program.
  • the CPU 21 includes a shake correction unit (correction amount calculation unit) 21a.
  • the blur correction unit 21a calculates the image motion accompanying the rotational movement of the camera 1 and the translational movement of the camera 1.
  • Based on the calculation result by the blur correction unit 21a, the CPU 21 moves the blur correction optical system 33 by means of the blur correction drive mechanism (blur correction drive unit) 37, or moves the imaging element 22 by means of the blur correction drive mechanism (blur correction drive unit) 26.
  • That is, image blurring is suppressed by moving the blur correction optical system 33, which constitutes part of the imaging optical system of the interchangeable lens 3, or by moving the imaging element 22.
  • Suppressing image blur in this way is referred to as image blur correction. Details of the image blur correction will be described later.
  • the imaging element 22 receives the light beam that has passed through the imaging optical system at the imaging surface, and photoelectrically converts (captures) the subject image. Electric charges are generated according to the amount of received light in each of the plurality of pixels arranged on the imaging surface of the imaging element 22 by photoelectric conversion. A signal based on the generated charge is read from the image sensor 22 and sent to the signal processing circuit 27.
  • the shutter 23 controls the exposure time of the image sensor 22.
  • the exposure time of the image sensor 22 can be controlled by a method for controlling the charge accumulation time in the image sensor 22 (so-called electronic shutter control).
  • the shutter 23 is opened and closed by a shutter drive unit (not shown).
  • The semi-transmissive quick return mirror (hereinafter referred to as the mirror) 24 is driven by a mirror driving unit (not shown) and moves between a down position on the optical path (illustrated in FIG. 1) and an up position retracted outside the optical path. For example, before the release, the subject light is reflected by the mirror 24 in the down position toward a finder unit (not shown) provided above (Y-axis plus direction). Part of the subject light transmitted through the mirror 24 is bent downward (Y-axis minus direction) by a sub mirror 24a and guided to the AF sensor 25. When the release switch is pressed, the mirror 24 is rotated to the up position, so that the subject light is guided to the image sensor 22 via the shutter 23.
  • the AF sensor 25 detects the focus adjustment state of the interchangeable lens 3 by the imaging optical system.
  • the CPU 21 performs a known phase difference type focus detection calculation using a detection signal from the AF sensor 25.
  • the CPU 21 obtains the defocus amount by the imaging optical system by this calculation, and calculates the movement amount of the focus optical system 32 based on the defocus amount.
  • the CPU 21 transmits the calculated movement amount of the focus optical system 32 to the focus drive mechanism 36 together with the movement direction and movement speed.
  • Based on a signal output from the CPU 21, the blur correction drive mechanism 26 moves the image sensor 22 back and forth, within a plane that intersects the optical axis L1, in a direction that cancels image blur, thereby reducing the blurring.
  • the signal output from the CPU 21 includes information indicating the moving direction, moving amount, moving speed, and the like of the image sensor 22.
  • the signal processing circuit 27 generates image data related to the subject image based on the image signal read from the image sensor 22.
  • the signal processing circuit 27 performs predetermined image processing on the generated image data.
  • the image processing includes known image processing such as gradation conversion processing, color interpolation processing, contour enhancement processing, and white balance processing.
  • The memory 28 includes, for example, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a flash memory, or the like.
  • In the memory 28, adjustment value information such as a detection gain set in the shake sensor 39 is recorded. Data recording to the memory 28 and data reading from the memory 28 are performed by the CPU 21.
  • the operation member 29 includes a release button, a recording button, a live view button, various setting switches, and the like, and outputs an operation signal corresponding to each operation to the CPU 21.
  • the liquid crystal display unit 30 displays an image based on image data, information relating to shooting such as a shutter speed and an aperture value, a menu operation screen, and the like.
  • the recording medium 50 is composed of, for example, a memory card that can be attached to and detached from the camera body 2. Image data, audio data, and the like are recorded on the recording medium 50. Recording of data on the recording medium 50 and reading of data from the recording medium 50 are performed by the CPU 21.
  • The camera 1 is configured to be able to perform image blur correction by operating the blur correction drive mechanism 37 of the interchangeable lens 3 and image blur correction by operating the blur correction drive mechanism 26 of the camera body 2.
  • The CPU 21 operates one of the blur correction drive mechanisms. For example, when an interchangeable lens 3 having the blur correction drive mechanism 37 is attached to the camera body 2, the CPU 21 operates the blur correction drive mechanism 37 of the interchangeable lens 3 to perform image blur correction; when an interchangeable lens 3 that does not include the blur correction drive mechanism 37 is attached to the camera body 2, the CPU 21 operates the blur correction drive mechanism 26 of the camera body 2 to perform image blur correction. Note that, as in a third embodiment to be described later, the blur correction drive mechanisms of the interchangeable lens 3 and the camera body 2 may be operated simultaneously.
  • image blur generated by the camera 1 is classified into image blur (also referred to as angle blur) accompanying the rotational movement of the camera 1 and image blur accompanying translational movement of the camera 1 (also referred to as translation blur).
  • the blur correction unit 21 a calculates an image blur due to the rotational movement of the camera 1 and an image blur due to the translational movement of the camera 1.
  • FIG. 2 is a diagram illustrating the blur correction unit 21a.
  • The shake correction unit 21a includes an angle shake calculation unit 201, a translational shake calculation unit 202, and a shake correction optical system target position calculation unit (selection unit) 203.
  • the angle blur calculation unit 201 calculates the image blur in the Y-axis direction due to the rotational motion using the detection signal around the axis parallel to the X-axis (Pitch direction) by the angular velocity sensor 39a. Further, the angle blur calculation unit 201 calculates an image blur in the X-axis direction due to the rotational motion using a detection signal around the axis parallel to the Y-axis (Yaw direction) by the angular velocity sensor 39a.
  • the translation blur calculation unit 202 calculates the image blur in the X-axis direction due to the translational motion using the detection signal in the X-axis direction by the acceleration sensor 39b.
  • the translation blur calculation unit 202 calculates the image blur in the Y-axis direction due to the translational motion using the detection signal in the Y-axis direction from the acceleration sensor 39b.
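  • As an informal illustration of the kind of processing such calculation units could perform, the following Python sketch integrates a stream of angular-velocity samples into a rotation angle and double-integrates acceleration samples into a translation, which is then scaled to the image plane by the photographing magnification. The function names, the sampling interval dt, and the unit conventions are assumptions made for this sketch and are not part of the patent.

        # Minimal sketch (not from the patent): discrete integration of the shake sensor outputs.

        def integrate_angular_velocity(omega_samples, dt):
            """Integrate gyro output (rad/s) into a rotation angle (rad)."""
            angle = 0.0
            for omega in omega_samples:
                angle += omega * dt
            return angle

        def integrate_acceleration(accel_samples, dt):
            """Double-integrate accelerometer output (m/s^2) into a translation (m)."""
            velocity = 0.0
            displacement = 0.0
            for a in accel_samples:
                velocity += a * dt
                displacement += velocity * dt
            return displacement

        def translational_image_blur(displacement, magnification):
            """Translation blur on the image plane scales with the photographing magnification."""
            return displacement * magnification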
  • The blur correction optical system target position calculation unit 203 adds, for each axis, the image blur in the X-axis direction and the Y-axis direction calculated by the angle blur calculation unit 201 and the image blur in the X-axis direction and the Y-axis direction calculated by the translation blur calculation unit 202, and thereby calculates the total image blur in the X-axis direction and the Y-axis direction. For example, when, in a certain axial direction, the direction of the image blur calculated by the angle blur calculation unit 201 and the direction of the image blur calculated by the translation blur calculation unit 202 are the same, the image blur increases by the addition; when the directions of the two calculated image blurs are different, the image blur decreases by the addition. In this way, the addition is performed by attaching a positive or negative sign according to the image blur direction on each axis.
  • The blur correction optical system target position calculation unit 203 then calculates the image blur amount at a predetermined position on the image plane (the imaging plane of the image sensor 22) based on the added image blur in the X-axis and Y-axis directions, the photographing magnification (calculated from the position of the zoom optical system 31), and the distance from the camera 1 to the subject 80 (calculated from the position of the focus optical system 32).
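  • A hedged sketch of how the sign-aware addition and the cancellation could look in code is given below; the numbers and the tuple-based interface are illustrative assumptions, not the patent's implementation.

        def combine_axis_blur(angle_blur, translation_blur):
            """Signed addition on one axis: same-direction components add up, opposite directions cancel."""
            return angle_blur + translation_blur

        def image_blur_amount(angle_blur_xy, translation_blur_xy):
            """Combine the X and Y components from the angle and translation calculation units."""
            return tuple(combine_axis_blur(a, t) for a, t in zip(angle_blur_xy, translation_blur_xy))

        def correction_target(blur_xy):
            """Target displacement that cancels the calculated blur (opposite sign on each axis)."""
            return tuple(-b for b in blur_xy)

        # Illustrative values in millimetres on the image plane:
        blur = image_blur_amount(angle_blur_xy=(0.01, -0.03), translation_blur_xy=(-0.005, -0.01))
        print(blur, correction_target(blur))   # (0.005, -0.04) (-0.005, 0.04)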
  • When the blur correction drive mechanism 37 of the interchangeable lens 3 is operated to perform the image blur correction, the blur correction optical system target position calculation unit 203 calculates the target position of the blur correction optical system 33 for moving the blur correction optical system 33 in a direction that cancels the calculated image blur amount.
  • When the blur correction drive mechanism 26 of the camera body 2 is operated to perform the image blur correction, the blur correction optical system target position calculation unit 203 calculates the target position of the imaging element 22 for moving the imaging element 22 in a direction that cancels the calculated image blur amount.
  • the shake correction optical system target position calculation unit 203 sends a signal indicating the target position to the shake correction drive mechanism 37 of the interchangeable lens 3 or the shake correction drive mechanism 26 of the camera body 2.
  • The shake correction optical system target position calculation unit 203 can also send signals indicating the target positions to the shake correction drive mechanisms of both the interchangeable lens 3 and the camera body 2.
  • When the allowable range of image blur correction is exceeded, the shake correction drive mechanism 37 of the interchangeable lens 3 may notify the CPU 21 of the camera body 2 to that effect.
  • Thereby, the CPU 21 can take measures such as issuing an alarm notifying that the allowable range of image blur correction has been exceeded.
  • FIG. 3 is a schematic diagram for explaining the angular velocity detection direction by the angular velocity sensor 39a and the image blur on the image plane 70 (the imaging plane of the imaging device 22).
  • In FIG. 3, the point where the image plane 70 and the optical axis L1 of the interchangeable lens 3 intersect is the origin of coordinates, the optical axis L1 of the interchangeable lens 3 is the Z axis, and the image plane 70 is represented as the XY plane.
  • the optical axis L1 intersects the center of the imaging surface.
  • the interchangeable lens 3 and the subject 80 are positioned in the Z axis plus direction with respect to the image plane 70.
  • the angular velocity sensor 39a detects, for example, the rotation angle ⁇ around the axis (small-x axis) parallel to the X axis (Pitch direction).
  • the symbol f in FIGS. 3 and 4 represents the focal length.
  • FIG. 4 is a diagram for explaining the image blur Δy2 in FIG. 3, and represents the YZ plane of FIG. 3.
  • The image blur Δy2 of the image of the subject 80 located at the coordinates (0, yp) on the image plane 70 before the shake is expressed by the following formula (1).
  • Δy2 = f · tan(θ + tan⁻¹(yp / f)) − yp   … (1)
  • The rotation angle in the pitch direction (which represents a camera shake angle and is generally about 0.5 degrees) is θ.
  • the symbol f in FIGS. 3 and 4 represents the focal length of the interchangeable lens 3.
  • Next, the image blur Δy1 of the image of the subject 80 positioned at the coordinates (0, 0), the center of the image plane 70, before the camera 1 shakes will be described. It is assumed that the rotation angle of the interchangeable lens 3 in the pitch direction is the same as above.
  • the image of the subject 80 located at the coordinate (0, 0) on the image plane 70 before the shake moves in the Y-axis minus direction after the shake.
  • The position of the image of the moved subject 80 is the coordinates (0, −Δy1).
  • the image blur ⁇ y1 is expressed by the following formula (2).
  • Δy1 = f · tan θ   … (2)
  • When the focal length f is sufficiently larger than yp, since the rotation angle θ is generally about 0.5 degrees, it can be considered that Δy1 ≈ Δy2.
  • In other words, whether the image of the subject 80 is at the center of the image plane 70 (the origin in this example) or at a position away from the center, that is, even if the distance from the optical axis L1 differs, the image blur can be regarded as almost the same. This means that the position on the image plane 70 at which the image blur is calculated may be set anywhere.
  • Accordingly, image blur can be suppressed both for the image of the subject 80 positioned at the center of the image plane 70 and for the image of a subject 80 at a position away from the center of the image plane 70.
  • However, when the focal length f is not sufficiently larger than yp, as in the case where the interchangeable lens 3 is a wide-angle lens, Δy1 ≠ Δy2. In that case, it matters at which position on the image plane 70 the image blur is calculated. For example, if image blur correction is performed based on the image blur calculated at the center of the image plane 70, the image blur of the subject 80 located at the center of the image plane 70 can be suppressed, but for the image of a subject 80 at a position far from the center of the image plane 70, the image blur corresponding to the difference between Δy2 and Δy1 remains without being suppressed. The difference between Δy2 and Δy1 increases as the position at which the image blur is calculated moves toward the periphery of the image plane 70, that is, as the image height increases.
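  • A small numerical check of formulas (1) and (2), with illustrative values only, makes this concrete: for a long focal length the two blurs nearly coincide, while for a wide-angle focal length they differ noticeably.

        import math

        def dy2(theta, f, yp):
            """Formula (1): blur of an image point at image height yp for a pitch rotation theta."""
            return f * math.tan(theta + math.atan(yp / f)) - yp

        def dy1(theta, f):
            """Formula (2): blur of the image point at the center of the image plane."""
            return f * math.tan(theta)

        theta = math.radians(0.5)                       # typical hand-shake angle from the description
        for f, yp in [(300.0, 10.0), (20.0, 10.0)]:     # focal length and image height in mm (illustrative)
            print(f, round(dy1(theta, f), 4), round(dy2(theta, f, yp), 4))
        # f = 300 mm: 2.618 vs 2.622 (nearly equal); f = 20 mm: 0.1745 vs 0.2182 (clearly different).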
  • the CPU 21 in the first embodiment determines a position where the image of the main subject is highly likely to exist on the image plane 70, as will be described later.
  • the angle blur calculation unit 201 calculates an image blur at a position determined by the CPU 21 and performs an image blur correction based on the image blur.
  • the CPU 21 selects one of the following methods (1) to (3) in order to determine a position where image blur is calculated.
  • the CPU 21 resets (updates) the position for calculating the image blur, for example, when the amount of motion of the camera 1 associated with the composition change is detected.
  • the shake sensor 39 also functions as a motion amount detection unit.
  • FIG. 5 is a diagram illustrating a focus area formed on the imaging screen 90.
  • the focus area is an area where the AF sensor 25 detects the focus adjustment state, and is also referred to as a focus detection area, a distance measuring point, and an autofocus (AF) point.
  • eleven focus areas 25P-1 to 25P-11 are provided in the imaging screen 90 in advance.
  • the CPU 21 can obtain the defocus amount in 11 focus areas.
  • the number of focus areas 25P-1 to 25P-11 is an example, and the number may be larger or smaller than 11.
  • In the first method, the CPU 21 determines the position for calculating the image blur on the image plane 70 to be the position corresponding to the selected focus area. Then, the angle blur calculation unit 201 calculates the image blur at the position determined by the CPU 21, and image blur correction is performed based on that image blur.
  • The reason why the image blur calculation position on the image plane 70 is set to the position corresponding to the selected focus area is that there is a high possibility that the main subject is present at the position for which the defocus amount for focus adjustment is obtained.
  • the CPU 21 may select the focus area based on the operation signal from the operation member 29, or the CPU 21 may select the focus area corresponding to the subject 80 close to the camera 1.
  • the CPU 21 can select a focus area corresponding to the subject 80 close to the camera 1 based on the position of the focus optical system 32. Further, the CPU 21 may select a focus area corresponding to the subject 80 having a high contrast among the images of the subject 80, or may select a focus area corresponding to the subject 80 having a high luminance value among the images of the subject 80.
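  • One way to picture these selection criteria is the sketch below; the candidate data structure and the field names are assumptions for illustration and do not come from the patent.

        def select_focus_area(areas, mode="nearest"):
            """Pick the focus area whose position is used to calculate the image blur.
            Each area is a dict with hypothetical fields 'position', 'subject_distance',
            'contrast' and 'luminance'."""
            if mode == "nearest":        # subject closest to the camera
                return min(areas, key=lambda a: a["subject_distance"])
            if mode == "contrast":       # subject with the highest contrast
                return max(areas, key=lambda a: a["contrast"])
            if mode == "luminance":      # subject with the highest luminance value
                return max(areas, key=lambda a: a["luminance"])
            raise ValueError(mode)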
  • the second method is to calculate image blur at the position of the object (subject 80).
  • the CPU 21 recognizes an object appearing as the subject 80 in the live view image by a known object recognition process, and sets the position of the object (subject 80) in the live view image as the position of the main subject. Then, the position at which image blur is calculated on the image plane 70 is determined as a position corresponding to the main subject.
  • the angle blur calculation unit 201 calculates an image blur at a position determined by the CPU 21 and performs an image blur correction based on the image blur.
  • the live view image is a monitor image acquired at a predetermined interval (for example, 60 fps) by the image sensor 22 before the main imaging is performed.
  • the CPU 21 maintains the state where the mirror 24 is rotated to the up position, and starts the acquisition of the live view image by the image sensor 22.
  • the CPU 21 can also display the live view image on the liquid crystal display unit 30.
  • the CPU 21 can also track the moving object (subject 80) by sequentially updating the position of the main subject based on each frame of the live view image.
  • The angle blur calculation unit 201 sequentially calculates the image blur at the position sequentially updated by the CPU 21, so that image blur correction is performed on the moving object (subject 80) while the live view image is acquired. Further, even when the camera 1 is panned, the CPU 21 can track the moving object (subject 80) by sequentially updating the position of the main subject in each frame of the live view image.
  • The CPU 21 may select the second method and start the object recognition processing.
  • the object recognition target may be switched according to an imaging scene mode such as “landscape”, “cooking”, “flower”, “animal” set in the camera 1.
  • the third method is to calculate image blur at the position of the face (subject 80).
  • the CPU 21 recognizes the face shown as the subject 80 in the live view image by a known face recognition process, and sets the position of the face in the live view image as the position of the main subject. Then, the position at which image blur is calculated on the image plane 70 is determined as a position corresponding to the main subject.
  • the angle blur calculation unit 201 calculates an image blur at a position determined by the CPU 21 and performs an image blur correction based on the image blur.
  • For example, when the live view button constituting the operation member 29 is operated, the CPU 21 keeps the mirror 24 rotated to the up position and starts acquisition of the live view image by the imaging element 22.
  • the CPU 21 can also track the moving face (subject 80) by sequentially updating the position of the main subject based on each frame of the live view image, as in (2) above.
  • the angle blur calculation unit 201 performs image blur correction on the moving face (subject 80) when acquiring the live view image by sequentially calculating the image blur at the position sequentially updated by the CPU 21.
  • the CPU 21 may select the third method and start the face recognition process when the imaging scene mode of the camera 1 is set to “portrait”, for example.
  • <When there are a plurality of positions at which image blur is calculated> In the above description, the case where only one position at which the image blur is calculated on the image plane 70 is determined has been described.
  • However, a plurality of positions may be candidates for the position at which the image blur is calculated. Specifically, this is the case when a plurality of focus areas are selected in (1) above, when a plurality of objects (subjects 80) are recognized in (2) above, or when a plurality of faces are recognized in (3) above. In such a case, the CPU 21 selects the following method (4) or (5).
  • The fourth method is to determine one representative position from a plurality of candidates. FIG. 6 is a diagram illustrating an example in which one representative position is determined from a plurality of candidates. For example, on the image plane 70, three points are candidates: a position P-1 corresponding to the focus area 25P-1 in FIG. 5, a position P-2 corresponding to the focus area 25P-2, and a position P-4 corresponding to the focus area 25P-4.
  • The CPU 21 averages the absolute values of the distances between the plurality of candidate positions and the X axis (FIG. 3) and the absolute values of the distances between the plurality of candidate positions and the Y axis (FIG. 3) to obtain an average position P, and sets the position P as the representative position. The position at which the image blur is calculated on the image plane 70 is then determined to be the representative position P. In this way, the representative position P is obtained by averaging the absolute values of the distances along the axes (X axis, Y axis) of the image plane 70.
  • the angle blur calculation unit 201 calculates image blur at the representative position P, and performs image blur correction based on the image blur.
  • the case where a plurality of focus areas are selected is exemplified, but the same applies to the case where a plurality of objects (subject 80) are recognized or a plurality of faces are recognized.
  • the CPU 21 determines the representative position P as described above based on the positions of the recognized objects and the positions of the recognized faces.
  • the angle blur calculation unit 201 calculates image blur for the representative position P determined by the CPU 21, and performs image blur correction based on the image blur.
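  • The averaging of absolute distances described above can be sketched as follows; the coordinate values are made up for illustration, and the list-of-tuples input format is an assumption.

        def representative_position(candidates):
            """Average the absolute distances from the Y axis (|x|) and from the X axis (|y|)."""
            n = len(candidates)
            px = sum(abs(x) for x, _ in candidates) / n
            py = sum(abs(y) for _, y in candidates) / n
            return (px, py)

        # e.g. three candidate positions corresponding to focus areas 25P-1, 25P-2 and 25P-4:
        print(representative_position([(-6.0, 3.0), (0.0, 3.0), (6.0, -3.0)]))   # -> (4.0, 3.0)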
  • The fifth method is to calculate one image blur based on a plurality of image blurs. Referring to the example of FIG. 6, on the image plane 70, three points are candidates: the position P-1 corresponding to the focus area 25P-1 in FIG. 5, the position P-2 corresponding to the focus area 25P-2, and the position P-4 corresponding to the focus area 25P-4.
  • the CPU 21 determines a plurality of positions as positions where image blur is calculated on the image plane 70.
  • the angle blur calculation unit 201 calculates image blur at the position P-1, the position P-2, and the position P-4 on the image plane 70, respectively.
  • the angle blur calculation unit 201 further obtains an average of the plurality of calculated image blurs, and performs image blur correction based on the average value of the image blurs.
  • the average value of image blur is obtained by, for example, a simple average, but may be obtained by a weighted average.
  • the case where a plurality of focus areas are selected is illustrated, but the same applies to the case where a plurality of objects (subject 80) are recognized or a plurality of faces are recognized.
  • the CPU 21 determines the positions of the plurality of recognized objects and the positions of the plurality of recognized faces as positions at which image blur is calculated on the image plane 70, respectively.
  • the angle blur calculation unit 201 calculates an image blur for each position on the image plane 70.
  • the angle blur calculation unit 201 further obtains an average of the plurality of calculated image blurs, and performs image blur correction based on the average value of the image blurs.
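  • The averaging of the calculated image blurs could look like the following sketch; the blur values and weights are illustrative, and the weighted variant is only one possible reading of the weighted average mentioned above.

        def average_blur(blurs, weights=None):
            """Combine per-position blur values into one value: simple average by default,
            weighted average when weights are supplied."""
            if weights is None:
                return sum(blurs) / len(blurs)
            return sum(b * w for b, w in zip(blurs, weights)) / sum(weights)

        # e.g. blur calculated at positions P-1, P-2 and P-4 (illustrative values in mm):
        print(average_blur([0.21, 0.18, 0.24]))                           # simple average -> 0.21
        print(average_blur([0.21, 0.18, 0.24], weights=[1.0, 2.0, 1.0]))  # weighted average -> 0.2025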
  • Alternatively, one subject may be selected from the plurality of subjects.
  • For example, a subject with a large image blur may be selected from the plurality of subjects.
  • A subject close to the camera 1 may be selected from the plurality of subjects.
  • A subject having a high image height from the optical axis L1 of the interchangeable lens 3 may be selected from the plurality of subjects.
  • image blur correction in the first embodiment includes correction in the Y-axis direction when the camera 1 rotates in the pitch direction and correction in the X-axis direction when the camera 1 rotates in the Yaw direction.
  • the description of the first embodiment described above is representative of the correction in the Y-axis direction when the camera 1 rotates in the pitch direction.
  • A correction similar to that described above is also necessary for the X-axis direction. Since the correction in the Y-axis direction when the camera 1 rotates in the pitch direction and the correction in the X-axis direction when the camera 1 rotates in the Yaw direction are the same except for the direction, the description of the correction in the X-axis direction is omitted.
  • image blur calculated by the translation blur calculation unit 202 is treated as substantially constant even if the position on the image plane 70 (imaging plane of the image sensor 22) is different.
  • the outline of the first embodiment is as follows.
  • the angle blur calculation unit 201 calculates the image blur by determining the position where the image blur is calculated as any position on the image plane 70.
  • the translation blur calculation unit 202 calculates the image blur by determining the position where the image blur is calculated, for example, at the center of the image plane 70.
  • The blur correction optical system target position calculation unit 203 adds the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, attaching a positive or negative sign according to the blur direction along each of the X axis and the Y axis. Then, the amount of image blur at the position on the image plane 70 is calculated based on the added image blur in the X-axis and Y-axis directions.
  • As described above, the shake correction device of the camera 1 includes the shake sensor 39 that detects the shake of the camera 1, the blur correction unit 21a that calculates, based on the output of the shake sensor 39, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, and the CPU 21 that determines a position on the image plane 70.
  • the blur correction unit 21a calculates the image blur ⁇ y2 in the Y-axis direction based on the position determined by the CPU 21 and the shake in the Y-axis direction detected by the shake sensor 39, for example.
  • Thereby, the image blur can be appropriately suppressed even when the focal length f of the interchangeable lens 3 is short (or when the angle of view becomes wide due to the relationship between the size of the image sensor 22 and the focal length f).
  • The blur correction unit 21a calculates a larger blur amount on the image plane 70 as the distance from the X axis, which intersects the Y-axis direction, to the determined position becomes longer. Therefore, image blur can be suppressed appropriately even at a position where the image height is high.
  • The shake correction unit 21a calculates the shake amount based on the output of the shake sensor 39, the distance, and the focal length of the imaging optical system, so image blur can be appropriately suppressed even when the interchangeable lens 3 is replaced with one having a different focal length f.
  • the CPU 21 uses the position of the focus area that is the target of focus adjustment of the imaging optical system on the image plane 70 as the determined position. Therefore, it is possible to appropriately suppress image blurring at a position where there is a high possibility of image blurring.
  • the CPU 21 determines the determined position based on the contrast information of the subject image, so that the image is appropriately displayed at a position where there is a high possibility that the main subject exists. Blur can be suppressed.
  • the CPU 21 determines the determined position based on the luminance value information of the image of the subject 80, so that the main subject is highly likely to exist. Image blur can be suppressed appropriately.
  • the CPU 21 determines the determined position based on the subject recognition information based on the image of the subject 80, so that the main subject is highly likely to exist. , Image blur can be suppressed appropriately.
  • the CPU 21 determines the determined position based on the face recognition information based on the image of the subject 80, so that the main subject is highly likely to exist. , Image blur can be suppressed appropriately.
  • the CPU 21 determines the determined position according to the set imaging scene mode, so that an image is appropriately displayed at a position where there is a high possibility that the main subject exists. Blur can be suppressed.
  • the CPU 21 sets the position designated by the user operation on the image plane 70 as the determined position, so that the image blur is appropriately performed at the position desired by the user. Can be suppressed.
  • The CPU 21 sets, based on, for example, the shooting distance information, the position corresponding to the subject 80 close to the camera 1 as the determined position, so that the image blur can be appropriately suppressed at a position corresponding to the main subject.
  • The blur correction device includes the CPU 21 that detects, based on the output of the shake sensor 39, the amount of motion due to a composition change. When motion occurs after the determined position has been determined, the CPU 21 changes the determined position based on the amount of motion, and the blur correction unit 21a calculates the blur amount based on the changed position. Accordingly, it is possible to appropriately suppress image blur at a position where there is a high possibility that the main subject exists after the composition change.
  • When a plurality of focus areas are selected, the CPU 21 sets, as the determined position, the center of gravity (representative position P) of the absolute values of the distances along the axes (X axis, Y axis) of the image plane 70 based on the positions of the plurality of focus areas, so that image blur can be appropriately suppressed so that the image blur at the positions of the plurality of focus areas becomes approximately the same.
  • Likewise, when a plurality of subjects are recognized, the CPU 21 sets as the determined position the center of gravity (representative position P) of the absolute values of the distances along the axes (X axis, Y axis) of the image plane 70 based on the positions of the plurality of subjects, so that image blur can be appropriately suppressed so that the image blur at the positions of the plurality of subjects becomes approximately the same.
  • Likewise, when a plurality of faces are recognized, the CPU 21 sets as the determined position the center of gravity (representative position P) of the absolute values of the distances along the axes (X axis, Y axis) of the image plane 70 based on the positions of the plurality of faces, so that image blur can be appropriately suppressed so that the image blur at the positions of the plurality of faces becomes approximately the same.
  • Alternatively, the CPU 21 sets the positions of the plurality of focus areas as the determined positions, and the blur correction unit 21a calculates the average value of the plurality of blur amounts calculated based on the plurality of determined positions. Thereby, image blur can be appropriately suppressed so that the image blur at the positions of the plurality of focus areas becomes approximately the same.
  • Likewise, the CPU 21 may set the positions of the plurality of main subjects as the determined positions, with the blur correction unit 21a calculating the average value of the plurality of blur amounts calculated based on the plurality of determined positions. Thereby, image blur can be appropriately suppressed so that the image blur at the positions of the plurality of subjects becomes approximately the same.
  • In the shake correction apparatus of (8) above, when there are a plurality of faces based on the face recognition information, the CPU 21 sets the positions of the plurality of faces as the determined positions, and the blur correction unit 21a calculates the average value of the plurality of blur amounts calculated based on the plurality of determined positions. Thereby, image blur can be appropriately suppressed so that the image blur at the positions of the plurality of faces becomes approximately the same.
  • Modification 1 In the first embodiment, the image blur correction performed by the camera 1 by operating the blur correction drive mechanism 37 of the interchangeable lens 3 has been described as an example. Instead, in the first modification of the first embodiment, the camera 1 operates the blur correction drive mechanism 26 of the camera body 2 to perform image blur correction. Image blur correction according to the first modification of the first embodiment can be performed in the same manner as in the first embodiment, and the same effects as those in the first embodiment can be obtained.
  • Modification 2: In the third method (3) described in the first embodiment, that is, when the image blur at the position of the face (subject 80) is calculated, the CPU 21 may select the fifth method (5) above when, for example, the face appears large on the screen.
  • FIG. 7 is a diagram for explaining Modification 2 of the first embodiment. An example in which one representative position is determined from a plurality of candidates will be described with reference to FIG. 7. In FIG. 7, a large face (subject) appears on the image plane 70. On the image plane 70, the CPU 21 sets, for example, two points as candidates: the detected left edge position Pa and right edge position Pb of the face.
  • the CPU 21 determines the two candidate positions as positions where image blur is calculated on the image plane 70.
  • the angle blur calculation unit 201 calculates image blur at a position Pa and a position Pb on the image plane 70, respectively.
  • the angle blur calculation unit 201 further obtains an average of the plurality of calculated image blurs, and performs image blur correction based on the average value of the image blurs.
  • the average value of image blur is obtained by, for example, a simple average, but may be obtained by a weighted average.
  • Thereby, image blur correction can be performed so that the image blur at both ends of the face becomes approximately the same.
  • Compared with the case where the size of the image blur differs between the left and right sides of the face, the discomfort perceived by the user can be suppressed.
  • the camera 1 may be a single lens reflex type illustrated in FIG. 1 or a mirrorless type without the mirror 24.
  • the camera 1 may be configured as a lens integrated type in which the interchangeable lens 3 and the camera body 2 are integrated.
  • the imaging apparatus is not limited to the camera 1 and may be a lens barrel provided with an imaging sensor, a smartphone provided with an imaging function, or the like.
  • FIG. 8 is a schematic diagram for explaining the angular velocity detection direction by the angular velocity sensor 39a and the image blur on the image plane 70 (the imaging plane of the imaging element 22).
  • In FIG. 8, the point where the image plane 70 and the optical axis L1 of the interchangeable lens 3 intersect is the origin of coordinates, the optical axis L1 of the interchangeable lens 3 is the Z axis, and the image plane 70 is represented as the XY plane.
  • the optical axis L1 intersects the center of the imaging surface.
  • the interchangeable lens 3 and the subject 80 are positioned in the Z axis plus direction with respect to the image plane 70.
  • the angular velocity sensor 39a detects, for example, the rotation angle ⁇ around the axis (small-x axis) parallel to the X axis (Pitch direction).
  • the symbol f in FIGS. 3 and 4 represents the focal length.
  • The image of the subject 80 located at the coordinates (xp, yp) on the image plane 70 before the shake moves in the Y-axis minus direction and the X-axis plus direction after the shake. Accordingly, the coordinates of the image of the subject 80 become (xp + Δx2, yp − Δy2).
  • the mathematical expression representing the image blur ⁇ y2 in the Y-axis direction is the above expression (1), as in the case described in the first embodiment.
  • the image blur ⁇ x2 in the X-axis direction is expressed by the following formula (3).
  • Δx2 = f · xp / [(f² + yp²)^(1/2) · cos(θ + tan⁻¹(yp / f))] − xp   … (3)
  • The rotation angle in the pitch direction (which represents a camera shake angle and is generally about 0.5 degrees) is θ.
  • the symbol f in FIGS. 3 and 4 represents the focal length of the interchangeable lens 3.
  • When the focal length f is sufficiently larger than yp, since the rotation angle θ (camera shake angle) is generally about 0.5 degrees, it can be regarded that Δx2 ≈ 0. In other words, whether the image of the subject 80 on the image plane 70 is at the center of the image plane 70 (the origin in this example) or at a position away from the center, that is, even if the distance from the optical axis L1 differs, the image blur when the rotation angle θ in the pitch direction is detected need only be considered in the Y-axis direction and can be ignored in the X-axis direction.
  • Accordingly, image blur can be suppressed both for the image of the subject 80 positioned at the center of the image plane 70 and for the image of a subject 80 at a position away from the center of the image plane 70.
  • However, when the interchangeable lens 3 is a wide-angle lens and the focal length f cannot be said to be sufficiently larger than yp, Δx2 ≠ 0 according to the above equation (3). Therefore, when the rotation angle θ in the pitch direction is detected, it is necessary not only to calculate the image blur in the Y-axis direction by the above equation (1) but also to calculate the image blur in the X-axis direction by the above equation (3). Otherwise, the image blur in the X-axis direction corresponding to the image blur Δx2 of the above equation (3) remains without being suppressed.
  • The image blur Δx2 increases as the position at which the image blur is calculated moves toward the periphery of the image plane 70, that is, as the image height increases.
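  • Evaluating formula (3) with illustrative numbers shows the same tendency: Δx2 is negligible for a long focal length but clearly non-zero for a wide-angle focal length at the same image height.

        import math

        def dx2(theta, f, xp, yp):
            """Formula (3): X-direction blur of an image point at (xp, yp) for a pitch rotation theta."""
            return f * xp / (math.sqrt(f**2 + yp**2) * math.cos(theta + math.atan(yp / f))) - xp

        theta = math.radians(0.5)
        print(round(dx2(theta, f=300.0, xp=10.0, yp=10.0), 4))   # about 0.003 mm: practically zero
        print(round(dx2(theta, f=20.0,  xp=10.0, yp=10.0), 4))   # about 0.044 mm: no longer negligible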
  • The CPU 21 determines the position at which the image blur is calculated on the image plane 70 in the same manner as in the first embodiment. That is, the CPU 21 selects one of the methods (1) to (4) above and determines the position at which the image blur is calculated on the image plane 70. Then, the angle blur calculation unit 201 calculates the image blur at the position determined by the CPU 21. The blur correction optical system target position calculation unit 203 calculates the image blur amount based on the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202.
  • The image blur correction in the second embodiment includes correction in the Y-axis direction when the camera 1 rotates in the pitch direction and correction in the X-axis direction when the camera 1 rotates in the Yaw direction.
  • As in the first embodiment, the image blur calculated by the translation blur calculation unit 202 is treated as substantially constant even if the position on the image plane 70 is different.
  • the outline of the second embodiment is as follows.
  • the angle blur calculation unit 201 calculates the image blur by determining the position where the image blur is calculated as any position on the image plane 70. At this time, for example, when the rotation angle ⁇ in the pitch direction is detected, not only the image blur in the Y axis direction is calculated by the above equation (1) but also the image blur in the X axis direction is calculated by the above equation (3). .
  • the translation blur calculation unit 202 calculates the image blur by determining the position where the image blur is calculated, for example, at the center of the image plane 70.
  • The blur correction optical system target position calculation unit 203 adds the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, attaching a positive or negative sign according to the blur direction along each of the X axis and the Y axis. Then, the amount of image blur at the position on the image plane 70 is calculated based on the added image blur in the X-axis and Y-axis directions.
  • As described above, the shake correction device of the camera 1 includes the shake sensor 39 that detects a shake of the device in the Y-axis direction, and the blur correction unit 21a that calculates, based on the output of the shake sensor 39, the amount of blurring of the image of the subject 80 formed on the image plane 70 by the imaging optical system.
  • the blur correction unit 21a calculates image blur in the X-axis direction that intersects the Y-axis direction. As a result, it is possible to suppress image blurring in the direction of the X-axis that intersects the Y-axis in which the shake is detected.
  • the blur correction unit 21a calculates the image blur in the Y-axis direction, and thus can suppress the image blur in the Y-axis direction in which the shake is detected.
  • the blur correction apparatus further includes a CPU 21 that determines a position on the image plane 70.
  • The blur correction unit 21a calculates the image blur amounts in the X-axis direction and the Y-axis direction based on the position determined by the CPU 21 and the rotation angle detected by the shake sensor 39. Thereby, even when the position determined by the CPU 21 is a position other than the center of the image plane 70, the image blur can be appropriately suppressed.
  • FIG. 9 is a diagram illustrating an example in which distortion (for example, barrel shape) is generated by the interchangeable lens 3.
  • a large number of solid circles represent images of the subject 80 when it is assumed that the interchangeable lens 3 has no distortion.
  • a large number of hatched circles indicate images of the subject 80 that are distorted by the influence of barrel distortion based on the optical characteristics of the interchangeable lens 3.
  • the distortion aberration of the interchangeable lens 3 varies depending on the design, but is often large in a wide-angle lens having a short focal length.
  • The amount of distortion increases as the distance from the optical axis L1 of the imaging optical system (the distance from the center O of the image plane 70, when the center O of the image plane 70 is aligned with the optical axis L1) increases.
  • the distortion amount appears as a positional deviation between the solid line circle and the hatched circle shown in FIG.
  • the positional deviation between the solid circle and the hatched circle becomes the largest at a position where the distance from the center O of the image plane 70 is long (in other words, the image height is high).
  • The positional deviation is Δx in the X-axis direction and Δy in the Y-axis direction.
  • The schematic diagram illustrated in FIG. 8 is represented as having no distortion due to the imaging optical system, as indicated by the solid circles in FIG. 9. Therefore, for example, when the position at which the image blur is calculated on the image plane 70 is set at a position away from the center O of the image plane 70 and the image blur correction described in the second embodiment is performed as it is, image blur that cannot be corrected occurs if distortion aberration exists.
  • Therefore, image blur correction is performed assuming that there is distortion due to the imaging optical system, as indicated by the hatched circles in FIG. 9.
  • Distortion aberration information indicating in which direction and by what amount each position on the image plane 70 is distorted is known as design information of the interchangeable lens 3. Therefore, the distortion aberration information of the interchangeable lens 3 attached to the camera body 2 is recorded in the memory 28 in advance.
  • When the CPU 21 detects that an interchangeable lens 3 having a large distortion aberration is attached, the CPU 21 reads out the corresponding distortion aberration information from the memory 28 and uses it in the above-described calculation of the image blur.
  • The blur correction optical system target position calculation unit 203 of the blur correction unit 21a adds, for each of the X axis and the Y axis, the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, attaching a positive or negative sign according to the direction indicated by the distortion aberration information read from the memory 28. Then, the amount of image blur at the position on the image plane 70 is calculated based on the added image blur in the X-axis and Y-axis directions.
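  • A hedged sketch of how the stored distortion aberration information might be applied is shown below; the table layout (a dict of per-position offsets) and the numeric values are assumptions for illustration only, not the format actually recorded in the memory 28.

        def corrected_position(position, distortion_table):
            """Shift the blur-calculation position by the distortion offsets (dx, dy) stored for it."""
            dx, dy = distortion_table.get(position, (0.0, 0.0))
            x, y = position
            return (x + dx, y + dy)

        # Illustrative barrel-distortion offsets in mm; the blur is then calculated at the shifted position.
        table = {(12.0, 8.0): (-0.4, -0.3)}
        print(corrected_position((12.0, 8.0), table))   # -> (11.6, 7.7)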
  • the interchangeable lens 3A is attached to the camera body 2A.
  • the interchangeable lens 3A is different from the interchangeable lens 3 in that a shake correction unit 40 is added.
  • a detection signal from the shake sensor 39 is sent to the shake correction unit 40.
  • the camera body 2A is different from the camera body 2 in that a shake sensor (motion detection unit, shake detection unit) 31 is added.
  • a detection signal from the shake sensor 31 is sent to the CPU 21 (blur correction unit 21a).
  • the shake sensor 31 has the same function as the shake sensor 39.
  • In the third embodiment, when the interchangeable lens 3A including the blur correction drive mechanism 37 is attached to the camera body 2A, image blur correction performed by operating the blur correction drive mechanism 37 of the interchangeable lens 3A and image blur correction performed by operating the blur correction drive mechanism 26 of the camera body 2A are used together.
  • When an interchangeable lens that does not include the shake correction drive mechanism 37 is attached to the camera body 2A, the shake correction drive mechanism 26 of the camera body 2A is operated to perform image blur correction similar to that of Modification 1 of the first embodiment.
  • FIG. 10 is a diagram showing a main configuration of a camera 1A according to the third embodiment.
  • the camera 1A includes a camera body 2A and an interchangeable lens 3A.
  • the interchangeable lens 3A is attached to the camera body 2A via a mount portion (not shown).
  • the camera body 2A and the interchangeable lens 3A are electrically connected, and communication is possible between the camera body 2A and the interchangeable lens 3A. Communication between the camera body 2A and the interchangeable lens 3A may be performed by wireless communication.
  • In FIG. 10, the same components as those in FIG. 1 are denoted by the same reference numerals as in FIG. 1.
  • FIG. 11 is a diagram illustrating the blur correction unit 40 of the interchangeable lens 3A.
  • the shake correction unit 40 includes an angle shake calculation unit 401, a translational shake calculation unit 402, and a shake correction optical system target position calculation unit 403.
  • The angle blur calculation unit 401 uses the detection signal about the axis parallel to the X axis (Pitch direction) detected by the angular velocity sensor 39a to calculate the image blur in the Y-axis direction due to the rotational motion and, if necessary, the image blur in the X-axis direction.
  • The angle blur calculation unit 401 likewise uses the detection signal about the axis parallel to the Y axis (Yaw direction) detected by the angular velocity sensor 39a to calculate the image blur in the X-axis direction due to the rotational motion and, if necessary, the image blur in the Y-axis direction.
  • the translation blur calculation unit 402 calculates the image blur in the X-axis direction due to the translational motion using the detection signal in the X-axis direction by the acceleration sensor 39b.
  • the translation blur calculation unit 402 calculates the image blur in the Y-axis direction due to the translational motion using the detection signal in the Y-axis direction from the acceleration sensor 39b.
  • The shake correction optical system target position calculation unit 403 calculates the image blur in the X-axis and Y-axis directions by adding the image blur in the X-axis and Y-axis directions calculated by the angle blur calculation unit 401 and the image blur in the X-axis and Y-axis directions calculated by the translation blur calculation unit 402.
  • The shake correction optical system target position calculation unit 403 then calculates the image blur amount at a position (described later) on the image plane 70 based on the summed image blur in the X-axis and Y-axis directions, the photographing magnification (calculated from the position of the zoom optical system 31), and the distance from the camera 1A to the subject 80 (calculated from the position of the focus optical system 32).
  • the blur correction optical system target position calculation unit 403 calculates the target position of the blur correction optical system 33 based on the calculated image blur amount in order to operate the blur correction drive mechanism 37 of the interchangeable lens 3A. Then, the shake correction optical system target position calculation unit 403 sends a signal indicating the target position to the shake correction drive mechanism 37 of the interchangeable lens 3A.
  • The camera 1A may be a single-lens reflex type as illustrated in FIG. 10 or a mirrorless type without the mirror 24.
  • The interchangeable lens 3A and the camera body 2A may also be integrated and configured as a lens-integrated camera.
  • the image blur calculation by the angle blur calculation unit 201 and the image blur calculation by the translation blur calculation unit 202 are the same as those in the first embodiment and the second embodiment. However, it differs from the first embodiment and the second embodiment in the following points.
  • One of the differences is that the center of the image plane 70 is selected as the position for calculating the image blur in the image blur correction by the interchangeable lens 3A, whereas an arbitrary position on the image plane 70 is selected as the position for calculating the image blur in the image blur correction by the camera body 2A.
  • Another difference is that the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A are performed based on a sharing ratio determined by the CPU 21 of the camera body 2A. The sharing ratio will be described later.
  • The CPU 21 sets the position at which the blur correction unit 40 of the interchangeable lens 3A calculates image blur to, for example, the center of the image plane 70, and sets the position at which the blur correction unit 21a of the camera body 2A calculates image blur to an arbitrary position on the image plane 70. Accordingly, the angle blur calculation unit 401 of the interchangeable lens 3A calculates the image blur V(L) to be shared by the interchangeable lens 3A based on the image blur at the center position of the image plane 70 and the sharing ratio of the interchangeable lens 3A determined by the CPU 21.
  • The angle blur calculation unit 201 of the camera body 2A calculates the image blur V(B) to be shared by the camera body 2A based on the image blur at the position different from the center of the image plane 70 determined by the CPU 21 and the sharing ratio of the camera body 2A determined by the CPU 21.
  • the CPU 21 determines the position by any one of methods (1) to (4) in the first embodiment.
  • The mathematical expression representing the image blur Δy1 in the Y-axis direction is the above formula (2), as described in the first embodiment.
  • The mathematical expression representing the image blur Δy2 in the Y-axis direction is the above formula (1), as described in the first embodiment.
  • The CPU 21 determines the sharing ratio between the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A. In this example, the CPU 21 sets the sharing ratio to 50:50, but the ratio may be, for example, 70:30 or 40:60.
  • The angle blur calculation unit 401 of the interchangeable lens 3A obtains the image blur V(L) to be shared by the interchangeable lens 3A as shown in the following equation (4).
    V(L) = Δy1 / 2 … (4)
  • The reason for halving the right side is that the sharing ratio is set to 50%.
  • Δy1 is the image blur in the Y-axis direction at the center of the image plane 70.
  • The rotation angle in the pitch direction (which represents a camera shake angle and is generally about 0.5 degrees) is defined as θ.
  • the symbol f represents the focal length of the interchangeable lens 3A.
  • The angle blur calculation unit 201 of the camera body 2A obtains the image blur V(B) to be shared by the camera body 2A as shown in the following equation (5).
    V(B) = Δy1 / 2 + d … (5)
  • Here, d = Δy2 − Δy1.
  • Δy2 is the image blur in the Y-axis direction at the position different from the center of the image plane 70.
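  • As a rough numerical illustration of equations (4) and (5), the sketch below assumes the 50:50 sharing ratio of this example and generalizes it to an arbitrary lens share; the generalization and all names are assumptions for illustration only, not part of the embodiment.

    # Illustrative sketch of the sharing between lens and body (equations (4) and (5)).
    # delta_y1: Y-axis image blur at the center of the image plane 70.
    # delta_y2: Y-axis image blur at the position determined by the CPU 21.
    def share_blur(delta_y1, delta_y2, lens_ratio=0.5):
        v_lens = lens_ratio * delta_y1                 # equation (4): V(L) = Δy1 / 2 for a 50% share
        d = delta_y2 - delta_y1                        # off-center difference taken up by the body
        v_body = (1.0 - lens_ratio) * delta_y1 + d     # equation (5): V(B) = Δy1 / 2 + d for a 50% share
        return v_lens, v_body

    # Example: with Δy1 = 10 µm and Δy2 = 13 µm, a 50% lens share gives
    # V(L) = 5 µm for the lens and V(B) = 5 µm + 3 µm = 8 µm for the body,
    # so together the two corrections cancel the 13 µm blur at the determined position.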
  • The blur correction optical system target position calculation unit 403 of the interchangeable lens 3A calculates the target position of the blur correction optical system 33 for the image blur correction performed by operating the blur correction drive mechanism 37, based on the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402.
  • The blur correction optical system target position calculation unit 203 of the camera body 2A calculates the target position of the image sensor 22 for the image blur correction performed by operating the blur correction drive mechanism 26 of the camera body 2A, based on the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202.
  • the shake correction optical system target position calculation unit 403 of the interchangeable lens 3A further sends a signal indicating the target position to the shake correction drive mechanism 37 of the interchangeable lens 3A.
  • the shake correction optical system target position calculation unit 203 of the camera body 2A further sends a signal indicating the target position to the shake correction drive mechanism 26 of the camera body 2A.
  • In this way, the image blur correction by the interchangeable lens 3A is based on the image blur calculated by the angle blur calculation unit 401 at the center position of the image plane 70, and the image blur correction by the camera body 2A is based on the image blur calculated by the angle blur calculation unit 201 at a position different from the center of the image plane 70.
  • the image blur correction in the third embodiment includes correction in the Y-axis direction when the camera 1A rotates in the pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction.
  • The above description of the third embodiment is representative of the correction in the Y-axis direction when the camera 1A rotates in the pitch direction. When the camera 1A rotates in the Yaw direction, a similar correction is necessary in the X-axis direction. Since the correction in the Y-axis direction for rotation in the pitch direction and the correction in the X-axis direction for rotation in the Yaw direction are the same except for the direction, the description of the X-axis direction is omitted.
  • As in the first and second embodiments, the image blur calculated by the translation blur calculation unit 202 and the translation blur calculation unit 402 is treated as substantially constant even if the position on the image plane 70 differs.
  • the outline of the third embodiment is as follows.
  • The angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates the image blur at the center position of the image plane 70.
  • The angle blur calculation unit 201 of the blur correction unit 21a of the camera body 2A calculates the image blur at a position different from the center of the image plane 70.
  • The angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) to be shared by the interchangeable lens 3A (for example, a sharing ratio of 50%) to 1/2 of the image blur Δy1 at the center of the image plane 70, and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) to be shared by the camera body 2A to V(L) + d, where d is the difference between the image blur Δy2 at a position different from the center of the image plane 70 and Δy1.
  • The translation blur calculation unit 402 of the interchangeable lens 3A sets the image blur to be shared by the interchangeable lens 3A (for example, a sharing ratio of 50%) to, for example, 1/2 of the image blur at the center of the image plane 70.
  • The translation blur calculation unit 202 of the camera body 2A sets the image blur to be shared by the camera body 2A to, for example, 1/2 of the image blur at the center of the image plane 70.
  • The blur correction optical system target position calculation unit 403 of the interchangeable lens 3A adds, for each of the X axis and the Y axis, the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402, applying a positive or negative sign according to the blur direction along each axis. The image blur amount at the center position of the image plane 70 is then calculated from the summed image blur in the X-axis and Y-axis directions.
  • The blur correction optical system target position calculation unit 203 of the camera body 2A adds, for each of the X axis and the Y axis, the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, applying a positive or negative sign according to the blur direction along each axis. The image blur amount at the position different from the center of the image plane 70 is then calculated from the summed image blur in the X-axis and Y-axis directions.
  • As described above, the blur correction apparatus of the camera 1A includes the interchangeable lens 3A having the shake sensor 39 that detects the shake of the apparatus, the blur correction unit 40 that calculates, based on the output of the shake sensor 39, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, and the blur correction drive mechanism 37 that moves the blur correction optical system 33 in a direction that suppresses the blur amount based on the output of the blur correction unit 40.
  • The blur correction apparatus further includes, in the camera body 2A, the shake sensor 31 that detects the shake of the apparatus, the blur correction unit 21a that calculates, based on the output of the shake sensor 31, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, the blur correction drive mechanism 26 that moves the image sensor 22, which captures the image of the subject 80 on the image plane 70, in a direction that suppresses the blur amount based on the output of the blur correction unit 21a, and the CPU 21 that determines a position on the image plane 70.
  • The blur correction unit 40 of the interchangeable lens 3A calculates the image blur Δy1 based on the first position (the center of the image plane 70) predetermined on the image plane 70 and the shake detected by the shake sensor 39.
  • The blur correction unit 40 sets the image blur V(L) to be shared by the interchangeable lens 3A (for example, a sharing ratio of 50%) to 1/2 of the image blur Δy1.
  • The blur correction unit 21a of the camera body 2A calculates the image blur Δy2 based on the second position (a position different from the center) determined by the CPU 21 and the shake detected by the shake sensor 31, and the image blur Δy1 based on the first position (the center of the image plane 70) predetermined on the image plane 70 and the shake detected by the shake sensor 31.
  • The blur correction unit 21a further calculates the difference d between the image blur Δy2 and the image blur Δy1.
  • The angle blur calculation unit 201 sets the image blur V(B) to be shared by the camera body 2A to V(L) + d. Thereby, even when the position determined by the CPU 21 is other than the center of the image plane 70, the image blur can be appropriately suppressed.
  • The blur correction unit 40 of the interchangeable lens 3A outputs 50% of the image blur Δy1 to the blur correction drive mechanism 37, and the blur correction unit 21a of the camera body 2A outputs the remaining 50% of the image blur Δy1 and the difference d to the blur correction drive mechanism 26.
  • As a result, the movement distances of the blur correction drive mechanism 26 and the blur correction drive mechanism 37 can be kept small.
  • the sharing ratio determined by the CPU 21 may be set to 100% for image blur correction by the interchangeable lens 3A and 0% for image blur correction by the camera body 2A.
  • In this case, the angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) shared by the interchangeable lens 3A to 100% of the image blur Δy1, and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) shared by the camera body 2A to d, where d is the difference between the image blur Δy2 at the position different from the center of the image plane 70 and the image blur Δy1 at the center of the image plane 70.
  • the CPU 21 determines, for example, two positions (referred to as a first position and a second position) on the image plane 70 as positions for calculating the image blur.
  • the angle blur calculation unit 401 of the interchangeable lens 3A calculates image blur for the first position determined by the CPU 21.
  • the angle blur calculation unit 201 of the camera body 2A calculates image blur for the first position and the second position determined by the CPU 21.
  • the CPU 21 determines the first position and the second position for calculating the image blur by one of the methods (1) to (4) in the first embodiment.
  • Modification 4 of the third embodiment differs from the third embodiment in that both the first position and the second position may be different from the center of the image plane 70.
  • The CPU 21 determines the sharing ratio between the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A, as in the third embodiment.
  • The mathematical expression representing the image blur Δy1 in the Y-axis direction is the above formula (2), as described in the first embodiment. Further, when the first position and the second position for calculating the image blur are positions different from the center of the image plane 70, the mathematical expression representing the image blur Δy2 in the Y-axis direction is the above formula (1), as described in the first embodiment.
  • The angle blur calculation unit 401 of the interchangeable lens 3A obtains the image blur V(L) to be shared by the interchangeable lens 3A as shown in the following equation (6).
    V(L) = Δy2a / 2 … (6)
  • Δy2a is the image blur in the Y-axis direction at the first position, which is different from the center of the image plane 70.
  • The angle blur calculation unit 201 of the camera body 2A obtains the image blur V(B) to be shared by the camera body 2A as shown in the following equation (7).
    V(B) = Δy2a / 2 + d2 … (7)
  • Here, d2 = Δy2b − Δy2a.
  • Δy2b is the image blur in the Y-axis direction at the second position, which is different from the center of the image plane 70.
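  • The sketch below restates equations (6) and (7) in the same illustrative style as before, for the case in which both positions are off-center; the 50% lens share and the names are assumptions for illustration only.

    # Illustrative sketch of equations (6) and (7) in Modification 4, assuming a 50% lens share.
    # delta_y2a: Y-axis image blur at the first position (handled on the lens side).
    # delta_y2b: Y-axis image blur at the second position (handled on the body side).
    def share_blur_mod4(delta_y2a, delta_y2b):
        v_lens = delta_y2a / 2.0            # equation (6): V(L) = Δy2a / 2
        d2 = delta_y2b - delta_y2a          # difference between the two off-center positions
        v_body = delta_y2a / 2.0 + d2       # equation (7): V(B) = Δy2a / 2 + d2
        return v_lens, v_body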
  • The blur correction optical system target position calculation unit 403 of the interchangeable lens 3A calculates the target position of the blur correction optical system 33 for the image blur correction performed by operating the blur correction drive mechanism 37, based on the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402.
  • The blur correction optical system target position calculation unit 203 of the camera body 2A calculates the target position of the image sensor 22 for the image blur correction performed by operating the blur correction drive mechanism 26 of the camera body 2A, based on the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202.
  • the shake correction optical system target position calculation unit 403 of the interchangeable lens 3A further sends a signal indicating the target position to the shake correction drive mechanism 37 of the interchangeable lens 3A.
  • the shake correction optical system target position calculation unit 203 of the camera body 2A further sends a signal indicating the target position to the shake correction drive mechanism 26 of the camera body 2A.
  • In this way, the image blur correction by the interchangeable lens 3A is based on the image blur calculated by the angle blur calculation unit 401 at the first position of the image plane 70, and the image blur correction by the camera body 2A is based on the image blur calculated by the angle blur calculation unit 201 at the second position of the image plane 70.
  • the image blur correction in the fourth modification of the third embodiment includes correction in the Y-axis direction when the camera 1A rotates in the pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction. Including.
  • the description of Modification 4 of the above-described third embodiment is representative of the correction in the Y-axis direction when the camera 1A rotates in the pitch direction. For this reason, when the camera 1A rotates in the Yaw direction, correction similar to the correction described above is necessary for the X-axis direction.
  • As in the first to third embodiments, the image blur calculated by the translation blur calculation unit 202 and the translation blur calculation unit 402 is treated as substantially constant even if the position on the image plane 70 (the imaging plane of the image sensor 22) differs.
  • the outline of Modification 4 of the third embodiment is as follows.
  • The angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates the image blur at the first position on the image plane 70.
  • The angle blur calculation unit 201 of the blur correction unit 21a of the camera body 2A calculates the image blur at the second position on the image plane 70.
  • The angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) to be shared by the interchangeable lens 3A (for example, a sharing ratio of 50%) to 1/2 of the image blur Δy2a at the first position of the image plane 70, and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) to be shared by the camera body 2A to V(L) + d2, where d2 is the difference between the image blur Δy2b at the second position of the image plane 70 and Δy2a.
  • The translation blur calculation unit 402 of the interchangeable lens 3A sets the image blur to be shared by the interchangeable lens 3A (for example, a sharing ratio of 50%) to, for example, 1/2 of the image blur at the center of the image plane 70.
  • The translation blur calculation unit 202 of the camera body 2A sets the image blur to be shared by the camera body 2A to, for example, 1/2 of the image blur at the center of the image plane 70.
  • The blur correction optical system target position calculation unit 403 of the interchangeable lens 3A adds, for each of the X axis and the Y axis, the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402, applying a positive or negative sign according to the blur direction along each axis. The image blur amount at the first position of the image plane 70 is then calculated from the summed image blur in the X-axis and Y-axis directions.
  • The blur correction optical system target position calculation unit 203 of the camera body 2A adds, for each of the X axis and the Y axis, the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, applying a positive or negative sign according to the blur direction along each axis. The image blur amount at the second position of the image plane 70 is then calculated from the summed image blur in the X-axis and Y-axis directions.
  • As described above, the blur correction apparatus of the camera 1A includes the interchangeable lens 3A having the shake sensor 39 that detects the shake of the apparatus, the blur correction unit 40 that calculates, based on the output of the shake sensor 39, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, and the blur correction drive mechanism 37 that moves the blur correction optical system 33 in a direction that suppresses the blur amount based on the output of the blur correction unit 40.
  • The blur correction apparatus further includes, in the camera body 2A, the shake sensor 31 that detects the shake of the apparatus, the blur correction unit 21a that calculates, based on the output of the shake sensor 31, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, the blur correction drive mechanism 26 that moves the image sensor 22, which captures the image of the subject 80 on the image plane 70, in a direction that suppresses the blur amount based on the output of the blur correction unit 21a, and the CPU 21 that determines the first position and the second position on the image plane 70.
  • The blur correction unit 40 of the interchangeable lens 3A calculates the image blur Δy2a based on the first position and the shake detected by the shake sensor 39.
  • The blur correction unit 40 sets the image blur V(L) to be shared by the interchangeable lens 3A (for example, a sharing ratio of 50%) to 1/2 of the image blur Δy2a.
  • The blur correction unit 21a of the camera body 2A calculates the image blur Δy2a based on the first position and the shake detected by the shake sensor 31, and the image blur Δy2b based on the second position and the shake detected by the shake sensor 31.
  • The blur correction unit 21a further calculates the difference d2 between the image blur Δy2b and the image blur Δy2a.
  • The angle blur calculation unit 201 sets the image blur V(B) to be shared by the camera body 2A to V(L) + d2.
  • Thereby, image blur can be appropriately suppressed even at the second position, other than the center of the image plane 70, determined by the CPU 21.
  • The blur correction unit 40 of the interchangeable lens 3A outputs 50% of the image blur Δy2a to the blur correction drive mechanism 37, and the blur correction unit 21a of the camera body 2A outputs the remaining 50% of the image blur Δy2a and the difference d2 to the blur correction drive mechanism 26.
  • As a result, the movement distances of the blur correction drive mechanism 26 and the blur correction drive mechanism 37 can be kept small.
  • the sharing ratio determined by the CPU 21 may be set to 100% for image blur correction by the interchangeable lens 3A and 0% for image blur correction by the camera body 2A.
  • In this case, the angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) shared by the interchangeable lens 3A to 100% of the image blur Δy2a, and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) shared by the camera body 2A to d2.
  • Here, d2 is the difference between the image blur Δy2b at the second position and the image blur Δy2a at the first position, both positions being different from the center of the image plane 70.
  • The image blur correction calculation based on the image blur V(B) of the above equation (5) and the above equation (7) may instead be performed by the blur correction unit 40 of the interchangeable lens 3A, and the image blur correction calculation based on the image blur V(L) of the above equation (4) and the above equation (6) may be performed by the blur correction unit 21a of the camera body 2A.
  • In other words, the position on the image plane 70 at which the image blur for the image blur correction by the interchangeable lens 3A is calculated and the position on the image plane 70 at which the image blur for the image blur correction by the camera body 2A is calculated may be exchanged with respect to the third embodiment or Modification 4 of the third embodiment.
  • the angle blur calculation unit 201 and the angle blur calculation unit 401 perform addition calculation by adding a positive / negative sign to the X axis and the Y axis depending on the blur direction.
  • In the fourth embodiment, image blur correction is performed exclusively by the interchangeable lens 3A using the camera 1A of FIG. 10.
  • The camera 1A may be a single-lens reflex type as illustrated in FIG. 10 or a mirrorless type without the mirror 24.
  • the interchangeable lens 3A and the camera body 2A may be configured as a lens-integrated camera.
  • The CPU 21 of the camera body 2A in the fourth embodiment determines a position on the image plane 70 where the image of the main subject is highly likely to exist, for example by any one of the methods (1) to (4) in the first embodiment. The CPU 21 then transmits information indicating the determined position on the image plane 70 to the blur correction unit 40 of the interchangeable lens 3A.
  • The timing at which the CPU 21 of the camera body 2A transmits, to the blur correction unit 40, the information on the position at which image blur is calculated on the image plane 70 is, for example, when the CPU 21 determines (newly determines or updates) the position at which image blur is calculated on the image plane 70.
  • The CPU 21 includes the position information in the steady communication between the camera body 2A and the interchangeable lens 3A, or in the communication instructing the interchangeable lens 3A to start image blur correction, so that the position information is promptly notified to the blur correction unit 40.
  • the angle blur calculation unit 401 of the blur correction unit 40 calculates the image blur at the position indicated by the information received from the CPU 21, and performs the image blur correction based on the image blur.
  • The mathematical expression representing the image blur Δy1 in the Y-axis direction is the above formula (2), as described in the first embodiment.
  • The mathematical expression representing the image blur Δy2 in the Y-axis direction is the above formula (1), as described in the first embodiment.
  • the image blur correction in the fourth embodiment includes correction in the Y-axis direction when the camera 1A rotates in the pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction.
  • the above formula (1) and the above formula (2) represent correction in the Y-axis direction when the camera 1A rotates in the pitch direction.
  • When the camera 1A rotates in the Yaw direction, the same correction as described above is necessary for the X-axis direction. Since the correction in the Y-axis direction for rotation in the pitch direction and the correction in the X-axis direction for rotation in the Yaw direction are the same except for the direction, the description of the correction in the X-axis direction is omitted.
  • The image blur calculated by the translation blur calculation unit 402 is treated as substantially constant even if the position on the image plane 70 (the imaging plane of the image sensor 22) differs.
  • the outline of the fourth embodiment is as follows.
  • the angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates the image blur by setting the position where the image blur is calculated on the image plane 70 to the position notified from the CPU 21 of the camera body 2A.
  • the translation blur calculation unit 402 calculates an image blur at the center of the image plane 70, for example.
  • The blur correction optical system target position calculation unit 403 adds, for each of the X axis and the Y axis, the image blur calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402, applying a positive or negative sign according to the blur direction along each axis. The image blur amount at the position on the image plane 70 notified from the CPU 21 of the camera body 2A is then calculated from the summed image blur in the X-axis and Y-axis directions.
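  • To make the division of roles in the fourth embodiment easier to follow, here is a minimal sketch of the exchange between the camera body and the interchangeable lens. The function blur_at_position() merely stands in for formula (1) of the first embodiment (not reproduced here), and all names are hypothetical.

    # Illustrative sketch of the fourth embodiment: the camera body determines the position on the
    # image plane and notifies the lens; the lens calculates the image blur at that position and
    # moves the blur correction optical system 33 so as to cancel it.
    def body_notify_position(determined_position, send_to_lens):
        # The CPU 21 would pick the position by one of methods (1)-(4); here it is simply forwarded.
        send_to_lens(determined_position)

    def lens_correction_target(position, pitch_angle_rad, focal_length, blur_at_position):
        # blur_at_position() is a stand-in for the off-center blur formula of the first embodiment.
        delta_y2 = blur_at_position(position, pitch_angle_rad, focal_length)
        # The target displacement of the blur correction optical system 33 cancels the calculated blur.
        return -delta_y2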
  • As described above, the blur correction apparatus includes the camera body 2A having the imaging element 22 that captures the subject image formed on the image plane 70 by the interchangeable lens 3A and the CPU 21 that determines a position on the image plane 70 and transmits information on the determined position to the interchangeable lens 3A.
  • The blur correction apparatus further includes the interchangeable lens 3A having the blur correction optical system 33 for blur correction, the blur correction unit 40 that receives the position information from the camera body 2A and calculates the image blur Δy2 based on the received position and the shake detected by the shake sensor 39, and the blur correction drive mechanism 37 that moves the blur correction optical system 33 in a direction that suppresses the image blur Δy2.
  • image blur can be appropriately suppressed at a position other than the center of the image plane 70 determined by the CPU 21 of the camera body 2A.
  • The blur correction unit 40 of the interchangeable lens 3A calculates the image blur Δy2 based on the output of the shake sensor 39 and the focal length f of the interchangeable lens 3A.
  • Thereby, the image blur Δy2 can be appropriately calculated at a position other than the center of the image plane 70, and the image blur can be appropriately suppressed based on the image blur Δy2.
  • In the fifth embodiment, the camera 1A shown in FIG. 10 is used, as in the fourth embodiment.
  • the image blur correction according to the fifth embodiment is performed exclusively by operating the blur correction drive mechanism 37 of the interchangeable lens 3A.
  • It differs from the fourth embodiment in that both the blur correction unit 21a of the CPU 21 of the camera body 2A and the blur correction unit 40 of the interchangeable lens 3A perform calculations.
  • The camera 1A may be a single-lens reflex type as illustrated in FIG. 10 or a mirrorless type without the mirror 24.
  • the interchangeable lens 3A and the camera body 2A may be configured as a lens-integrated camera.
  • the CPU 21 of the camera body 2A determines a position where the image of the main subject is highly likely to exist on the image plane 70 by, for example, any one of methods (1) to (4) in the first embodiment. Then, the CPU 21 sets the center of the image plane 70 as the first position, and sets the position determined as described above as the second position.
  • The blur correction unit 21a of the CPU 21 calculates the image blur at the first position and the second position of the image plane 70.
  • The angle blur calculation unit 201 uses the detection signal about the axis parallel to the X axis (Pitch direction) from the angular velocity sensor of the shake sensor 31 to calculate the image blur in the Y-axis direction due to the rotational motion and, if necessary, the image blur in the X-axis direction.
  • The angle blur calculation unit 201 likewise uses the detection signal about the axis parallel to the Y axis (Yaw direction) from the angular velocity sensor of the shake sensor 31 to calculate the image blur in the X-axis direction due to the rotational motion and, if necessary, the image blur in the Y-axis direction.
  • The mathematical expression representing the image blur Δy1 in the Y-axis direction is the above formula (2), as described in the first embodiment.
  • The mathematical expression representing the image blur Δy2 in the Y-axis direction is the above formula (1), as described in the first embodiment.
  • The blur correction unit 21a of the CPU 21 further calculates the ratio g of the image blur Δy2 at the second position to the image blur Δy1 at the first position by the following equation (8).
    g = Δy2 / Δy1 … (8)
  • This g is referred to as the correction coefficient g.
  • the CPU 21 transmits information indicating the correction coefficient g to the blur correction unit 40 of the interchangeable lens 3A.
  • The CPU 21 may transmit information indicating the difference between Δy2 and Δy1 to the blur correction unit 40 of the interchangeable lens 3A instead of the information indicating the ratio between Δy2 and Δy1.
  • The timing at which the CPU 21 of the camera body 2A transmits the information indicating the correction coefficient g to the blur correction unit 40 is, for example, when the CPU 21 determines (newly determines or updates) the first position and the second position at which image blur is calculated on the image plane 70 and calculates the correction coefficient g.
  • The CPU 21 includes the information on the correction coefficient g in the steady communication between the camera body 2A and the interchangeable lens 3A, or in the communication instructing the interchangeable lens 3A to start image blur correction, so that the information on the correction coefficient g is promptly notified to the blur correction unit 40.
  • The angle blur calculation unit 401 of the blur correction unit 40 uses the detection signal about the axis parallel to the X axis (Pitch direction) from the angular velocity sensor 39a to calculate the image blur in the Y-axis direction due to the rotational motion and, if necessary, the image blur in the X-axis direction.
  • The angle blur calculation unit 401 likewise uses the detection signal about the axis parallel to the Y axis (Yaw direction) from the angular velocity sensor 39a to calculate the image blur in the X-axis direction due to the rotational motion and, if necessary, the image blur in the Y-axis direction.
  • The blur correction unit 40 in the fifth embodiment calculates the image blur at the same position as the first position determined by the CPU 21 of the camera body 2A, in this example the center of the image plane 70. Since the position at which the image blur is calculated is the center of the image plane 70, the mathematical expression representing the image blur Δy1 in the Y-axis direction is the above formula (2), as described in the first embodiment.
  • The angle blur calculation unit 401 multiplies the image blur Δy1 in the Y-axis direction by the correction coefficient g, based on the information received from the camera body 2A by the receiving unit, thereby calculating the image blur Δy2 in the Y-axis direction at the second position of the image plane 70.
  • When the difference between Δy2 and Δy1 is transmitted instead of the ratio, the angle blur calculation unit 401 calculates the image blur Δy2 by adding the received difference to the image blur Δy1.
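  • The exchange in the fifth embodiment can be summarized by the minimal sketch below; the blur formulas (1) and (2) of the first embodiment are not reproduced, the inputs are assumed to be already-computed blur values, and all names are illustrative only.

    # Illustrative sketch of the fifth embodiment: the camera body sends the correction coefficient g
    # (or the difference), and the lens converts its own center blur Δy1 into the off-center blur Δy2.
    def body_correction_coefficient(delta_y1, delta_y2):
        return delta_y2 / delta_y1                  # equation (8): g = Δy2 / Δy1

    def lens_offcenter_blur(delta_y1, g=None, difference=None):
        if g is not None:
            return delta_y1 * g                     # ratio transmitted: Δy2 = g * Δy1
        return delta_y1 + difference                # difference transmitted: Δy2 = Δy1 + (Δy2 - Δy1)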
  • the translation blur calculation unit 402 calculates the image blur in the X-axis direction due to the translational motion using the detection signal in the X-axis direction by the acceleration sensor 39b.
  • the translation blur calculation unit 402 calculates the image blur in the Y-axis direction due to the translational motion using the detection signal in the Y-axis direction from the acceleration sensor 39b.
  • The shake correction optical system target position calculation unit 403 calculates the image blur in the X-axis and Y-axis directions by adding the image blur in the X-axis and Y-axis directions calculated by the angle blur calculation unit 401 and the image blur in the X-axis and Y-axis directions calculated by the translation blur calculation unit 402.
  • The shake correction optical system target position calculation unit 403 then calculates the image blur amount at the second position of the image plane 70 based on the summed image blur in the X-axis and Y-axis directions, the photographing magnification (calculated from the position of the zoom optical system 31), and the distance from the camera 1A to the subject 80 (calculated from the position of the focus optical system 32).
  • To perform image blur correction by operating the blur correction drive mechanism 37 of the interchangeable lens 3A, the blur correction optical system target position calculation unit 403 calculates the target position of the blur correction optical system 33 such that the blur correction optical system 33 is moved in a direction that cancels the calculated image blur amount.
  • the shake correction optical system target position calculation unit 403 sends a signal indicating the target position to the shake correction drive mechanism 37 of the interchangeable lens 3A.
  • the image blur correction in the fifth embodiment includes correction in the Y-axis direction when the camera 1A rotates in the pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction.
  • the above formula (1) and the above formula (2) represent correction in the Y-axis direction when the camera 1A rotates in the pitch direction.
  • When the camera 1A rotates in the Yaw direction, the same correction as described above is necessary for the X-axis direction. Since the correction in the Y-axis direction for rotation in the pitch direction and the correction in the X-axis direction for rotation in the Yaw direction are the same except for the direction, the description of the correction in the X-axis direction is omitted.
  • The image blur calculated by the translation blur calculation unit 402 is treated as substantially constant even if the position on the image plane 70 (the imaging plane of the image sensor 22) differs.
  • the outline of the fifth embodiment is as follows.
  • The angle blur calculation unit 201 of the blur correction unit 21a of the camera body 2A calculates the image blurs Δy1 and Δy2 at the first position (the center of the image plane 70) and the second position of the image plane 70.
  • The blur correction unit 21a calculates the correction coefficient g, which is the ratio of the image blur Δy2 at the second position to the image blur Δy1 at the first position, and transmits information indicating the correction coefficient g to the blur correction unit 40 of the interchangeable lens 3A.
  • the angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates an image blur at the first position of the image plane 70 (the center of the image plane 70).
  • the angle blur calculation unit 401 further calculates the image blur at the second position of the image plane 70 by multiplying the image blur at the first position by the correction coefficient g based on the information received from the camera body 2A by the receiving unit.
  • the translation blur calculation unit 402 of the blur correction unit 40 calculates image blur at the first position, for example.
  • The shake correction optical system target position calculation unit 403 of the blur correction unit 40 adds, for each of the X axis and the Y axis, the image blur at the second position and the image blur calculated by the translation blur calculation unit 402, applying a positive or negative sign according to the blur direction along each axis. The image blur amount at the second position of the image plane 70 is then calculated from the summed image blur in the X-axis and Y-axis directions.
  • As described above, the blur correction apparatus includes the camera body 2A having the imaging element 22 that captures the subject image formed on the image plane 70 by the interchangeable lens 3A, the CPU 21 that determines a position on the image plane 70, the blur correction unit 21a that calculates the image blur Δy1 at the first position (the center of the image plane 70) predetermined on the image plane 70 and the image blur Δy2 at the second position determined by the CPU 21 based on the shake detected by the shake sensor 31, and the CPU 21 that transmits information on the correction coefficient g, or on the difference between the image blur Δy1 and the image blur Δy2, to the interchangeable lens 3A.
  • The blur correction apparatus further includes the interchangeable lens 3A having the blur correction optical system 33 for blur correction, the blur correction unit 40 that calculates the image blur Δy1 at the first position (the center of the image plane 70) based on the first position and the shake detected by the shake sensor 39, receives the information from the camera body 2A, and corrects the calculated image blur Δy1 using the received information, and the blur correction drive mechanism 37 that moves the blur correction optical system 33 in a direction that suppresses the corrected image blur.
  • the blur correction unit 40 of the interchangeable lens 3A can appropriately suppress the image blur at the second position determined by the CPU 21 of the camera body 2A, for example.
  • The blur correction unit 21a of the camera body 2A calculates the image blur Δy1 and the image blur Δy2 based on the output of the shake sensor 31 and the focal length f of the interchangeable lens 3A, and the blur correction unit 40 of the interchangeable lens 3A calculates the image blur Δy1 based on the output of the shake sensor 39 and the focal length f. Accordingly, the blur correction unit 40 of the interchangeable lens 3A can appropriately calculate the image blur Δy2 at the second position other than the center of the image plane 70, and can appropriately suppress the image blur based on the image blur Δy2.
  • Image blur correction is performed by operating both the camera shake correction drive mechanism 26 of the camera body 2A and the camera shake correction drive mechanism 37 of the interchangeable lens 3A, which is the same as in the fourth modification of the third embodiment. Also, the point that both the blur correction unit 21a of the CPU 21 of the camera body 2A and the blur correction unit 40 of the interchangeable lens 3A perform the calculation is common to the fifth embodiment.
  • The CPU 21 of the camera body 2A transmits, to the blur correction unit 40 of the interchangeable lens 3A, (a) information on the first position at which image blur is calculated on the image plane 70 and (b) information indicating the sharing ratio between the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A.
  • The blur correction unit 40 of the interchangeable lens 3A calculates the image blur at the first position of the image plane 70, and obtains the image blur V(L) to be shared by the interchangeable lens 3A by the above equation (6).
  • The blur correction unit 21a of the camera body 2A calculates the image blur at the first position of the image plane 70 and the image blur at the second position of the image plane 70, and then obtains the image blur V(B) to be shared by the camera body 2A by the above equation (7).
  • The blur correction unit 40 of the interchangeable lens 3A calculates the target position of the blur correction optical system 33 based on the calculated image blur V(L) and the image blur calculated by the translation blur calculation unit 402, and performs image blur correction by operating the blur correction drive mechanism 37 of the interchangeable lens 3A.
  • The blur correction unit 21a of the camera body 2A calculates the target position of the image sensor 22 based on the calculated image blur V(B) and the image blur calculated by the translation blur calculation unit 202, and performs image blur correction by operating the blur correction drive mechanism 26 of the camera body 2A.
  • In the embodiments described above, image blur at the position where correction is desired is corrected. For this reason, it is conceivable that image blur at the position determined by the CPU 21 on the image plane 70 is suppressed while image blur remains at other positions on the image plane 70. In such a case, image restoration by image processing may be combined.
  • For example, the CPU 21 sends an instruction to the signal processing circuit 27 to execute an image restoration process that makes the remaining image blur less conspicuous, for example by strongly applying edge enhancement processing to the data corresponding to those other positions in the image data generated by the signal processing circuit 27.
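  • One possible way to realize such a restoration step is sketched below: a simple unsharp-mask style sharpening applied only to the regions where residual blur is expected. The use of OpenCV and the specific parameters are assumptions chosen for illustration and are not part of the embodiment.

    # Illustrative sketch only: sharpen (edge-enhance) just the image regions where residual
    # blur is expected to remain, leaving the blur-corrected region untouched.
    import cv2

    def restore_residual_blur(image, residual_mask, amount=1.5, sigma=3.0):
        # image: image data from the signal processing circuit (H x W x 3, uint8).
        # residual_mask: boolean array (H x W), True where image blur is expected to remain.
        blurred = cv2.GaussianBlur(image, (0, 0), sigma)
        sharpened = cv2.addWeighted(image, 1.0 + amount, blurred, -amount, 0)
        out = image.copy()
        out[residual_mask] = sharpened[residual_mask]
        return out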

Abstract

An interchangeable lens is attachable to and detachable from a camera body provided with an imaging element for capturing a subject image, and is provided with: an imaging optical system which forms the subject image on an image surface; an input unit to which a blur amount detected by the interchangeable lens and/or the camera body is inputted; a reception unit which receives information that is used for calculating an off-axis correction amount for correcting blur outside an optical axis on the image surface; and a drive unit which, on the basis of at least the information and the blur amount, drives a movable portion that is at least part of the imaging optical system within a plane orthogonal to the optical axis.

Description

Interchangeable lens and camera body
 The present invention relates to an interchangeable lens and a camera body.
 A technique for suppressing image blur due to camera shake is known (see Patent Document 1). However, such a technique corrects the image blur at the center of the screen.
Japanese Unexamined Patent Publication No. 2007-235806
 According to a first aspect of the present invention, an interchangeable lens that can be attached to and detached from a camera body including an image sensor that captures a subject image includes: an imaging optical system that forms the subject image on an image plane; an input unit to which a blur amount detected by at least one of the interchangeable lens and the camera body is input; a receiving unit that receives information used to calculate an off-axis correction amount for correcting off-axis blur on the image plane; and a drive unit that drives a movable portion, which is at least part of the imaging optical system, within a plane orthogonal to the optical axis based on at least the information and the blur amount.
 According to a second aspect of the present invention, a camera body to which an imaging optical system can be attached and detached includes: an image sensor that captures a subject image formed on an image plane by the imaging optical system; and a transmission unit that transmits, to the imaging optical system, information used to calculate an off-axis correction amount for correcting off-axis blur on the image plane.
 According to a third aspect of the present invention, a camera body to which an imaging optical system can be attached and detached includes: an image sensor that captures a subject image formed on an image plane by the imaging optical system; an input unit to which a blur amount of at least one of the imaging optical system and the image sensor is input; and a calculation unit that calculates an off-axis correction amount for correcting off-axis blur on the image plane based on the blur amount.
FIG. 1 is a diagram showing the main configuration of the camera according to the first embodiment.
FIG. 2 is a diagram illustrating the blur correction unit.
FIG. 3 is a schematic diagram explaining the detection direction of the angular velocity and image blur on the image plane.
FIG. 4 is a diagram explaining the image blur of FIG. 3.
FIG. 5 is a diagram illustrating focus areas formed on the imaging screen.
FIG. 6 is a diagram explaining an example of determining one representative position from a plurality of candidates.
FIG. 7 is a diagram explaining Modification 2 of the first embodiment.
FIG. 8 is a schematic diagram explaining the detection direction of the angular velocity and image blur on the image plane.
FIG. 9 is a diagram explaining an example in which distortion aberration occurs.
FIG. 10 is a diagram showing the main configuration of the camera according to the third embodiment.
FIG. 11 is a diagram illustrating the blur correction unit of the interchangeable lens.
(First embodiment)
 An imaging apparatus equipped with the image blur correction apparatus according to the first embodiment will be described with reference to the drawings. As an example of the imaging apparatus, an interchangeable-lens digital camera (hereinafter referred to as the camera 1) is illustrated, but the camera 1 may be a single-lens reflex type having a mirror 24 in the camera body 2 or a mirrorless type without the mirror 24.
 The camera 1 may also be configured as a lens-integrated camera in which the interchangeable lens 3 and the camera body 2 are integrated.
 Furthermore, the imaging apparatus is not limited to the camera 1 and may be a lens barrel provided with an imaging sensor, a smartphone provided with an imaging function, or the like.
<Main camera configuration>
 FIG. 1 is a diagram illustrating the main configuration of the camera 1. The camera 1 includes a camera body 2 and an interchangeable lens 3. The interchangeable lens 3 is attached to the camera body 2 via a mount unit (not shown). When the interchangeable lens 3 is attached to the camera body 2, the camera body 2 and the interchangeable lens 3 are electrically connected, and communication between the camera body 2 and the interchangeable lens 3 becomes possible.
 Communication between the camera body 2 and the interchangeable lens 3 may also be performed by wireless communication.
 In FIG. 1, light from the subject enters in the negative direction of the Z axis. As shown by the coordinate axes, the direction toward the front of the drawing, orthogonal to the Z axis, is the X-axis plus direction, and the upward direction orthogonal to the Z axis and the X axis is the Y-axis plus direction. In some of the following figures, the coordinate axes are displayed with reference to the coordinate axes of FIG. 1 so that the orientation of each figure can be understood.
<Interchangeable lens>
 The interchangeable lens 3 has an imaging optical system (image-forming optical system) and forms a subject image on the imaging surface of the image sensor 22 provided in the camera body 2. The imaging optical system includes a zoom optical system 31, a focus (focus adjustment) optical system 32, a blur correction optical system 33, and a diaphragm 34. The interchangeable lens 3 further includes a zoom drive mechanism 35, a focus drive mechanism 36, a blur correction drive mechanism 37, a diaphragm drive mechanism 38, and a shake sensor (motion detection unit, shake detection unit) 39.
 The zoom drive mechanism 35 adjusts the magnification of the imaging optical system by moving the zoom optical system 31 forward and backward in the direction of the optical axis L1 based on a signal output from the CPU 21 of the camera body 2. The signal output from the CPU 21 includes information indicating the moving direction, moving amount, moving speed, and the like of the zoom optical system 31.
The focus drive mechanism 36 adjusts the focus of the imaging optical system by moving the focus optical system 32 forward and backward in the direction of the optical axis L1 based on a signal output from the CPU 21 of the camera body 2. The signal output from the CPU 21 at the time of focus adjustment includes information indicating the moving direction, moving amount, moving speed, and the like of the focus optical system 32.
The diaphragm driving mechanism 38 controls the aperture diameter of the diaphragm 34 based on a signal output from the CPU 21 of the camera body 2.
Based on a signal output from the CPU 21 of the camera body 2, the blur correction drive mechanism 37 moves the blur correction optical system 33 forward and backward, within a plane that intersects the optical axis L1, in a direction that cancels blurring of the subject image on the imaging surface of the imaging element 22 (referred to as image blur), thereby suppressing the image blur. The signal output from the CPU 21 includes information indicating the moving direction, moving amount, moving speed, and the like of the blur correction optical system 33.
The shake sensor 39 detects the shake of the camera 1 when the camera 1 swings due to hand shake or the like. The shake sensor 39 includes an angular velocity sensor 39a and an acceleration sensor 39b. It is assumed that image blur is caused by camera shake.
The angular velocity sensor 39a detects an angular velocity generated by the rotational movement of the camera 1. The angular velocity sensor 39a detects, for example, rotation around an axis parallel to the X axis, an axis parallel to the Y axis, and an axis parallel to the Z axis, and sends the detection signals to the CPU 21 of the camera body 2. The angular velocity sensor 39a is also referred to as a gyro sensor.
The acceleration sensor 39b detects acceleration generated by the translational motion of the camera 1. The acceleration sensor 39b detects, for example, acceleration in the directions of an axis parallel to the X axis, an axis parallel to the Y axis, and an axis parallel to the Z axis, and sends the detection signals to the CPU 21 of the camera body 2. The acceleration sensor 39b is also referred to as a G sensor.
In this example, the case where the shake sensor 39 is provided in the interchangeable lens 3 is illustrated, but the shake sensor 39 may be provided in the camera body 2. Further, the shake sensor 39 may be provided on both the camera body 2 and the interchangeable lens 3.
<Camera body>
The camera body 2 includes a CPU 21, an image sensor 22, a shutter 23, a mirror 24, an AF sensor 25, a blur correction drive mechanism 26, a signal processing circuit 27, a memory 28, an operation member 29, and a liquid crystal display unit 30.
The CPU 21 is configured by a CPU, a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and controls each unit of the camera 1 based on a control program. The CPU 21 includes a blur correction unit (correction amount calculation unit) 21a.
The blur correction unit 21a calculates the image blur accompanying the rotational movement of the camera 1 and the image blur accompanying the translational movement of the camera 1. Based on the calculation result of the blur correction unit 21a, the CPU 21 moves the blur correction optical system 33 by means of the blur correction drive mechanism (blur correction drive unit) 37, or moves the image sensor 22 by means of the blur correction drive mechanism (blur correction drive unit) 26.
In the first embodiment, image blurring is suppressed by moving the blur correction optical system 33 of the interchangeable lens 3 constituting the imaging optical system or the imaging element 22. Such suppression of image blur is also referred to as image blur correction. Details of the image blur correction will be described later.
The image sensor 22 in FIG. 1 is configured by a CCD image sensor or a CMOS image sensor. The image sensor 22 receives the light beam that has passed through the imaging optical system at the imaging surface, and photoelectrically converts (captures) the subject image. By the photoelectric conversion, electric charges are generated according to the amount of received light in each of the plurality of pixels arranged on the imaging surface of the image sensor 22. A signal based on the generated charges is read from the image sensor 22 and sent to the signal processing circuit 27.
The shutter 23 controls the exposure time of the image sensor 22. The exposure time of the image sensor 22 can also be controlled by controlling the charge accumulation time in the image sensor 22 (so-called electronic shutter control). The shutter 23 is opened and closed by a shutter drive unit (not shown).
A semi-transmissive quick return mirror (hereinafter referred to as the mirror) 24 is driven by a mirror driving unit (not shown) between a down position in which the mirror 24 is placed in the optical path (illustrated in FIG. 1) and an up position in which the mirror 24 is retracted from the optical path. For example, before the release, the subject light is reflected by the mirror 24 moved to the down position toward a finder unit (not shown) provided above (Y-axis plus direction). Part of the subject light transmitted through the mirror 24 is bent downward (Y-axis minus direction) by the sub mirror 24a and guided to the AF sensor 25.
Immediately after the release switch is pressed, the mirror 24 is rotated to the up position. Thereby, the subject light is guided to the image sensor 22 via the shutter 23.
The AF sensor 25 detects the focus adjustment state of the imaging optical system of the interchangeable lens 3. The CPU 21 performs a known phase difference type focus detection calculation using the detection signal from the AF sensor 25. By this calculation, the CPU 21 obtains the defocus amount of the imaging optical system and calculates the movement amount of the focus optical system 32 based on the defocus amount. The CPU 21 transmits the calculated movement amount of the focus optical system 32 to the focus drive mechanism 36 together with the movement direction and the movement speed.
Based on the signal output from the CPU 21, the blur correction drive mechanism 26 moves the image sensor 22 forward and backward, within a plane that intersects the optical axis L1, in a direction that cancels the image blur, thereby suppressing the image blur on the imaging surface of the image sensor 22. The signal output from the CPU 21 includes information indicating the moving direction, moving amount, moving speed, and the like of the image sensor 22.
The signal processing circuit 27 generates image data related to the subject image based on the image signal read from the image sensor 22. The signal processing circuit 27 performs predetermined image processing on the generated image data. The image processing includes known image processing such as gradation conversion processing, color interpolation processing, contour enhancement processing, and white balance processing.
The memory 28 includes, for example, an EEPROM (Electrically Erasable Programmable Read Only Memory), a flash memory, or the like. In the memory 28, for example, adjustment value information such as a detection gain set in the shake sensor 39 is recorded. Data recording to the memory 28 and data reading from the memory 28 are performed by the CPU 21.
The operation member 29 includes a release button, a recording button, a live view button, various setting switches, and the like, and outputs an operation signal corresponding to each operation to the CPU 21.
In response to an instruction from the CPU 21, the liquid crystal display unit 30 displays an image based on image data, information relating to shooting such as a shutter speed and an aperture value, a menu operation screen, and the like.
The recording medium 50 is composed of, for example, a memory card that can be attached to and detached from the camera body 2. Image data, audio data, and the like are recorded on the recording medium 50. Recording of data on the recording medium 50 and reading of data from the recording medium 50 are performed by the CPU 21.
<Image blur correction>
The camera 1 according to the first embodiment is configured to be able to perform image blur correction by operating the blur correction drive mechanism 37 of the interchangeable lens 3 and image blur correction by operating the blur correction drive mechanism 26 of the camera body 2. In the first embodiment, the CPU 21 operates one of the blur correction drive mechanisms. For example, when an interchangeable lens 3 having the blur correction drive mechanism 37 is attached to the camera body 2, the CPU 21 operates the blur correction drive mechanism 37 of the interchangeable lens 3 to perform image blur correction, and when an interchangeable lens 3 that does not include the blur correction drive mechanism 37 is attached to the camera body 2, the CPU 21 operates the blur correction drive mechanism 26 of the camera body 2 to perform image blur correction.
Note that, as in a third embodiment to be described later, the blur correction drive mechanisms of the interchangeable lens 3 and the camera body 2 may be operated simultaneously.
In general, image blur generated in the camera 1 is classified into image blur accompanying the rotational movement of the camera 1 (also referred to as angle blur) and image blur accompanying the translational movement of the camera 1 (also referred to as translation blur). The blur correction unit 21a calculates the image blur due to the rotational movement of the camera 1 and the image blur due to the translational movement of the camera 1.
FIG. 2 is a diagram illustrating the blur correction unit 21a. The blur correction unit 21a includes an angle blur calculation unit 201, a translation blur calculation unit 202, and a blur correction optical system target position calculation unit (selection unit) 203.
The angle blur calculation unit 201 calculates the image blur in the Y-axis direction due to the rotational motion using the detection signal around the axis parallel to the X-axis (Pitch direction) by the angular velocity sensor 39a. Further, the angle blur calculation unit 201 calculates an image blur in the X-axis direction due to the rotational motion using a detection signal around the axis parallel to the Y-axis (Yaw direction) by the angular velocity sensor 39a.
The translation blur calculation unit 202 calculates the image blur in the X-axis direction due to the translational motion using the detection signal in the X-axis direction by the acceleration sensor 39b. The translation blur calculation unit 202 calculates the image blur in the Y-axis direction due to the translational motion using the detection signal in the Y-axis direction from the acceleration sensor 39b.
The blur correction optical system target position calculation unit 203 adds, for each axis, the image blur in the X-axis and Y-axis directions calculated by the angle blur calculation unit 201 and the image blur in the X-axis and Y-axis directions calculated by the translation blur calculation unit 202, to obtain the total image blur in the X-axis and Y-axis directions. For example, when the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202 have the same direction along a certain axis, the addition makes the image blur larger, whereas when the two calculated image blurs have different directions, the addition makes the image blur smaller. In this way, the addition is performed with positive and negative signs assigned according to the direction of the image blur along each axis.
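The following is a minimal sketch, in Python, of this sign-aware, per-axis summation; the function and variable names are illustrative and not part of the embodiment described above.

def combine_blur(angle_blur_x, angle_blur_y, translation_blur_x, translation_blur_y):
    # Each component is a signed displacement on the image plane (e.g. in mm):
    # the sign encodes its direction along the axis, so components in the same
    # direction add up and components in opposite directions partially cancel.
    total_x = angle_blur_x + translation_blur_x
    total_y = angle_blur_y + translation_blur_y
    return total_x, total_y

# Example: along X the two components have opposite directions and partially cancel.
print(combine_blur(0.04, 0.10, -0.01, 0.02))  # roughly (0.03, 0.12)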
Next, the blur correction optical system target position calculation unit 203 calculates the image blur amount at a predetermined position on the image plane (the imaging surface of the image sensor 22) based on the summed image blur in the X-axis and Y-axis directions, the imaging magnification (calculated from the position of the zoom optical system 31), and the distance from the camera 1 to the subject 80 (calculated from the position of the focus optical system 32).
When the blur correction drive mechanism 37 of the interchangeable lens 3 is operated to perform image blur correction, the blur correction optical system target position calculation unit 203 calculates a target position of the blur correction optical system 33 for moving the blur correction optical system 33 in a direction that cancels the calculated image blur amount.
When the blur correction drive mechanism 26 of the camera body 2 is operated to perform image blur correction, the blur correction optical system target position calculation unit 203 calculates a target position of the image sensor 22 for moving the image sensor 22 in a direction that cancels the calculated image blur amount.
Then, the blur correction optical system target position calculation unit 203 sends a signal indicating the target position to the blur correction drive mechanism 37 of the interchangeable lens 3 or to the blur correction drive mechanism 26 of the camera body 2.
Note that the blur correction optical system target position calculation unit 203 can also send signals indicating the respective target positions to the blur correction drive mechanisms of both the interchangeable lens 3 and the camera body 2.
Further, when the target position transmitted from the camera body 2 exceeds the movable range of the blur correction drive mechanism 37, the blur correction drive mechanism 37 of the interchangeable lens 3 may notify the CPU 21 of the camera body 2 to that effect. As a result, the CPU 21 can take measures such as issuing an alarm notifying that the allowable range of image blur correction has been exceeded.
<Image blurring at a predetermined position>
Calculation of image blur by the angle blur calculation unit 201 will be described in more detail. In the first embodiment, when the angle blur calculation unit 201 calculates the image blur caused by the rotational movement of the camera 1, a position on the image plane (the imaging surface of the image sensor 22) is determined in advance, and the image blur at this position is calculated. The reason for this is that even if the rotation angle of the rotational motion is the same, the image blur differs depending on the position on the image plane.
FIG. 3 is a schematic diagram for explaining the angular velocity detection direction by the angular velocity sensor 39a and the image blur on the image plane 70 (the imaging plane of the imaging device 22). In FIG. 3, the point where the image plane 70 and the optical axis L1 of the interchangeable lens 3 intersect is the origin of coordinates, the optical axis L1 of the interchangeable lens 3 is the Z axis, and the image plane 70 is represented as the XY plane. According to FIG. 3, the optical axis L1 intersects the center of the imaging surface. The interchangeable lens 3 and the subject 80 are positioned in the Z axis plus direction with respect to the image plane 70. The angular velocity sensor 39a detects, for example, the rotation angle θ around the axis (small-x axis) parallel to the X axis (Pitch direction). When the subject 80 is far away, the symbol f in FIGS. 3 and 4 represents the focal length.
When the camera 1 is shaken, the image of the subject 80 located at the coordinates (0, yp) on the image plane 70 before the shake moves in the Y-axis minus direction after the shake. The position of the image of the moved subject 80 is the coordinates (0, yp-Δy2). FIG. 4 is a diagram for explaining the image blur Δy2 in FIG. 3, and represents the YZ plane in FIG. 3.
The image blur Δy2 is expressed by the following formula (1).
Δy2 = f × tan(θ + tan⁻¹(yp/f)) − yp … (1)
Here, θ is the rotation angle in the Pitch direction (the camera shake angle, generally about 0.5 degrees). When the subject 80 is far away, the symbol f in FIGS. 3 and 4 represents the focal length of the interchangeable lens 3.
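As a minimal numerical sketch of formula (1), the following Python snippet evaluates Δy2; the function name and example values are illustrative only.

import math

def delta_y2(f_mm, yp_mm, theta_deg):
    # Formula (1): Δy2 = f * tan(θ + arctan(yp / f)) - yp
    # f_mm: focal length, yp_mm: Y coordinate of the image point on the image plane,
    # theta_deg: rotation angle of the camera in the Pitch direction.
    theta = math.radians(theta_deg)
    return f_mm * math.tan(theta + math.atan(yp_mm / f_mm)) - yp_mm

# Example: f = 24 mm, yp = 12 mm, θ = 0.5 degrees gives roughly 0.26 mm of image blur.
print(round(delta_y2(24.0, 12.0, 0.5), 4))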
For comparison with the above equation (1), the image blur Δy1 of the image of the subject 80 positioned at the coordinate (0, 0) of the center of the image plane 70 before the camera 1 shakes will be described. It is assumed that the angle of rotation of the interchangeable lens 3 in the pitch direction is the same as the above. As the camera 1 shakes, the image of the subject 80 located at the coordinate (0, 0) on the image plane 70 before the shake moves in the Y-axis minus direction after the shake. The position of the image of the moved subject 80 is the coordinates (0, −Δy1).
The image blur Δy1 is expressed by the following formula (2).
Δy1 = f × tanθ … (2)
According to the above formulas (1) and (2), when the focal length f is sufficiently larger than yp, the rotation angle θ (camera shake angle) is generally about 0.5 degrees, so Δy1 ≈ Δy2 can be assumed. In other words, whether the position of the image of the subject 80 on the image plane 70 is at the center of the image plane 70 (the origin in this example) or at a position away from the center, that is, regardless of the distance from the optical axis L1, the image blur can be regarded as substantially the same. This means that the image blur may be calculated at any position on the image plane 70. For this reason, for example, when image blur correction is performed based on the image blur calculated at the center of the image plane 70, the image blur can be suppressed both for the image of the subject 80 positioned at the center of the image plane 70 and for the image of a subject 80 at a position away from the center of the image plane 70.
However, when the focal length f cannot be said to be sufficiently larger than yp, as in the case where the interchangeable lens 3 is a wide-angle lens, Δy1 < Δy2. In that case, it becomes necessary to decide on a specific position on the image plane 70 and calculate the image blur there. For example, if image blur correction is performed based on the image blur calculated at the center of the image plane 70, the image blur of the subject 80 located at the center of the image plane 70 can be suppressed, but for the image of a subject 80 at a position away from the center of the image plane 70, an image blur corresponding to the difference between Δy2 and Δy1 remains uncorrected. The difference between Δy2 and Δy1 increases as the position at which the image blur is calculated moves toward the periphery of the image plane 70, that is, as the image height increases.
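A minimal sketch comparing the two formulas illustrates this behaviour; the values are illustrative, and the two functions simply evaluate formulas (1) and (2).

import math

def delta_y1(f_mm, theta_deg):
    # Formula (2): Δy1 = f * tan(θ), the blur at the center of the image plane.
    return f_mm * math.tan(math.radians(theta_deg))

def delta_y2(f_mm, yp_mm, theta_deg):
    # Formula (1): Δy2 = f * tan(θ + arctan(yp / f)) - yp, the blur at image height yp.
    theta = math.radians(theta_deg)
    return f_mm * math.tan(theta + math.atan(yp_mm / f_mm)) - yp_mm

theta = 0.5  # degrees, a typical camera shake angle
for f in (24.0, 200.0):            # wide-angle vs. telephoto focal length in mm
    for yp in (0.0, 6.0, 12.0):    # image height in mm
        residual = delta_y2(f, yp, theta) - delta_y1(f, theta)
        print(f"f={f:5.0f} mm  yp={yp:4.1f} mm  residual blur={residual:7.4f} mm")
# The residual (Δy2 - Δy1) is negligible at f = 200 mm but grows with image height at f = 24 mm.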
<Position for calculating image blur>
In many cases, the user desires to suppress image blurring of the main subject image of the subject 80 to be imaged. Therefore, the CPU 21 in the first embodiment determines a position where the image of the main subject is highly likely to exist on the image plane 70, as will be described later. Then, the angle blur calculation unit 201 calculates an image blur at a position determined by the CPU 21 and performs an image blur correction based on the image blur.
The CPU 21 selects one of the following methods (1) to (3) in order to determine a position where image blur is calculated. After determining the position for calculating the image blur, the CPU 21 resets (updates) the position for calculating the image blur, for example, when the amount of motion of the camera 1 associated with the composition change is detected. The shake sensor 39 also functions as a motion amount detection unit.
(1) Focus Area Position
The first method is to calculate image blur at the focus area position. FIG. 5 is a diagram illustrating a focus area formed on the imaging screen 90. The focus area is an area where the AF sensor 25 detects the focus adjustment state, and is also referred to as a focus detection area, a distance measuring point, and an autofocus (AF) point. In the first embodiment, eleven focus areas 25P-1 to 25P-11 are provided in the imaging screen 90 in advance. The CPU 21 can obtain the defocus amount in the eleven focus areas.
The number of focus areas 25P-1 to 25P-11 is an example, and the number may be larger or smaller than 11.
The CPU 21 sets the position for calculating the image blur on the image plane 70 to the position corresponding to the selected focus area. Then, the angle blur calculation unit 201 calculates the image blur at the position determined by the CPU 21, and image blur correction is performed based on this image blur. The reason the position for calculating the image blur on the image plane 70 is set to the position corresponding to the selected focus area is that the main subject is highly likely to be present at the position where the defocus amount for focus adjustment is obtained.
For the selection of the focus area, the CPU 21 may select the focus area based on the operation signal from the operation member 29, or the CPU 21 may select the focus area corresponding to the subject 80 close to the camera 1. For example, the CPU 21 can select a focus area corresponding to the subject 80 close to the camera 1 based on the position of the focus optical system 32.
Further, the CPU 21 may select a focus area corresponding to the subject 80 having a high contrast among the images of the subject 80, or may select a focus area corresponding to the subject 80 having a high luminance value among the images of the subject 80.
(2) Subject Position
The second method is to calculate image blur at the position of the object (subject 80). For example, the CPU 21 recognizes an object appearing as the subject 80 in the live view image by a known object recognition process, and sets the position of the object (subject 80) in the live view image as the position of the main subject. Then, the position at which image blur is calculated on the image plane 70 is determined as a position corresponding to the main subject. The angle blur calculation unit 201 calculates an image blur at a position determined by the CPU 21 and performs an image blur correction based on the image blur.
The live view image is a monitor image acquired at a predetermined interval (for example, 60 fps) by the image sensor 22 before the main imaging is performed. For example, when the live view button constituting the operation member 29 is operated, the CPU 21 maintains the state where the mirror 24 is rotated to the up position, and starts the acquisition of the live view image by the image sensor 22. The CPU 21 can also display the live view image on the liquid crystal display unit 30.
For example, the CPU 21 can also track a moving object (subject 80) by sequentially updating the position of the main subject based on each frame of the live view image. In this case, the angle blur calculation unit 201 sequentially calculates the image blur at the position sequentially updated by the CPU 21, thereby performing image blur correction for the moving object (subject 80) while the live view image is being acquired.
Further, even when the camera 1 is panned, the CPU 21 can track the moving object (subject 80) by sequentially updating the position of the main subject in each frame of the live view image.
For example, the CPU 21 may select the second method and start the object recognition processing when the camera 1 is set to an imaging scene mode such as “landscape”, “cooking”, “flower”, or “animal”. In addition, the object to be recognized may be switched according to the imaging scene mode, such as “landscape”, “cooking”, “flower”, or “animal”, set in the camera 1.
(3) Face Position
The third method is to calculate image blur at the position of the face (subject 80). For example, the CPU 21 recognizes the face shown as the subject 80 in the live view image by a known face recognition process, and sets the position of the face in the live view image as the position of the main subject. Then, the position at which image blur is calculated on the image plane 70 is determined as a position corresponding to the main subject. The angle blur calculation unit 201 calculates an image blur at a position determined by the CPU 21 and performs an image blur correction based on the image blur.
For example, when the live view button constituting the operation member 29 is operated, the CPU 21 keeps the mirror 24 rotated to the up position and causes the image sensor 22 to start acquiring live view images.
The CPU 21 can also track the moving face (subject 80) by sequentially updating the position of the main subject based on each frame of the live view image, as in (2) above. The angle blur calculation unit 201 performs image blur correction on the moving face (subject 80) when acquiring the live view image by sequentially calculating the image blur at the position sequentially updated by the CPU 21.
The CPU 21 may select the third method and start the face recognition process when the imaging scene mode of the camera 1 is set to “portrait”, for example.
<When there are multiple positions to calculate image blur>
The methods (1) to (3) above all illustrate the case where only one position for calculating the image blur is determined on the image plane 70. However, as described below, a plurality of positions may become candidates for the position at which the image blur is calculated. Specifically, this is the case when a plurality of focus areas are selected in (1) above, when a plurality of objects (subjects 80) are recognized in (2) above, or when a plurality of faces are recognized in (3) above. In such a case, the CPU 21 selects the following method (4) or (5).
(4) Determine one representative position
The fourth method is to calculate the image blur at one representative position. FIG. 6 is a diagram illustrating an example in which one representative position is determined from a plurality of candidates. For example, when three points on the image plane 70 are candidates, namely a position P-1 corresponding to the focus area 25P-1 in FIG. 5, a position P-2 corresponding to the focus area 25P-2, and a position P-4 corresponding to the focus area 25P-4, the CPU 21 obtains an average position P based on the absolute values of the distances between the candidate positions and the X axis (FIG. 3) and the absolute values of the distances between the candidate positions and the Y axis (FIG. 3), and sets the position P as the representative position. Then, the position at which the image blur is calculated on the image plane 70 is set to the representative position P. In this way, the representative position P is obtained by averaging the absolute values of the distances from the axes (X axis, Y axis) of the image plane 70.
The angle blur calculation unit 201 calculates image blur at the representative position P, and performs image blur correction based on the image blur.
In the description of (4) above, the case where a plurality of focus areas are selected is exemplified, but the same applies to the case where a plurality of objects (subject 80) are recognized or a plurality of faces are recognized. For example, the CPU 21 determines the representative position P as described above based on the positions of the recognized objects and the positions of the recognized faces. The angle blur calculation unit 201 calculates image blur for the representative position P determined by the CPU 21, and performs image blur correction based on the image blur.
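One possible reading of this averaging rule is sketched below. It is only a sketch: it assumes the representative position is formed from the mean absolute distance of the candidates to each coordinate axis, and the candidate coordinates are illustrative.

def representative_position(candidates):
    # candidates: list of (x, y) positions on the image plane, e.g. in mm.
    # The distance to the Y axis is |x| and the distance to the X axis is |y|;
    # averaging those absolute distances gives the representative position P.
    xs = [abs(x) for x, _ in candidates]
    ys = [abs(y) for _, y in candidates]
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Example with three candidate positions P-1, P-2 and P-4 (coordinates are illustrative).
print(representative_position([(-8.0, 6.0), (0.0, 6.0), (8.0, -6.0)]))  # roughly (5.33, 6.0)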
(5) Obtaining one image blur
The fifth method is to calculate one image blur based on a plurality of image blurs. Referring to the example of FIG. 6, on the image plane 70, three points are candidates: the position P-1 corresponding to the focus area 25P-1 in FIG. 5, the position P-2 corresponding to the focus area 25P-2, and the position P-4 corresponding to the focus area 25P-4.
The CPU 21 determines a plurality of positions as positions where image blur is calculated on the image plane 70. The angle blur calculation unit 201 calculates image blur at the position P-1, the position P-2, and the position P-4 on the image plane 70, respectively. The angle blur calculation unit 201 further obtains an average of the plurality of calculated image blurs, and performs image blur correction based on the average value of the image blurs.
The average value of image blur is obtained by, for example, a simple average, but may be obtained by a weighted average.
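A minimal sketch of this averaging follows; the names are hypothetical, and any weights would come from whatever priority is assigned to each position.

def average_blur(blurs, weights=None):
    # Simple mean of the per-position image blur values, or a weighted mean
    # when weights are supplied.
    if weights is None:
        return sum(blurs) / len(blurs)
    return sum(b * w for b, w in zip(blurs, weights)) / sum(weights)

blur_p1, blur_p2, blur_p4 = 0.26, 0.21, 0.24   # illustrative image blurs in mm at P-1, P-2, P-4
print(average_blur([blur_p1, blur_p2, blur_p4]))                   # simple average
print(average_blur([blur_p1, blur_p2, blur_p4], [2.0, 1.0, 1.0]))  # weighted average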
In the description of (5) above, the case where a plurality of focus areas are selected is illustrated, but the same applies to the case where a plurality of objects (subject 80) are recognized or a plurality of faces are recognized. For example, the CPU 21 determines the positions of the plurality of recognized objects and the positions of the plurality of recognized faces as positions at which image blur is calculated on the image plane 70, respectively. The angle blur calculation unit 201 calculates an image blur for each position on the image plane 70. The angle blur calculation unit 201 further obtains an average of the plurality of calculated image blurs, and performs image blur correction based on the average value of the image blurs.
As a modification of (4), one subject may be selected from a plurality of subjects. For example, a subject with a large image blur is selected from the plurality of subjects. Alternatively, a subject that is close to the camera 1 and has a large image blur is selected from the plurality of subjects. Alternatively, a subject with a high image height from the optical axis L1 of the interchangeable lens 3 is selected from the plurality of subjects.
Note that image blur correction in the first embodiment includes correction in the Y-axis direction when the camera 1 rotates in the pitch direction and correction in the X-axis direction when the camera 1 rotates in the Yaw direction.
The description of the first embodiment given above is representative of the correction in the Y-axis direction when the camera 1 rotates in the Pitch direction. When the camera 1 also rotates in the Yaw direction, a correction similar to the correction described above is necessary in the X-axis direction. Since the correction in the Y-axis direction when the camera 1 rotates in the Pitch direction and the correction in the X-axis direction when the camera 1 rotates in the Yaw direction are the same except for the direction, the description of the correction in the X-axis direction is omitted.
In the first embodiment, image blur calculated by the translation blur calculation unit 202 is treated as substantially constant even if the position on the image plane 70 (imaging plane of the image sensor 22) is different.
The outline of the first embodiment is as follows.
The angle blur calculation unit 201 sets the position at which the image blur is calculated to one of the positions on the image plane 70, determined as described above, and calculates the image blur at that position.
The translation blur calculation unit 202 calculates the image blur with the calculation position set, for example, at the center of the image plane 70.
The blur correction optical system target position calculation unit 203 adds the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, assigning positive and negative signs according to the direction along each of the X and Y axes. Then, based on the summed image blur in the X-axis and Y-axis directions, it calculates the image blur amount at the determined position on the image plane 70.
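Putting the outline together, the following is a minimal end-to-end sketch. All names are illustrative; converting the blur amount into an actual lens or sensor displacement would additionally involve mechanism-specific gains and the imaging magnification, which the outline above does not specify.

import math

def angle_blur_y(f_mm, yp_mm, theta_pitch_deg):
    # Angle blur at the decided position yp on the image plane, per formula (1).
    theta = math.radians(theta_pitch_deg)
    return f_mm * math.tan(theta + math.atan(yp_mm / f_mm)) - yp_mm

def total_blur_y(f_mm, yp_mm, theta_pitch_deg, translation_blur_y_mm):
    # Translation blur is treated as constant over the image plane and added with its sign.
    return angle_blur_y(f_mm, yp_mm, theta_pitch_deg) + translation_blur_y_mm

def target_shift_y(total_blur_mm):
    # Target displacement in the direction that cancels the calculated image blur.
    return -total_blur_mm

blur = total_blur_y(f_mm=24.0, yp_mm=12.0, theta_pitch_deg=0.5, translation_blur_y_mm=-0.05)
print(blur, target_shift_y(blur))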
According to the first embodiment described above, the following operational effects can be obtained.
(1) The blur correction device of the camera 1 includes the shake sensor 39 that detects the shake of the camera 1, the blur correction unit 21a that calculates, based on the output of the shake sensor 39, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, and the CPU 21 that determines a position on the image plane 70. The blur correction unit 21a calculates the image blur Δy2 in the Y-axis direction based on the position determined by the CPU 21 and, for example, the shake in the Y-axis direction detected by the shake sensor 39. Thereby, even when the position on the image plane 70 determined by the CPU 21 is a position other than the center of the image plane 70 intersecting the optical axis L1, the image blur can be appropriately suppressed. This is particularly suitable when the focal length f of the interchangeable lens 3 is short (or when the angle of view becomes wide due to the relationship between the size of the image sensor 22 and the focal length f).
(2) In the blur correction device of (1) above, the blur correction unit 21a calculates a larger blur amount as the distance on the image plane 70 from the X-axis (which intersects the Y-axis direction) to the determined position becomes longer, so image blur can be appropriately suppressed even at a position where the image height is high.
(3) In the blur correction device of (2) above, the blur correction unit 21a calculates the blur amount from the output of the shake sensor 39, the above distance, and the focal length of the imaging optical system, so image blur can be appropriately suppressed even when the lens is replaced with an interchangeable lens 3 having a different focal length f.
(4) In the blur correction devices of (1) to (3) above, the CPU 21 sets the position of the focus area that is the target of focus adjustment of the imaging optical system on the image plane 70 as the determined position, so image blur can be appropriately suppressed at a position where the main subject is highly likely to be present.
(5) In the blur correction devices of (1) to (3) above, the CPU 21 decides the determined position based on the contrast information of the subject image, so image blur can be appropriately suppressed at a position where the main subject is highly likely to be present.
(6) In the blur correction devices of (1) to (3) above, the CPU 21 decides the determined position based on the luminance value information of the image of the subject 80, so image blur can be appropriately suppressed at a position where the main subject is highly likely to be present.
(7) In the blur correction devices of (1) to (3) above, the CPU 21 decides the determined position based on subject recognition information based on the image of the subject 80, so image blur can be appropriately suppressed at a position where the main subject is highly likely to be present.
(8) In the blur correction devices of (1) to (3) above, the CPU 21 decides the determined position based on face recognition information based on the image of the subject 80, so image blur can be appropriately suppressed at a position where the main subject is highly likely to be present.
(9) In the blur correction devices of (4) to (8) above, the CPU 21 decides the determined position according to the set imaging scene mode, so image blur can be appropriately suppressed at a position where the main subject is highly likely to be present.
(10) In the blur correction devices of (1) to (3) above, the CPU 21 sets the position designated by a user operation on the image plane 70 as the determined position, so image blur can be appropriately suppressed at the position desired by the user.
(11) In the blur correction devices of (1) to (3) above, the CPU 21 sets, based on shooting distance information, for example the position corresponding to the subject 80 close to the camera 1 as the determined position, so image blur can be appropriately suppressed at the position corresponding to the main subject.
(12) The blur correction devices of (1) to (3) above include the CPU 21, which detects the amount of movement due to a composition change based on the output of the shake sensor 39. When the amount of movement is detected by the CPU 21 after the determined position has been decided by the CPU 21, the blur correction unit 21b calculates the blur amount based on a position obtained by changing the determined position in accordance with the amount of movement. Accordingly, image blur can be appropriately suppressed at a position where the main subject is highly likely to be present after the composition change.
(13) In the blur correction device of (4) above, when there are a plurality of focus areas that are targets of focus adjustment of the imaging optical system, the CPU 21 sets, as the determined position, the centroid (representative position P) of the absolute values of the distances from the axes (X axis, Y axis) of the image plane 70, based on the positions of the plurality of focus areas, so image blur can be appropriately suppressed such that the image blur at the positions of the plurality of focus areas becomes approximately the same.
(14) In the blur correction device of (7) above, when a plurality of subjects exist according to the subject recognition information, the CPU 21 sets, as the determined position, the centroid (representative position P) of the absolute values of the distances from the axes (X axis, Y axis) of the image plane 70, based on the positions of the plurality of subjects, so image blur can be appropriately suppressed such that the image blur at the positions of the plurality of subjects becomes approximately the same.
(15) In the blur correction device of (8) above, when a plurality of faces exist according to the face recognition information, the CPU 21 sets, as the determined position, the centroid (representative position P) of the absolute values of the distances from the axes (X axis, Y axis) of the image plane 70, based on the positions of the plurality of faces, so image blur can be appropriately suppressed such that the image blur at the positions of the plurality of faces becomes approximately the same.
(16) In the blur correction device of (4), when there are a plurality of focus areas that are targets of focus adjustment of the imaging optical system, the CPU 21 sets the positions of the plurality of focus areas as the determined positions, and the blur correction unit 21b calculates an average value of a plurality of shake amounts calculated based on a plurality of determined positions. Thereby, it is possible to appropriately suppress the image blur so that the image blur at the positions of the plurality of focus areas becomes approximately the same.
(17) In the blur correction device of (7) above, when a plurality of main subjects exist according to the subject recognition information, the CPU 21 sets the positions of the plurality of main subjects as the determined positions, and the blur correction unit 21b calculates an average value of the plurality of blur amounts calculated based on the plurality of determined positions. Thereby, image blur can be appropriately suppressed such that the image blur at the positions of the plurality of main subjects becomes approximately the same.
(18) In the blur correction device of (8) above, when a plurality of faces exist according to the face recognition information, the CPU 21 sets the positions of the plurality of faces as the determined positions, and the blur correction unit 21b calculates an average value of the plurality of blur amounts calculated based on the plurality of determined positions. Thereby, image blur can be appropriately suppressed such that the image blur at the positions of the plurality of faces becomes approximately the same.
The following modifications are also within the scope of the invention, and one or a plurality of modifications can be combined with the above-described embodiment or an embodiment described later.
(Modification 1)
In the first embodiment, the image blur correction performed by the camera 1 by operating the blur correction drive mechanism 37 of the interchangeable lens 3 has been described as an example. Instead, in the first modification of the first embodiment, the camera 1 operates the blur correction drive mechanism 26 of the camera body 2 to perform image blur correction. Image blur correction according to the first modification of the first embodiment can be performed in the same manner as in the first embodiment, and the same effects as those in the first embodiment can be obtained.
(Modification 2)
In the third method (3) described in the first embodiment, that is, when the image blur at the position of a face (subject 80) in the image is calculated, the CPU 21 may select the fifth method (5) above, for example when the face appears large on the screen.
FIG. 7 is a diagram for explaining a modification 2 of the first embodiment. An example in which one representative position is determined from a plurality of candidates will be described with reference to FIG. 7. According to FIG. 7, a large face (subject) is shown on the image plane 70. On the image plane 70, the CPU 21 sets, for example, two points as candidate positions: the position P-a of the left edge and the position P-b of the right edge of the detected face.
The CPU 21 sets the two candidate positions as the positions at which the image blur is calculated on the image plane 70. The angle blur calculation unit 201 calculates the image blur at the position P-a and the position P-b on the image plane 70, respectively. The angle blur calculation unit 201 further obtains the average of the calculated image blurs and performs image blur correction based on the average value of the image blurs.
The average value of image blur is obtained by, for example, a simple average, but may be obtained by a weighted average.
As described above, according to the second modification of the first embodiment, when a face appears large, image blur correction can be performed so that the image blur at both ends of the face becomes approximately the same. Thereby, compared with the case where the magnitude of the image blur differs between the left and right sides of the face, the discomfort perceived by the user can be reduced.
(Second Embodiment)
The second embodiment deals with image blur in a direction that intersects (differs from) the direction of the detected angular velocity.
The camera 1 may be the single-lens reflex type illustrated in FIG. 1 or a mirrorless type without the mirror 24.
The camera 1 may also be configured as a lens-integrated camera in which the interchangeable lens 3 and the camera body 2 are integrated.
Furthermore, the imaging apparatus is not limited to the camera 1 and may be a lens barrel provided with an image sensor, a smartphone provided with an imaging function, or the like.
FIG. 8 is a schematic diagram for explaining the detection direction of the angular velocity detected by the angular velocity sensor 39a and the image blur on the image plane 70 (the imaging surface of the image sensor 22). In FIG. 8, the point where the image plane 70 intersects the optical axis L1 of the interchangeable lens 3 is taken as the origin of the coordinates, the optical axis L1 of the interchangeable lens 3 is taken as the Z-axis, and the image plane 70 is represented as the X-Y plane. In FIG. 8, the optical axis L1 intersects the center of the imaging surface. The interchangeable lens 3 and the subject 80 are located on the plus side of the Z-axis with respect to the image plane 70. The angular velocity sensor 39a detects, for example, the rotation angle θ about an axis parallel to the X-axis (the Pitch direction). When the subject 80 is far away, the symbol f in FIGS. 3 and 4 represents the focal length.
When the camera 1 shakes, the image of the subject 80 that was located at the coordinates (xp, yp) on the image plane 70 before the shake moves in the minus Y-axis direction and the plus X-axis direction after the shake. Accordingly, the coordinates of the image of the subject 80 become (xp + Δx2, yp − Δy2).
The mathematical expression for the image blur Δy2 in the Y-axis direction is the above expression (1), as in the case described in the first embodiment.
On the other hand, the image blur Δx2 in the X-axis direction is expressed by the following expression (3):
 Δx2 = f × xp / [(f² + yp²)^(1/2) × cos(θ + tan⁻¹(yp/f))] − xp  …(3)
where θ is the rotation angle in the Pitch direction (the camera shake angle, generally about 0.5 degrees). When the subject 80 is far away, the symbol f in FIGS. 3 and 4 represents the focal length of the interchangeable lens 3.
According to expressions (1) and (3) above, when the focal length f is sufficiently large compared with yp, Δx2 can be regarded as approximately 0, since the rotation angle θ (camera shake angle) is generally about 0.5 degrees. That is, whether the image of the subject 80 is at the center of the image plane 70 (the origin in this example) or at a position away from the center, in other words, regardless of the distance from the optical axis L1, only the Y-axis direction needs to be considered for the image blur when the rotation angle θ is detected in the Pitch direction, and the X-axis direction can be ignored. For this reason, if, for example, image blur correction is performed in the Y-axis direction based on the image blur calculated at the center of the image plane 70, the image blur can be suppressed both for the image of the subject 80 located at the center of the image plane 70 and for the image of the subject 80 located away from the center.
However, when the focal length f cannot be said to be sufficiently large compared with yp, as in the case where the interchangeable lens 3 is a wide-angle lens, Δx2 ≠ 0 according to expression (3). Therefore, when the rotation angle θ in the Pitch direction is detected, it becomes necessary not only to calculate the image blur in the Y-axis direction using expression (1) but also to calculate the image blur in the X-axis direction using expression (3). Otherwise, the image blur in the X-axis direction corresponding to Δx2 in expression (3) cannot be suppressed and remains. The image blur Δx2 becomes larger as the position at which the image blur is calculated moves toward the periphery of the image plane 70, that is, as the image height increases.
The CPU 21 determines the position at which image blur is calculated on the image plane 70 in the same manner as in the first embodiment. That is, the CPU 21 selects one of the methods (1) to (4) above and determines the position at which the image blur is calculated on the image plane 70. The angle blur calculation unit 201 then calculates the image blur at the position determined by the CPU 21. The blur correction optical system target position calculation unit 203 calculates the image blur amount based on the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202.
Note that the image blur correction in the second embodiment includes correction in the Y-axis direction when the camera 1 rotates in the Pitch direction and correction in the X-axis direction when the camera 1 rotates in the Yaw direction.
The above description of the second embodiment explains that, when correcting in the Y-axis direction for rotation of the camera 1 in the Pitch direction, the X-axis direction is also corrected if the focal length f cannot be said to be sufficiently large compared with yp.
When the camera 1 rotates in the Yaw direction, a correction similar to the one described above is required for the Y-axis direction. That is, although a description with reference to the drawings is omitted, when correcting in the X-axis direction for rotation of the camera 1 in the Yaw direction, the Y-axis direction is also corrected if the focal length f cannot be said to be sufficiently large compared with xp.
When the camera 1 rotates in both the Pitch direction and the Yaw direction, the image blurs with respect to the X-axis and the Y-axis caused by the two rotational motions occur simultaneously. The image blurs caused by the two rotational motions are therefore added for each of the X-axis and Y-axis directions, with positive and negative signs according to their directions, and the corrections in the X-axis and Y-axis directions are performed based on the image blur after the addition.
Also in the second embodiment, as in the first embodiment, the image blur calculated by the translation blur calculation unit 202 is treated as substantially constant even if the position on the image plane 70 (the imaging surface of the image sensor 22) differs.
The outline of the second embodiment is as follows.
The angle blur calculation unit 201 sets the position at which image blur is calculated to some position on the image plane 70 and calculates the image blur. At this time, when, for example, the rotation angle θ in the Pitch direction is detected, not only the image blur in the Y-axis direction is calculated by expression (1) but also the image blur in the X-axis direction is calculated by expression (3).
The translation blur calculation unit 202 sets the position at which image blur is calculated to, for example, the center of the image plane 70 and calculates the image blur.
The blur correction optical system target position calculation unit 203 adds the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, with positive and negative signs according to their directions along the X-axis and the Y-axis. It then calculates the image blur amount at the position on the image plane 70 based on the image blur in the X-axis and Y-axis directions after the addition.
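The per-axis combination performed here can be summarized by the short sketch below, under assumed variable names; the angle blur is taken to have been evaluated at the determined position via expressions (1) and (3), and the translation blur at the image-plane center.

```python
def combined_blur(angle_blur_xy, translation_blur_xy):
    """Component-wise signed addition of the angle blur (evaluated at the
    determined position) and the translation blur (evaluated at the center).
    Each argument is an (x, y) pair whose components already carry signs."""
    ax, ay = angle_blur_xy
    tx, ty = translation_blur_xy
    return (ax + tx, ay + ty)

# Example with assumed values in mm.
print(combined_blur((0.05, -0.12), (-0.01, 0.02)))  # approximately (0.04, -0.10)
```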
According to the second embodiment described above, the following operational effects are obtained.
(1) The blur correction device of the camera 1 includes a shake sensor 39 that detects shake of the device in the Y-axis direction, and a blur correction unit 21a that calculates, based on the output of the shake sensor 39, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system. The blur correction unit 21a calculates the image blur in the X-axis direction that intersects the Y-axis direction. This makes it possible to suppress image blur in the X-axis direction, which intersects the Y-axis direction in which the shake was detected.
(2) In the blur correction device of (1) above, the blur correction unit 21a also calculates the image blur in the Y-axis direction, so image blur in the Y-axis direction, in which the shake was detected, can be suppressed.
(3) The blur correction device of (1) or (2) above further includes the CPU 21, which determines a position on the image plane 70. The blur correction unit 21a calculates the image blur amounts in the X-axis direction and the Y-axis direction based on the position determined by the CPU 21 and the rotation angle in the Y-axis direction detected by the shake sensor 39. As a result, image blur can be appropriately suppressed even when the position on the image plane 70 determined by the CPU 21 is other than the center of the image plane 70. This is particularly suitable when the focal length f of the interchangeable lens 3 is short (or when the angle of view is wide because of the relationship between the size of the image sensor 22 and the focal length f).
The following modifications are also within the scope of the invention, and one or more of them may be combined with the embodiment described above or with the embodiments described later.
(Modification 3)
In the image blur correction described in the second embodiment, the CPU 21 performs correction that takes into account the optical distortion produced by the interchangeable lens 3. FIG. 9 is a diagram illustrating an example in which distortion aberration (for example, barrel distortion) is produced by the interchangeable lens 3. The many solid-line circles represent the images of the subject 80 on the assumption that the interchangeable lens 3 has no distortion aberration. In contrast, the many hatched circles represent the images of the subject 80 distorted by the barrel distortion resulting from the optical characteristics of the interchangeable lens 3.
In general, although the distortion aberration of the interchangeable lens 3 varies with the design, it is often large in a wide-angle lens with a short focal length. For this reason, as illustrated in FIG. 9, the amount of distortion increases with the distance from the optical axis L1 of the imaging optical system (or, when the center O of the image plane 70 is aligned with the optical axis L1, with the distance from the center O of the image plane 70). The amount of distortion appears as the positional deviation between the solid-line circles and the hatched circles shown in FIG. 9. In the example of FIG. 9, this deviation is largest at positions far from the center O of the image plane 70 (in other words, at high image heights); for example, at the lower-right position the deviation is Δx in the X-axis direction and Δy in the Y-axis direction.
The schematic diagram illustrated in FIG. 8 assumes that there is no distortion caused by the imaging optical system, as with the solid-line circles in FIG. 9. Therefore, when, for example, the position at which image blur is calculated on the image plane 70 is set at a position away from the center O of the image plane 70 and the image blur correction described in the second embodiment is performed as it is, image blur that cannot be fully corrected occurs if distortion aberration is present.
In Modification 3 of the second embodiment, therefore, when the image blur correction described in the second embodiment is performed with an interchangeable lens 3 having large distortion aberration attached to the camera body 2, the correction is performed on the assumption that distortion caused by the imaging optical system is present, as with the hatched circles in FIG. 9.
The distortion aberration information indicating, as with the hatched circles in FIG. 9, the direction and magnitude of the distortion at each position on the image plane 70 is known as design information of the interchangeable lens 3. For this reason, the distortion aberration information of the interchangeable lens 3 attached to the camera body 2 is recorded in the memory 28 in advance. When the CPU 21 detects that an interchangeable lens 3 with large distortion aberration is attached, it reads the corresponding distortion aberration information from the memory 28 and uses it in the above-described calculation of the image blur.
The blur correction optical system target position calculation unit 203 of the blur correction unit 21a adds, for each of the X-axis and the Y-axis, the image blur calculated by the angle blur calculation unit 201, the image blur calculated by the translation blur calculation unit 202, and the distortion aberration information read from the memory 28, with positive and negative signs according to their directions. It then calculates the image blur amount at the position on the image plane 70 based on the image blur in the X-axis and Y-axis directions after the addition.
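A minimal sketch of this distortion-aware combination is given below, assuming the per-lens distortion information can be looked up as a signed (Δx, Δy) offset for the position being corrected; the lookup helper and its data are assumptions, since the text only states that the distortion information is stored in the memory 28 as lens design data.

```python
def blur_with_distortion(angle_blur_xy, translation_blur_xy, distortion_offset_xy):
    """Signed, component-wise sum of the angle blur, the translation blur and
    the distortion offset read out for the position where blur is corrected."""
    return tuple(a + t + d for a, t, d in
                 zip(angle_blur_xy, translation_blur_xy, distortion_offset_xy))

# Hypothetical usage: the offset would come from the distortion data of the
# attached interchangeable lens 3 held in the memory 28.
# offset = distortion_lookup(lens_id, position)   # assumed helper
# target = blur_with_distortion((0.05, -0.12), (-0.01, 0.02), offset)
```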
Although an example in which barrel distortion occurs has been described above, the same applies when pincushion distortion occurs.
According to Modification 3 of the second embodiment described above, image blur can be corrected appropriately even when distortion aberration is present.
Also, even when the optical distortion produced by the interchangeable lens 3 is large, image blur can be appropriately suppressed at positions other than the center of the image plane 70.
(Third Embodiment)
In the third embodiment, an interchangeable lens 3A is attached to a camera body 2A. The interchangeable lens 3A differs from the interchangeable lens 3 in that a blur correction unit 40 is added. The detection signal from the shake sensor 39 is sent to the blur correction unit 40.
The camera body 2A differs from the camera body 2 in that a shake sensor (motion detection unit, shake detection unit) 31 is added. The detection signal from the shake sensor 31 is sent to the CPU 21 (blur correction unit 21a). The shake sensor 31 has the same functions as the shake sensor 39.
In the third embodiment, when the interchangeable lens 3A including the blur correction drive mechanism 37 is attached to the camera body 2A, image blur correction performed by operating the blur correction drive mechanism 37 of the interchangeable lens 3A and image blur correction performed by operating the blur correction drive mechanism 26 of the camera body 2A are used together.
On the other hand, when an interchangeable lens 3A that does not include the blur correction drive mechanism 37 is attached to the camera body 2A, image blur correction similar to that of Modification 1 of the first embodiment is performed by operating the blur correction drive mechanism 26 of the camera body 2A.
FIG. 10 is a diagram showing the main configuration of a camera 1A according to the third embodiment. The camera 1A includes the camera body 2A and the interchangeable lens 3A. The interchangeable lens 3A is attached to the camera body 2A via a mount portion (not shown). When the interchangeable lens 3A is attached to the camera body 2A, the camera body 2A and the interchangeable lens 3A are electrically connected, and communication between the camera body 2A and the interchangeable lens 3A becomes possible. The communication between the camera body 2A and the interchangeable lens 3A may be performed by wireless communication.
In FIG. 10, components that are the same as in FIG. 1 are given the same reference numerals as in FIG. 1, and their description is omitted.
FIG. 11 is a diagram illustrating the blur correction unit 40 of the interchangeable lens 3A. The blur correction unit 40 includes an angle blur calculation unit 401, a translation blur calculation unit 402, and a blur correction optical system target position calculation unit 403.
The angle blur calculation unit 401 uses the detection signal from the angular velocity sensor 39a about the axis parallel to the X-axis (the Pitch direction) to calculate the image blur in the Y-axis direction caused by the rotational motion and, when necessary, the image blur in the X-axis direction. The angle blur calculation unit 401 also uses the detection signal from the angular velocity sensor 39a about the axis parallel to the Y-axis (the Yaw direction) to calculate the image blur in the X-axis direction caused by the rotational motion and, when necessary, the image blur in the Y-axis direction.
The translation blur calculation unit 402 uses the detection signal in the X-axis direction from the acceleration sensor 39b to calculate the image blur in the X-axis direction caused by the translational motion. The translation blur calculation unit 402 also uses the detection signal in the Y-axis direction from the acceleration sensor 39b to calculate the image blur in the Y-axis direction caused by the translational motion.
The blur correction optical system target position calculation unit 403 adds the image blurs in the X-axis and Y-axis directions calculated by the angle blur calculation unit 401 and the image blurs in the X-axis and Y-axis directions calculated by the translation blur calculation unit 402 to obtain the image blur in the X-axis and Y-axis directions.
The blur correction optical system target position calculation unit 403 also calculates the image blur amount at a position on the image plane 70, described later, based on the added image blur in the X-axis and Y-axis directions, the imaging magnification (calculated from the position of the zoom optical system 31), and the distance from the camera 1A to the subject 80 (calculated from the position of the focus optical system 32).
To operate the blur correction drive mechanism 37 of the interchangeable lens 3A, the blur correction optical system target position calculation unit 403 calculates the target position of the blur correction optical system 33 based on the calculated image blur amount.
The blur correction optical system target position calculation unit 403 then sends a signal indicating the target position to the blur correction drive mechanism 37 of the interchangeable lens 3A.
Note that the camera 1A may be the single-lens reflex type illustrated in FIG. 10 or a mirrorless type without the mirror 24.
The camera may also be configured as a lens-integrated camera in which the interchangeable lens 3A and the camera body 2A are integrated, provided that it includes the blur correction drive mechanism 26 that moves the image sensor 22 and the blur correction drive mechanism 37 that moves the blur correction optical system 33.
<Image blur correction used in combination>
Image blur correction in which the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A are used together is described below.
The calculation of the image blur by the angle blur calculation unit 201 and the calculation of the image blur by the translation blur calculation unit 202 are the same as in the first and second embodiments.
However, the following points differ from the first and second embodiments. One difference is that, in the image blur correction by the interchangeable lens 3A, the center of the image plane 70 is selected as the position for calculating the image blur, whereas in the image blur correction by the camera body 2A, some position on the image plane 70 is selected as the position for calculating the image blur.
The other difference is that the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A are performed based on a sharing ratio determined by the CPU 21 of the camera body 2A. The sharing ratio is described later.
<Position for calculating image blur>
The CPU 21 sets the position at which the blur correction unit 40 of the interchangeable lens 3A calculates the image blur to, for example, the center of the image plane 70, and sets the position at which the blur correction unit 21a of the camera body 2A calculates the image blur to some position on the image plane 70. The angle blur calculation unit 401 of the interchangeable lens 3A thus calculates the blur correction amount (L) based on the image blur at the center position of the image plane 70 and the sharing ratio of the interchangeable lens 3A determined by the CPU 21. The angle blur calculation unit 201 of the camera body 2A calculates the blur correction amount (B) based on the image blur at the position, different from the center of the image plane 70, determined by the CPU 21 and the sharing ratio of the camera body 2A determined by the CPU 21.
When setting the position at which the image blur is calculated to a position different from the center of the image plane 70, the CPU 21 determines the position by one of the methods (1) to (4) in the first embodiment.
When the position at which the image blur is calculated is the center of the image plane 70, the mathematical expression for the image blur Δy1 in the Y-axis direction is the above expression (2), as described in the first embodiment.
When the position at which the image blur is calculated is a position different from the center of the image plane 70, the mathematical expression for the image blur Δy2 in the Y-axis direction is the above expression (1), as described in the first embodiment.
<Sharing ratio>
The CPU 21 determines the sharing ratio between the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A. In this example, the CPU 21 sets the sharing ratio to, for example, 50:50. The ratio may instead be 70:30 or 40:60.
When the sharing ratio determined by the CPU 21 is 50:50, the angle blur calculation unit 401 of the interchangeable lens 3A obtains the image blur V(L) to be shared by the interchangeable lens 3A as shown in the following expression (4). The factor 1/2 on the right-hand side results from setting the sharing ratio to 50%.
 V(L) = Δy1/2
    = f × tanθ / 2  …(4)
where Δy1 is the image blur in the Y-axis direction at the center of the image plane 70, and θ is the rotation angle in the Pitch direction (the camera shake angle, generally about 0.5 degrees). When the subject 80 is far away, the symbol f represents the focal length of the interchangeable lens 3A.
On the other hand, when the sharing ratio determined by the CPU 21 is 50:50, the angle blur calculation unit 201 of the camera body 2A obtains the image blur V(B) to be shared by the camera body 2A as shown in the following expression (5):
 V(B) = Δy1/2 + d
    = f × tanθ / 2 + d  …(5)
where d = Δy2 − Δy1, and Δy2 is the image blur in the Y-axis direction at the position different from the center of the image plane 70.
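Written as code under assumed names, the split of expressions (4) and (5) can be sketched as follows; generalizing the stated 50:50 and 100:0 cases to an arbitrary lens share is an assumption here, consistent with both examples given in the text.

```python
import math

def split_angle_blur(f: float, theta_rad: float, dy2: float, lens_share: float = 0.5):
    """Share the Y-axis angle blur between the interchangeable lens 3A and the
    camera body 2A.

    dy1 = f*tan(theta) is the blur at the image-plane center (expression (2));
    dy2 is the blur at the position determined by the CPU 21 (expression (1));
    the difference d = dy2 - dy1 is handled entirely on the body side.
    lens_share=0.5 reproduces expressions (4) and (5); lens_share=1.0
    reproduces the 100:0 case mentioned later as an alternative.
    """
    dy1 = f * math.tan(theta_rad)
    v_lens = lens_share * dy1                          # V(L)
    v_body = (1.0 - lens_share) * dy1 + (dy2 - dy1)    # V(B)
    return v_lens, v_body
```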
The blur correction optical system target position calculation unit 403 of the interchangeable lens 3A calculates the target position of the blur correction optical system 33 for the image blur correction performed by operating the blur correction drive mechanism 37 of the interchangeable lens 3A, based on the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402.
Similarly, the blur correction optical system target position calculation unit 203 of the camera body 2A calculates the target position of the image sensor 22 for the image blur correction performed by operating the blur correction drive mechanism 26 of the camera body 2A, based on the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202.
The blur correction optical system target position calculation unit 403 of the interchangeable lens 3A further sends a signal indicating the target position to the blur correction drive mechanism 37 of the interchangeable lens 3A, and the blur correction optical system target position calculation unit 203 of the camera body 2A further sends a signal indicating the target position to the blur correction drive mechanism 26 of the camera body 2A.
In the third embodiment, the image blur correction by the interchangeable lens 3A is performed based on the image blur calculated by the angle blur calculation unit 401 at the center position of the image plane 70, and the image blur correction by the camera body 2A is performed based on the image blur calculated by the angle blur calculation unit 201 at a position different from the center of the image plane 70.
Note that the image blur correction in the third embodiment includes correction in the Y-axis direction when the camera 1A rotates in the Pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction.
The above description of the third embodiment has, as a representative case, dealt with the correction in the Y-axis direction when the camera 1A rotates in the Pitch direction. Therefore, when the camera 1A also rotates in the Yaw direction, a correction similar to the one described above is required for the X-axis direction.
The correction in the Y-axis direction when the camera 1A rotates in the Pitch direction and the correction in the X-axis direction when the camera 1A rotates in the Yaw direction are the same except for the direction, so the description for the X-axis direction is omitted.
When the camera 1A rotates in both the Pitch direction and the Yaw direction, the image blurs with respect to the X-axis and the Y-axis caused by the two rotational motions occur simultaneously. The image blurs caused by the two rotational motions are therefore added for each of the X-axis and Y-axis directions, with positive and negative signs according to their directions, and the corrections in the X-axis and Y-axis directions are performed based on the image blur after the addition.
In the third embodiment, as in the first and second embodiments, the image blur calculated by the translation blur calculation unit 202 and the translation blur calculation unit 402 is treated as substantially constant even if the position on the image plane 70 differs.
The outline of the third embodiment is as follows.
The angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates the image blur at the center position of the image plane 70. The angle blur calculation unit 201 of the blur correction unit 21a of the camera body 2A calculates the image blur at a position different from the center of the image plane 70.
The angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) to be shared by the interchangeable lens 3A (for example, at a sharing ratio of 50%) to 1/2 of the image blur Δy1 at the center of the image plane 70, and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) to be shared by the camera body 2A to V(L) + d, where d is the difference between the image blur Δy2 at the position different from the center of the image plane 70 and Δy1.
The translation blur calculation unit 402 of the interchangeable lens 3A sets the image blur to be shared by the interchangeable lens 3A (for example, at a sharing ratio of 50%) to, for example, 1/2 of the image blur at the center of the image plane 70. The translation blur calculation unit 202 of the camera body 2A sets the image blur to be shared by the camera body 2A to, for example, 1/2 of the image blur at the center of the image plane 70.
The blur correction optical system target position calculation unit 403 of the interchangeable lens 3A adds the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402, with positive and negative signs according to their directions along the X-axis and the Y-axis, and calculates the image blur amount at the center position of the image plane 70 based on the image blur in the X-axis and Y-axis directions after the addition.
The blur correction optical system target position calculation unit 203 of the camera body 2A adds the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, with positive and negative signs according to their directions along the X-axis and the Y-axis, and calculates the image blur amount at the position different from the center of the image plane 70 based on the image blur in the X-axis and Y-axis directions after the addition.
According to the third embodiment described above, the following operational effects are obtained.
(1) The blur correction device of the camera 1A includes, in the interchangeable lens 3A, the shake sensor 39 that detects shake of the device, the blur correction unit 40 that calculates, based on the output of the shake sensor 39, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, and the blur correction drive mechanism 37 that moves the blur correction optical system 33, based on the output of the blur correction unit 40, in the direction that suppresses the blur amount. It also includes, in the camera body 2A, the shake sensor 31 that detects shake of the device, the blur correction unit 21b that calculates, based on the output of the shake sensor 31, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, the blur correction drive mechanism 26 that moves the image sensor 22, which captures the image of the subject 80 on the image plane 70, in the direction that suppresses the blur amount based on the output of the blur correction unit 21a, and the CPU 21 that determines a position on the image plane 70.
The blur correction unit 40 of the interchangeable lens 3A calculates the image blur Δy1 based on a first position predetermined on the image plane 70 (the center of the image plane 70) and the shake detected by the shake sensor 39. The blur correction unit 40 sets the image blur V(L) to be shared by the interchangeable lens 3A (for example, at a sharing ratio of 50%) to 1/2 of the image blur Δy1.
The blur correction unit 21b of the camera body 2A calculates the image blur Δy2 based on the second position determined by the CPU 21 (a position different from the center) and the shake detected by the shake sensor 31, and the image blur Δy1 based on the first position predetermined on the image plane 70 (the center of the image plane 70) and the shake detected by the shake sensor 31. The blur correction unit 21b further calculates the difference d between the image blur Δy2 and the image blur Δy1. The angle blur calculation unit 201 sets the image blur V(B) to be shared by the camera body 2A to V(L) + d.
As a result, image blur can be appropriately suppressed even when the position determined by the CPU 21 is other than the center of the image plane 70. This is particularly suitable when the focal length f of the interchangeable lens 3A is short (or when the angle of view is wide because of the relationship between the size of the image sensor 22 and the focal length f).
(2) In the blur correction device of (1) above, the blur correction unit 40 of the interchangeable lens 3A outputs 50% of the image blur Δy1 to the blur correction drive mechanism 37, and the blur correction unit 21a of the camera body 2A outputs the remaining 50% of the image blur Δy1 and the difference d to the blur correction drive mechanism 26. Compared with the case where the blur correction drive mechanism 26 and the blur correction drive mechanism 37 are not used together, the distances moved by the blur correction drive mechanism 26 and the blur correction drive mechanism 37 can each be kept small.
Note that the sharing ratio determined by the CPU 21 may be set to 100% for the image blur correction by the interchangeable lens 3A and 0% for the image blur correction by the camera body 2A. In this case, the angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) to be shared by the interchangeable lens 3A to 100%, and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) to be shared by the camera body 2A to d, where d is the difference between the image blur Δy2 at the position different from the center of the image plane 70 and the image blur Δy1 at the center of the image plane 70.
The following modifications are also within the scope of the invention, and one or more of them may be combined with the embodiment described above or with the embodiments described later.
(Modification 4)
In Modification 4 of the third embodiment, the CPU 21 determines, for example, two positions on the image plane 70 (referred to as a first position and a second position) as positions at which the image blur is calculated. The angle blur calculation unit 401 of the interchangeable lens 3A calculates the image blur for the first position determined by the CPU 21. The angle blur calculation unit 201 of the camera body 2A calculates the image blur for the first position and the second position determined by the CPU 21. The CPU 21 determines the first position and the second position at which the image blur is calculated by one of the methods (1) to (4) in the first embodiment.
Modification 4 of the third embodiment differs from the third embodiment in that it includes the case where both the first position and the second position are positions different from the center of the image plane 70. On the other hand, as in the third embodiment, the CPU 21 in Modification 4 of the third embodiment determines the sharing ratio between the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A.
When the first position or the second position at which the image blur is calculated is the center of the image plane 70, the mathematical expression for the image blur Δy1 in the Y-axis direction is the above expression (2), as described in the first embodiment.
When the first position and the second position at which the image blur is calculated are positions different from the center of the image plane 70, the mathematical expression for the image blur Δy2 in the Y-axis direction is the above expression (1), as described in the first embodiment.
When the sharing ratio determined by the CPU 21 is, for example, 50:50, the angle blur calculation unit 401 of the interchangeable lens 3A obtains the image blur V(L) to be shared by the interchangeable lens 3A as shown in the following expression (6). The factor 1/2 on the right-hand side results from setting the sharing ratio to 50%.
 V(L) = Δy2a/2  …(6)
where Δy2a is the image blur in the Y-axis direction at the first position, which is different from the center of the image plane 70.
When the sharing ratio determined by the CPU 21 is 50:50, the angle blur calculation unit 201 of the camera body 2A obtains the image blur V(B) to be shared by the camera body 2A as shown in the following expression (7):
 V(B) = Δy2a/2 + d2  …(7)
where d2 = Δy2b − Δy2a, and Δy2b is the image blur in the Y-axis direction at the second position, which is different from the center of the image plane 70.
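As a quick consistency check, a sketch analogous to the earlier split and using assumed names, expressions (6) and (7) imply that the lens-side and body-side shares together amount to the blur at the second position, since V(L) + V(B) = Δy2a/2 + Δy2a/2 + (Δy2b − Δy2a) = Δy2b.

```python
def split_angle_blur_mod4(dy2a: float, dy2b: float, lens_share: float = 0.5):
    """Modification 4 split: the lens corrects a share of the blur at the first
    position (expression (6)); the body corrects the remainder plus
    d2 = dy2b - dy2a (expression (7))."""
    v_lens = lens_share * dy2a
    v_body = (1.0 - lens_share) * dy2a + (dy2b - dy2a)
    return v_lens, v_body

# With assumed values, the two shares sum to the blur at the second position.
vl, vb = split_angle_blur_mod4(dy2a=0.08, dy2b=0.11)
assert abs((vl + vb) - 0.11) < 1e-12
```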
The blur correction optical system target position calculation unit 403 of the interchangeable lens 3A calculates the target position of the blur correction optical system 33 for the image blur correction performed by operating the blur correction drive mechanism 37 of the interchangeable lens 3A, based on the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402.
Similarly, the blur correction optical system target position calculation unit 203 of the camera body 2A calculates the target position of the image sensor 22 for the image blur correction performed by operating the blur correction drive mechanism 26 of the camera body 2A, based on the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202.
The blur correction optical system target position calculation unit 403 of the interchangeable lens 3A further sends a signal indicating the target position to the blur correction drive mechanism 37 of the interchangeable lens 3A, and the blur correction optical system target position calculation unit 203 of the camera body 2A further sends a signal indicating the target position to the blur correction drive mechanism 26 of the camera body 2A.
In Modification 4 of the third embodiment, the image blur correction by the interchangeable lens 3A is performed based on the image blur calculated by the angle blur calculation unit 401 at the first position on the image plane 70, and the image blur correction by the camera body 2A is performed based on the image blur calculated by the angle blur calculation unit 201 at the second position on the image plane 70.
Note that the image blur correction in Modification 4 of the third embodiment includes correction in the Y-axis direction when the camera 1A rotates in the Pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction.
The above description of Modification 4 of the third embodiment has, as a representative case, dealt with the correction in the Y-axis direction when the camera 1A rotates in the Pitch direction. Therefore, when the camera 1A also rotates in the Yaw direction, a correction similar to the one described above is required for the X-axis direction.
The correction in the Y-axis direction when the camera 1A rotates in the Pitch direction and the correction in the X-axis direction when the camera 1A rotates in the Yaw direction are the same except for the direction, so the description for the X-axis direction is omitted.
When the camera 1A rotates in both the Pitch direction and the Yaw direction, the image blurs with respect to the X-axis and the Y-axis caused by the two rotational motions occur simultaneously. The image blurs caused by the two rotational motions are therefore added for each of the X-axis and Y-axis directions, with positive and negative signs according to their directions, and the corrections in the X-axis and Y-axis directions are performed based on the image blur after the addition.
In Modification 4 of the third embodiment, as in the first to third embodiments, the image blur calculated by the translation blur calculation unit 202 and the translation blur calculation unit 402 is treated as substantially constant even if the position on the image plane 70 (the imaging surface of the image sensor 22) differs.
The outline of Modification 4 of the third embodiment is as follows.
The angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates the image blur at the first position on the image plane 70. The angle blur calculation unit 201 of the blur correction unit 21a of the camera body 2A calculates the image blur at the second position on the image plane 70.
The angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) to be shared by the interchangeable lens 3A (for example, at a sharing ratio of 50%) to 1/2 of the image blur Δy2a at the first position on the image plane 70, and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) to be shared by the camera body 2A to V(L) + d2, where d2 is the difference between the image blur Δy2b at the second position on the image plane 70 and Δy2a.
The translation blur calculation unit 402 of the interchangeable lens 3A sets the image blur to be shared by the interchangeable lens 3A (for example, at a sharing ratio of 50%) to, for example, 1/2 of the image blur at the center of the image plane 70. The translation blur calculation unit 202 of the camera body 2A sets the image blur to be shared by the camera body 2A to, for example, 1/2 of the image blur at the center of the image plane 70.
The blur correction optical system target position calculation unit 403 of the interchangeable lens 3A adds the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402, with positive and negative signs according to their directions along the X-axis and the Y-axis, and calculates the image blur amount at the first position on the image plane 70 based on the image blur in the X-axis and Y-axis directions after the addition.
The blur correction optical system target position calculation unit 203 of the camera body 2A adds the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, with positive and negative signs according to their directions along the X-axis and the Y-axis, and calculates the image blur amount at the second position on the image plane 70 based on the image blur in the X-axis and Y-axis directions after the addition.
According to Modification 4 of the third embodiment described above, the following operational effects are obtained.
(1) The blur correction device of the camera 1A includes, in the interchangeable lens 3A, the shake sensor 39 that detects shake of the device, the blur correction unit 40 that calculates, based on the output of the shake sensor 39, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, and the blur correction drive mechanism 37 that moves the blur correction optical system 33, based on the output of the blur correction unit 40, in the direction that suppresses the blur amount.
It also includes, in the camera body 2A, the shake sensor 31 that detects shake of the device, the blur correction unit 21a that calculates, based on the output of the shake sensor 31, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, the blur correction drive mechanism 26 that moves the image sensor 22, which captures the image of the subject 80 on the image plane 70, in the direction that suppresses the blur amount based on the output of the blur correction unit 21a, and the CPU 21 that determines the first position and the second position on the image plane 70.
The blur correction unit 40 of the interchangeable lens 3A calculates the image blur Δy2a based on the first position and the shake detected by the shake sensor 39. The blur correction unit 40 sets the image blur V(L) assigned to the interchangeable lens 3A (for example, a sharing ratio of 50%) to 1/2 of the image blur Δy2a.
The blur correction unit 21a of the camera body 2A calculates the image blur Δy2a based on the first position and the shake detected by the shake sensor 31, and the image blur Δy2b based on the second position and the shake detected by the shake sensor 31. The blur correction unit 21a further calculates the difference d2 between the image blur Δy2a and the image blur Δy2b. The angle blur calculation unit 201 sets the image blur V(B) shared by the camera body 2A to V(L) + d2.
In this way, image blur can be appropriately suppressed at the second position determined by the CPU 21, which is other than the center of the image plane 70. This is particularly suitable when the focal length f of the interchangeable lens 3A is short (or when the angle of view is wide due to the relationship between the size of the image sensor 22 and the focal length f).
(2) In the blur correction apparatus of (1) above, the blur correction unit 40 of the interchangeable lens 3A outputs 50% of the image blur Δy2a to the blur correction drive mechanism 37, and the blur correction unit 21a of the camera body 2A outputs the remaining 50% of Δy2a and the difference d2 to the blur correction drive mechanism 26. Compared with the case where the blur correction drive mechanism 26 and the blur correction drive mechanism 37 are not used together, the movement distance of each of the blur correction drive mechanisms 26 and 37 can be kept small.
The sharing ratio determined by the CPU 21 may instead be set to 100% for image blur correction by the interchangeable lens 3A and 0% for image blur correction by the camera body 2A. In this case, the angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) shared by the interchangeable lens 3A to 100% (that is, the full Δy2a), and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) shared by the camera body 2A to d2, where d2 is the difference between the image blur Δy2a at the first position, which differs from the center of the image plane 70, and the image blur Δy2b at the second position, which also differs from the center of the image plane 70.
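The two sharing schemes above can be expressed compactly. The following Python sketch is an illustration under the assumption that the split is parameterized by a ratio r; the function and variable names are hypothetical.

def split_share(dy2a, dy2b, r=0.5):
    # dy2a: image blur at the first position of the image plane 70
    # dy2b: image blur at the second position of the image plane 70
    # r:    fraction of dy2a corrected by the interchangeable lens 3A
    d2 = dy2b - dy2a                  # difference between the two positions
    v_lens = r * dy2a                 # V(L), corrected by the interchangeable lens
    v_body = (1.0 - r) * dy2a + d2    # V(B), corrected by the camera body
    return v_lens, v_body

# r = 0.5 reproduces V(B) = V(L) + d2 of Modification 4; r = 1.0 gives
# V(L) = Δy2a and V(B) = d2, the variant described above.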
(Modification 5)
The image blur correction calculation based on the image blur V(B) of the above equations (5) and (7) may be performed by the blur correction unit 40 of the interchangeable lens 3A, and the blur correction calculation based on the image blur V(L) of the above equations (4) and (6) may be performed by the blur correction unit 21a of the camera body 2A. According to Modification 5 of the third embodiment, the position on the image plane 70 at which the image blur is calculated for correction by the interchangeable lens 3A and the position on the image plane 70 at which the image blur is calculated for correction by the camera body 2A can be swapped relative to the third embodiment and its Modification 4.
(Modification 6)
In the description of the third embodiment and its Modification 4, the content of the second embodiment was omitted. When correcting in the Y-axis direction for rotation of the camera 1A in the Pitch direction, if the focal length f cannot be regarded as sufficiently larger than yp, the same correction as in the second embodiment is also performed in the X-axis direction. The angle blur calculation unit 201 and the angle blur calculation unit 401 each perform the summation for the X axis and the Y axis, applying a positive or negative sign according to the direction of the blur.
The same applies to correction in the X-axis direction when the camera 1A rotates in the Yaw direction. That is, when correcting in the X-axis direction for rotation of the camera 1A in the Yaw direction, if the focal length f cannot be regarded as sufficiently larger than xp, the correction is likewise performed in the Y-axis direction. The angle blur calculation unit 201 and the angle blur calculation unit 401 each perform the summation for the X axis and the Y axis, applying a positive or negative sign according to the direction of the blur.
(Fourth embodiment)
In the fourth embodiment, image blur correction is performed exclusively by the interchangeable lens 3A, using the camera 1A of FIG. 10. The camera 1A may be the single-lens reflex type illustrated in FIG. 10 or a mirrorless type without the mirror 24.
Alternatively, the interchangeable lens 3A and the camera body 2A may be integrated and configured as a lens-integrated camera.
<Position for calculating image blur>
The CPU 21 of the camera body 2A in the fourth embodiment determines a position on the image plane 70 where the image of the main subject is likely to be present, for example by any of the methods (1) to (4) of the first embodiment. The CPU 21 then transmits information indicating the position determined on the image plane 70 to the blur correction unit 40 of the interchangeable lens 3A.
The timing at which the CPU 21 of the camera body 2A conveys to the blur correction unit 40 the information on the position at which the image blur is to be calculated on the image plane 70 is, for example, when the CPU 21 determines that position on the image plane 70 (including when it is newly determined and when it is updated).
The CPU 21 notifies the blur correction unit 40 of the position information promptly, for example by including the position information in the regular communication between the camera body 2A and the interchangeable lens 3A, or by including it in the communication with which the camera body 2A instructs the interchangeable lens 3A to start image blur correction.
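As an illustration only, the position could be carried in a small fixed-format payload within the body-to-lens communication; the message layout, field names, and use of Python's struct module below are assumptions and not part of the publication.

import struct

def pack_blur_position(x_pixels, y_pixels, start_correction):
    # Target position on the image plane 70 in sensor pixels, plus a flag that
    # can accompany the command to start image blur correction.
    return struct.pack("<hhB", x_pixels, y_pixels, 1 if start_correction else 0)

def unpack_blur_position(payload):
    x_pixels, y_pixels, start = struct.unpack("<hhB", payload)
    return x_pixels, y_pixels, bool(start)

msg = pack_blur_position(512, -300, True)
print(unpack_blur_position(msg))   # (512, -300, True)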
The angle blur calculation unit 401 of the blur correction unit 40 calculates the image blur at the position indicated by the information received from the CPU 21, and performs image blur correction based on this image blur.
When the position at which the image blur is calculated is the center of the image plane 70, the expression for the image blur Δy1 in the Y-axis direction is the above equation (2), as described in the first embodiment.
When the position at which the image blur is calculated differs from the center of the image plane 70, the expression for the image blur Δy2 in the Y-axis direction is the above equation (1), as described in the first embodiment.
Note that the image blur correction in the fourth embodiment includes correction in the Y-axis direction when the camera 1A rotates in the Pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction.
The above equations (1) and (2) describe the correction in the Y-axis direction when the camera 1A rotates in the Pitch direction. When the camera 1A rotates in the Yaw direction, a corresponding correction is required in the X-axis direction.
The correction in the Y-axis direction for rotation in the Pitch direction and the correction in the X-axis direction for rotation in the Yaw direction are the same except for the direction, so the description of the X-axis correction is omitted.
When the camera 1A rotates in both the Pitch direction and the Yaw direction, image blur with respect to the X axis and the Y axis due to the two rotational motions occurs simultaneously. The image blur due to the two rotational motions is therefore summed for each of the X-axis and Y-axis directions, with positive or negative signs applied, and the correction is performed in the X-axis and Y-axis directions based on the summed image blur.
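A minimal sketch of this summation, assuming each rotation's contribution has already been converted to a signed displacement on the image plane (the names are illustrative, not from the publication):

def total_rotational_blur(pitch_blur_xy, yaw_blur_xy):
    # pitch_blur_xy, yaw_blur_xy: (x, y) signed contributions from the Pitch
    # rotation and the Yaw rotation, respectively.
    px, py = pitch_blur_xy
    yx, yy = yaw_blur_xy
    return (px + yx, py + yy)   # per-axis signed sum used for X and Y correction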
In the fourth embodiment, as in the third embodiment, the image blur calculated by the translation blur calculation unit 402 is treated as substantially constant regardless of the position on the image plane 70 (the imaging surface of the image sensor 22).
The outline of the fourth embodiment is as follows.
The angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates the image blur with the calculation position on the image plane 70 set to the position notified from the CPU 21 of the camera body 2A.
The translation blur calculation unit 402 calculates the image blur, for example, at the center of the image plane 70.
The blur correction optical system target position calculation unit 403 adds the image blur calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402 for each of the X axis and the Y axis, applying a positive or negative sign according to the direction along each axis. Then, based on the summed X-axis and Y-axis image blur, it calculates the image blur amount at the position on the image plane 70 notified from the CPU 21 of the camera body 2A.
According to the fourth embodiment described above, the following operational effects are obtained.
(1) The blur correction apparatus includes the camera body 2A, which has the image sensor 22 that captures the subject image formed on the image plane 70 by the interchangeable lens 3A, the CPU 21 that determines a position on the image plane 70, and the CPU 21 that transmits information on the determined position to the interchangeable lens 3A; and the interchangeable lens 3A, which has the blur correction optical system 33 for blur correction, the blur correction unit 40 that receives the position information from the camera body 2A, the blur correction unit 40 that calculates the image blur Δy2 based on the position received from the camera body 2A and the shake detected by the shake sensor 39, and the blur correction drive mechanism 37 that moves the blur correction optical system 33 in a direction that suppresses the image blur Δy2. This makes it possible, for example, to appropriately suppress image blur at a position other than the center of the image plane 70 determined by the CPU 21 of the camera body 2A. It is particularly suitable when the focal length f of the interchangeable lens 3A is short (or when the angle of view is wide due to the relationship between the size of the image sensor 22 and the focal length f).
(2) The blur correction unit 40 of the interchangeable lens 3A calculates the image blur Δy2 from the output of the shake sensor 39 and the focal length f of the interchangeable lens 3A. As a result, the image blur Δy2 can be calculated appropriately at a position other than the center of the image plane 70, and image blur can be appropriately suppressed based on this Δy2.
(Fifth embodiment)
The fifth embodiment uses the camera 1A of FIG. 10, as in the fourth embodiment.
Image blur correction in the fifth embodiment is performed exclusively by operating the blur correction drive mechanism 37 of the interchangeable lens 3A, but it differs from the fourth embodiment in that both the blur correction unit 21a of the CPU 21 of the camera body 2A and the blur correction unit 40 of the interchangeable lens 3A perform calculations.
The camera 1A may be the single-lens reflex type illustrated in FIG. 10 or a mirrorless type without the mirror 24.
Alternatively, the interchangeable lens 3A and the camera body 2A may be integrated and configured as a lens-integrated camera.
<Position for calculating image blur>
The CPU 21 of the camera body 2A determines a position on the image plane 70 where the image of the main subject is likely to be present, for example by any of the methods (1) to (4) of the first embodiment. The CPU 21 then sets the center of the image plane 70 as the first position and sets the position determined as described above as the second position.
<Calculation on the camera body side>
The blur correction unit 21a of the CPU 21 calculates the image blur at the first position and at the second position of the image plane 70.
Specifically, the angle blur calculation unit 201 uses the detection signal from the angular velocity sensor of the shake sensor 31 about the axis parallel to the X axis (Pitch direction) to calculate the image blur in the Y-axis direction due to the rotational motion and, when necessary, the image blur in the X-axis direction. Likewise, using the detection signal from the angular velocity sensor of the shake sensor 31 about the axis parallel to the Y axis (Yaw direction), the angle blur calculation unit 201 calculates the image blur in the X-axis direction due to the rotational motion and, when necessary, the image blur in the Y-axis direction.
When the position at which the image blur is calculated is the first position, the expression for the image blur Δy1 in the Y-axis direction is the above equation (2), as described in the first embodiment.
When the position at which the image blur is calculated is the second position, which differs from the center of the image plane 70, the expression for the image blur Δy2 in the Y-axis direction is the above equation (1), as described in the first embodiment.
In the fifth embodiment, the blur correction unit 21a of the CPU 21 further calculates the ratio g between the image blur Δy1 at the first position and the image blur Δy2 at the second position by the following equation (8).
g = Δy2 / Δy1  …(8)
This g is referred to as the correction coefficient g.
The CPU 21 transmits information indicating the correction coefficient g to the blur correction unit 40 of the interchangeable lens 3A. Instead of the information indicating the ratio of Δy2 to Δy1, the CPU 21 may transmit information indicating the difference between Δy2 and Δy1 to the blur correction unit 40 of the interchangeable lens 3A.
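The body-side computation can be sketched as follows; this is an illustrative assumption (the function and field names are hypothetical), showing the choice between sending the ratio g of equation (8) and sending the difference.

def body_side_correction_info(dy1, dy2, use_ratio=True):
    # dy1: image blur at the first position (center of the image plane 70)
    # dy2: image blur at the second position determined by the CPU 21
    # Assumes dy1 is nonzero when the ratio form is used.
    if use_ratio:
        return {"kind": "ratio", "value": dy2 / dy1}       # correction coefficient g
    return {"kind": "difference", "value": dy2 - dy1}      # difference information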
The timing at which the CPU 21 of the camera body 2A conveys the information indicating the correction coefficient g to the blur correction unit 40 is, for example, when the correction coefficient g has been calculated after the CPU 21 determines the first position and the second position at which the image blur is calculated on the image plane 70 (including when they are newly determined and when they are updated).
The CPU 21 notifies the blur correction unit 40 of the information on the correction coefficient g promptly, for example by including it in the regular communication between the camera body 2A and the interchangeable lens 3A, or by including it in the communication with which the camera body 2A instructs the interchangeable lens 3A to start image blur correction.
<Calculation on the interchangeable lens side>
Like the angle blur calculation unit 201 of the blur correction unit 21a of the camera body 2A, the angle blur calculation unit 401 of the blur correction unit 40 uses the detection signal from the angular velocity sensor 39a about the axis parallel to the X axis (Pitch direction) to calculate the image blur in the Y-axis direction due to the rotational motion and, when necessary, the image blur in the X-axis direction. The angle blur calculation unit 401 also uses the detection signal from the angular velocity sensor 39a about the axis parallel to the Y axis (Yaw direction) to calculate the image blur in the X-axis direction due to the rotational motion and, when necessary, the image blur in the Y-axis direction.
<Position for calculating image blur>
The blur correction unit 40 in the fifth embodiment calculates the image blur at the same position as the first position determined by the CPU 21 of the camera body 2A, in this example the center of the image plane 70. Since the position at which the image blur is calculated is the center of the image plane 70, the expression for the image blur Δy1 in the Y-axis direction is the above equation (2), as described in the first embodiment.
The angle blur calculation unit 401 calculates the image blur Δy2 in the Y-axis direction at the second position of the image plane 70 by multiplying the image blur Δy1 in the Y-axis direction by the correction coefficient g based on the information received from the camera body 2A via the receiving unit.
When the information received from the camera body 2A indicates the difference between Δy2 and Δy1, the angle blur calculation unit 401 calculates the image blur Δy2 by adding the received difference to the image blur Δy1.
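On the lens side, the conversion from the on-center blur Δy1 to the off-center blur Δy2 can then be sketched as below, reusing the hypothetical payload format of the body-side sketch above; this is an illustration, not the publication's implementation.

def lens_side_dy2(dy1, info):
    # dy1:  image blur calculated by the lens at the first position (center)
    # info: payload from the body, e.g. {"kind": "ratio", "value": g}
    if info["kind"] == "ratio":
        return dy1 * info["value"]    # Δy2 = g · Δy1
    return dy1 + info["value"]        # Δy2 = Δy1 + (Δy2 - Δy1)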
The translation blur calculation unit 402 calculates the image blur in the X-axis direction due to the translational motion using the X-axis direction detection signal of the acceleration sensor 39b, and calculates the image blur in the Y-axis direction due to the translational motion using the Y-axis direction detection signal of the acceleration sensor 39b.
The blur correction optical system target position calculation unit 403 adds the image blur in the X-axis and Y-axis directions calculated by the angle blur calculation unit 401 to the image blur in the X-axis and Y-axis directions calculated by the translation blur calculation unit 402, thereby obtaining the image blur in the X-axis and Y-axis directions.
The blur correction optical system target position calculation unit 403 then calculates the image blur amount at the second position of the image plane 70 based on the summed X-axis and Y-axis image blur, the imaging magnification (calculated from the position of the zoom optical system 31), and the distance from the camera 1A to the subject 80 (calculated from the position of the focus optical system 32).
Because image blur correction is performed by operating the blur correction drive mechanism 37 of the interchangeable lens 3A, the blur correction optical system target position calculation unit 403 calculates a target position of the blur correction optical system 33 for moving the blur correction optical system 33 in a direction that cancels the calculated image blur amount.
The blur correction optical system target position calculation unit 403 then sends a signal indicating the target position to the blur correction drive mechanism 37 of the interchangeable lens 3A.
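A simplified sketch of deriving the drive target is given below; the single gain factor standing in for the imaging-magnification and subject-distance conversion is an assumption, since the exact relation is not reproduced here, and the names are illustrative.

def correction_target(angular_xy, translational_xy, gain=1.0):
    # angular_xy, translational_xy: signed (x, y) blur components.
    # gain: stand-in for the conversion based on imaging magnification and the
    # distance to the subject 80; a real implementation would derive it from
    # the zoom and focus optical system positions.
    total_x = angular_xy[0] + translational_xy[0]
    total_y = angular_xy[1] + translational_xy[1]
    # Move the blur correction optical system 33 so as to cancel the blur.
    return (-gain * total_x, -gain * total_y)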
Note that the image blur correction in the fifth embodiment includes correction in the Y-axis direction when the camera 1A rotates in the Pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction.
The above equations (1) and (2) describe the correction in the Y-axis direction when the camera 1A rotates in the Pitch direction. When the camera 1A rotates in the Yaw direction, a corresponding correction is required in the X-axis direction.
The correction in the Y-axis direction for rotation in the Pitch direction and the correction in the X-axis direction for rotation in the Yaw direction are the same except for the direction, so the description of the X-axis correction is omitted.
When the camera 1A rotates in both the Pitch direction and the Yaw direction, image blur with respect to the X axis and the Y axis due to the two rotational motions occurs simultaneously. The image blur due to the two rotational motions is therefore summed for each of the X-axis and Y-axis directions, with positive or negative signs applied, and the correction is performed in the X-axis and Y-axis directions based on the summed image blur.
In the fifth embodiment, as in the fourth embodiment, the image blur calculated by the translation blur calculation unit 402 is treated as substantially constant regardless of the position on the image plane 70 (the imaging surface of the image sensor 22).
The outline of the fifth embodiment is as follows.
The angle blur calculation unit 201 of the blur correction unit 21a of the camera body 2A calculates the image blurs Δy1 and Δy2 at the first position (the center of the image plane 70) and at the second position of the image plane 70.
The blur correction unit 21a calculates the correction coefficient g, which is the ratio between the image blur Δy1 at the first position and the image blur Δy2 at the second position, and transmits information indicating the correction coefficient g to the blur correction unit 40 of the interchangeable lens 3A.
The angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates the image blur at the first position of the image plane 70 (the center of the image plane 70). The angle blur calculation unit 401 further calculates the image blur at the second position of the image plane 70 by multiplying the image blur at the first position by the correction coefficient g based on the information received from the camera body 2A via the receiving unit.
The translation blur calculation unit 402 of the blur correction unit 40 calculates the image blur, for example, at the first position.
The blur correction optical system target position calculation unit 403 of the blur correction unit 40 adds the image blur at the second position and the image blur calculated by the translation blur calculation unit 402 for each of the X axis and the Y axis, applying a positive or negative sign according to the direction along each axis. Then, based on the summed X-axis and Y-axis image blur, it calculates the image blur amount at the second position of the image plane 70.
According to the fifth embodiment described above, the following operational effects are obtained.
(1) The blur correction apparatus includes the camera body 2A, which has the image sensor 22 that captures the subject image formed on the image plane 70 by the interchangeable lens 3A, the CPU 21 that determines a position on the image plane 70, the blur correction unit 21a that calculates the image blurs Δy1 and Δy2 at the first position (the center of the image plane 70) and at the second position based on the first position predetermined on the image plane 70, the second position determined by the CPU 21, and the shake detected by the shake sensor 31, and the CPU 21 that transmits to the interchangeable lens 3A information on the correction coefficient g, which is the ratio of the image blurs Δy1 and Δy2, or on their difference; and the interchangeable lens 3A, which has the blur correction optical system 33 for blur correction, the blur correction unit 40 that calculates the image blur Δy1 at the first position (the center of the image plane 70) based on the first position and the shake detected by the shake sensor 39, the blur correction unit 40 that receives the information from the camera body 2A, and the blur correction drive mechanism 37 that corrects the image blur Δy1 calculated by the blur correction unit 40 based on the received information and moves the blur correction optical system 33 in a direction that suppresses the corrected image blur. This enables the blur correction unit 40 of the interchangeable lens 3A to appropriately suppress image blur, for example at the second position determined by the CPU 21 of the camera body 2A. It is particularly suitable when the focal length f of the interchangeable lens 3A is short (or when the angle of view is wide due to the relationship between the size of the image sensor 22 and the focal length f).
(2) The blur correction unit 21a of the camera body 2A calculates the image blurs Δy1 and Δy2 from the output of the shake sensor 31 and the focal length f of the interchangeable lens 3A, and the blur correction unit 40 of the interchangeable lens 3A calculates the image blur Δy1 from the output of the shake sensor 39 and the focal length f. The blur correction unit 40 of the interchangeable lens 3A can thereby appropriately calculate the image blur Δy2 at the second position other than the center of the image plane 70, and can appropriately suppress the image blur based on this Δy2.
The fifth embodiment may be combined with Modification 4 of the third embodiment described above. Performing image blur correction by operating both the blur correction drive mechanism 26 of the camera body 2A and the blur correction drive mechanism 37 of the interchangeable lens 3A is common to Modification 4 of the third embodiment, and having both the blur correction unit 21a of the CPU 21 of the camera body 2A and the blur correction unit 40 of the interchangeable lens 3A perform calculations is common to the fifth embodiment.
For example, the CPU 21 of the camera body 2A transmits to the blur correction unit 40 of the interchangeable lens 3A (a) information on the first position at which the image blur is calculated on the image plane 70 and (b) information indicating the sharing ratio between the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A.
The blur correction unit 40 of the interchangeable lens 3A calculates the image blur at the first position of the image plane 70 and then obtains the image blur V(L) shared by the interchangeable lens 3A from the above equation (6).
Meanwhile, the blur correction unit 21a of the camera body 2A calculates the angular blur at the first position of the image plane 70 and the image blur at the second position of the image plane 70, and then obtains the image blur V(B) shared by the camera body 2A from the above equation (7).
The blur correction unit 40 of the interchangeable lens 3A calculates the target position of the blur correction optical system 33 based on the calculated image blur V(L) and the image blur calculated by the translation blur calculation unit 402, and performs image blur correction by operating the blur correction drive mechanism 37 of the interchangeable lens 3A.
The blur correction unit 21a of the camera body 2A calculates the target position of the image sensor 22 based on the calculated image blur V(B) and the image blur calculated by the translation blur calculation unit 202, and performs image blur correction by operating the blur correction drive mechanism 26 of the camera body 2A.
In each of the embodiments described above and their modifications, the image blur at the position where the blur is to be stopped is corrected. Consequently, while image blur is suppressed at the position on the image plane 70 determined by the CPU 21, image blur may remain at other positions on the image plane 70. In such a case, image restoration by image processing may be combined with the blur correction. The CPU 21 sends an instruction to the signal processing circuit 27 to execute image restoration processing that makes the image blur less noticeable in the data corresponding to those other positions in the image data generated by the signal processing circuit 27, for example by applying stronger edge enhancement processing.
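A rough sketch of such region-selective restoration is shown below, using the Pillow library as an assumed example; the choice of library, the unsharp-mask parameters, and the region handling are illustrative and not part of the publication.

from PIL import Image, ImageFilter

def sharpen_region(img, box, percent=250):
    # box: (left, upper, right, lower) region away from the blur-corrected
    # position, where residual image blur may remain.
    region = img.crop(box).filter(ImageFilter.UnsharpMask(radius=2, percent=percent))
    out = img.copy()
    out.paste(region, box)
    return out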
The disclosure of the following priority application is incorporated herein by reference:
Japanese Patent Application No. 2017-72590 (filed on March 31, 2017)
DESCRIPTION OF SYMBOLS
1, 1A ... Camera
2, 2A ... Camera body
3, 3A ... Interchangeable lens
21 ... CPU
21a, 40 ... Blur correction unit
22 ... Image sensor
26, 37 ... Blur correction drive mechanism
31, 39 ... Shake sensor
33 ... Blur correction optical system
70 ... Image plane
80 ... Subject

Claims (10)

  1.  An interchangeable lens that can be attached to and detached from a camera body including an image sensor that captures a subject image, the interchangeable lens comprising:
     an imaging optical system that forms the subject image on an image plane;
     an input unit to which a blur amount detected by at least one of the interchangeable lens and the camera body is input;
     a receiving unit that receives information used to calculate an off-axis correction amount for correcting blur off the optical axis in the image plane; and
     a drive unit that drives at least a part of a movable portion of the imaging optical system in a plane orthogonal to the optical axis based on at least the information and the blur amount.
  2.  The interchangeable lens according to claim 1, wherein the information represents a position off the optical axis in the image plane.
  3.  The interchangeable lens according to claim 1 or 2, wherein the receiving unit receives the information when the position off the optical axis set in the camera body is changed.
  4.  The interchangeable lens according to claim 1 or 2, wherein the receiving unit receives the information periodically.
  5.  The interchangeable lens according to any one of claims 1 to 4, wherein the information represents at least one of a difference and a ratio between the off-axis correction amount and an on-axis correction amount for correcting the blur on the optical axis in the image plane.
  6.  The interchangeable lens according to claim 5, wherein the drive unit drives the movable portion based on at least one of the on-axis correction amount and the off-axis correction amount.
  7.  A camera body to which an imaging optical system can be attached and from which it can be detached, the camera body comprising:
     an image sensor that captures a subject image formed on an image plane by the imaging optical system; and
     a transmitting unit that transmits, to the imaging optical system, information used to calculate an off-axis correction amount for correcting blur off the optical axis in the image plane.
  8.  A camera body to which an imaging optical system can be attached and from which it can be detached, the camera body comprising:
     an image sensor that captures a subject image formed on an image plane by the imaging optical system;
     an input unit to which a blur amount of at least one of the imaging optical system and the image sensor is input; and
     a calculation unit that calculates, based on the blur amount, an off-axis correction amount for correcting blur off the optical axis in the image plane.
  9.  The camera body according to claim 8, wherein the calculation unit calculates at least one of a difference and a ratio between an on-axis correction amount for correcting blur at a position on the optical axis in the image plane and the off-axis correction amount.
  10.  The camera body according to claim 9, further comprising a drive unit that drives the image sensor so as to change its position with respect to the optical axis, wherein the drive unit drives the image sensor so as to correct blur at a position off the optical axis in the image plane based on at least one of the difference and the ratio.
PCT/JP2018/011477 2017-03-31 2018-03-22 Interchangeable lens and camera body WO2018180909A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-072590 2017-03-31
JP2017072590A JP2020095071A (en) 2017-03-31 2017-03-31 Interchangeable lens and image capturing device

Publications (1)

Publication Number Publication Date
WO2018180909A1 true WO2018180909A1 (en) 2018-10-04

Family

ID=63677062

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/011477 WO2018180909A1 (en) 2017-03-31 2018-03-22 Interchangeable lens and camera body

Country Status (2)

Country Link
JP (1) JP2020095071A (en)
WO (1) WO2018180909A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115697716A (en) 2020-05-29 2023-02-03 富士胶片株式会社 On-press developable lithographic printing plate precursor, method for producing lithographic printing plate, and lithographic printing method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015233248A (en) * 2014-06-10 2015-12-24 キヤノン株式会社 Imaging apparatus and imaging method
JP2017044876A (en) * 2015-08-27 2017-03-02 オリンパス株式会社 Imaging apparatus and image shake correction method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015233248A (en) * 2014-06-10 2015-12-24 キヤノン株式会社 Imaging apparatus and imaging method
JP2017044876A (en) * 2015-08-27 2017-03-02 オリンパス株式会社 Imaging apparatus and image shake correction method

Also Published As

Publication number Publication date
JP2020095071A (en) 2020-06-18

Similar Documents

Publication Publication Date Title
WO2018180916A1 (en) Blur correction device, replacement lens, and imaging device
JP2018173632A (en) Imaging device
US9602727B2 (en) Imaging apparatus and imaging method
WO2020088133A1 (en) Image processing method and apparatus, electronic device and computer-readable storage medium
JP6685843B2 (en) Imaging device
WO2014156731A1 (en) Image-capturing device, solid-state image-capturing element, camera module, electronic devi ce, and image-capturing method
JP2020042078A (en) Optical apparatus
WO2017120771A1 (en) Depth information acquisition method and apparatus, and image collection device
US10616503B2 (en) Communication apparatus and optical device thereof
JP4832013B2 (en) Image blur correction device
JP6543946B2 (en) Shake correction device, camera and electronic device
US20190222767A1 (en) Shake correction device, imaging apparatus, and shake correction method
JP2023041748A (en) Camera, lens device, control method, and computer program
JP2019145958A (en) Imaging apparatus, control method of the same, and program
WO2019151030A1 (en) Imaging device, solid-state imaging element, camera module, drive control unit, and imaging method
JP2019164338A (en) Camera, lens device, control method, and computer program
WO2018180909A1 (en) Interchangeable lens and camera body
JP2017044876A (en) Imaging apparatus and image shake correction method
WO2018180908A1 (en) Blur correction device, replacement lens, and imaging device
US20150294442A1 (en) Camera system and imaging method
JP7484866B2 (en) Image blur correction device, interchangeable lens, imaging device, and image blur correction method
JP2006259114A (en) Digital camera
US11770614B2 (en) Image processing device and method, and program
JP6943323B2 (en) interchangeable lens
JP7210256B2 (en) Imaging device and display control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18776246

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18776246

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP