WO2018180909A1 - Interchangeable lens and camera body - Google Patents

Interchangeable lens and camera body

Info

Publication number
WO2018180909A1
WO2018180909A1 PCT/JP2018/011477 JP2018011477W WO2018180909A1 WO 2018180909 A1 WO2018180909 A1 WO 2018180909A1 JP 2018011477 W JP2018011477 W JP 2018011477W WO 2018180909 A1 WO2018180909 A1 WO 2018180909A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
blur
image blur
correction
interchangeable lens
Application number
PCT/JP2018/011477
Other languages
English (en)
Japanese (ja)
Inventor
英志 三家本
豪 松本
大樹 中島
Original Assignee
株式会社ニコン
Application filed by 株式会社ニコン
Publication of WO2018180909A1


Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • G03B17/12Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets
    • G03B17/14Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets interchangeably
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B5/00Adjustment of optical system relative to image or object surface other than for focusing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present invention relates to an interchangeable lens and a camera body.
  • A technique for suppressing image blur due to camera shake is known (see Patent Document 1). However, such a technique corrects only the image blur at the center of the screen.
  • According to one aspect, an interchangeable lens that can be attached to and detached from a camera body including an image sensor that captures a subject image comprises: an imaging optical system that forms the subject image on an image plane; an input unit to which a blur amount detected by at least one of the interchangeable lens and the camera body is input; a receiving unit that receives information used to calculate an off-axis correction amount for correcting off-axis blur in the image plane; and a drive unit that drives at least a movable part of the imaging optical system in a plane orthogonal to the optical axis based on at least the information and the blur amount.
  • the camera body to which the imaging optical system can be attached and detached includes an imaging element that captures a subject image formed on the image plane by the imaging optical system, and an optical axis outside the image plane.
  • the camera body to which the imaging optical system can be attached and detached includes an imaging element that captures a subject image formed on an image plane by the imaging optical system, and the imaging optical system or the imaging element.
  • an imaging apparatus equipped with an image blur correction apparatus will be described with reference to the drawings.
  • In the present embodiment, an interchangeable-lens digital camera (hereinafter referred to as the camera 1) is described as an example of the imaging apparatus.
  • the camera 1 may be a single-lens reflex type having a mirror 24 in the camera body 2, or a type not provided with the mirror 24.
  • the camera 1 may be configured as a lens integrated type in which the interchangeable lens 3 and the camera body 2 are integrated.
  • the imaging apparatus is not limited to the camera 1 and may be a lens barrel provided with an imaging sensor, a smartphone provided with an imaging function, or the like.
  • FIG. 1 is a diagram illustrating a main configuration of the camera 1.
  • the camera 1 includes a camera body 2 and an interchangeable lens 3.
  • the interchangeable lens 3 is attached to the camera body 2 via a mount unit (not shown).
  • the camera body 2 and the interchangeable lens 3 are electrically connected, and communication between the camera body 2 and the interchangeable lens 3 becomes possible. Communication between the camera body 2 and the interchangeable lens 3 may be performed by wireless communication.
  • the light from the subject enters in the negative direction of the Z axis.
  • the front direction perpendicular to the Z axis is defined as the X axis plus direction
  • the upward direction perpendicular to the Z axis and the X axis is defined as the Y axis plus direction.
  • In each figure, the coordinate axes are displayed so that the orientation of the figure can be understood with reference to the coordinate axes of FIG. 1.
  • the interchangeable lens 3 has an imaging optical system and forms a subject image on the imaging surface of the imaging element 22 provided in the camera body 2.
  • the imaging optical system includes a zoom optical system 31, a focus (focus adjustment) optical system 32, a shake correction optical system 33, and a diaphragm 34.
  • the interchangeable lens 3 further includes a zoom drive mechanism 35, a focus drive mechanism 36, a shake correction drive mechanism 37, a diaphragm drive mechanism 38, and a shake sensor (motion detection unit, shake detection unit) 39.
  • the zoom drive mechanism 35 adjusts the magnification of the imaging optical system by moving the zoom optical system 31 forward and backward in the direction of the optical axis L1 based on a signal output from the CPU 21 of the camera body 2.
  • the signal output from the CPU 21 includes information indicating the moving direction, moving amount, moving speed, and the like of the zoom optical system 31.
  • the focus drive mechanism 36 adjusts the focus of the imaging optical system by moving the focus optical system 32 forward and backward in the direction of the optical axis L1 based on a signal output from the CPU 21 of the camera body 2.
  • the signal output from the CPU 21 at the time of focus adjustment includes information indicating the moving direction, moving amount, moving speed, and the like of the focus optical system 32.
  • the diaphragm driving mechanism 38 controls the aperture diameter of the diaphragm 34 based on a signal output from the CPU 21 of the camera body 2.
  • based on a signal output from the CPU 21 of the camera body 2, the blur correction drive mechanism 37 moves the blur correction optical system 33 back and forth, within a plane that intersects the optical axis L1, in a direction that cancels blurring of the subject image on the imaging surface of the imaging element 22 (referred to as image blur), thereby suppressing the image blur.
  • the signal output from the CPU 21 includes information indicating the moving direction, moving amount, moving speed, and the like of the blur correction optical system 33.
  • the shake sensor 39 detects the shake of the camera 1 when the camera 1 swings due to hand shake or the like.
  • the shake sensor 39 includes an angular velocity sensor 39a and an acceleration sensor 39b. It is assumed that image blur is caused by camera shake.
  • the angular velocity sensor 39a detects an angular velocity generated by the rotational movement of the camera 1.
  • the angular velocity sensor 39a detects, for example, rotation around each axis of an axis parallel to the X axis, an axis parallel to the Y axis, and an axis parallel to the Z axis, and sends a detection signal to the CPU 21 of the camera body 2.
  • the angular velocity sensor 39a is also referred to as a gyro sensor.
  • the acceleration sensor 39b detects acceleration generated by the translational motion of the camera 1.
  • the acceleration sensor 39b detects, for example, accelerations in an axis direction parallel to the X axis, an axis parallel to the Y axis, and an axis direction parallel to the Z axis, and sends a detection signal to the CPU 21 of the camera body 2.
  • the acceleration sensor 39b is also referred to as a G sensor.
  • In the present embodiment, the case where the shake sensor 39 is provided in the interchangeable lens 3 is illustrated, but the shake sensor 39 may instead be provided in the camera body 2. Further, the shake sensor 39 may be provided in both the camera body 2 and the interchangeable lens 3.
  • the camera body 2 includes a CPU 21, an image sensor 22, a shutter 23, a mirror 24, an AF sensor 25, a shake correction drive mechanism 26, a signal processing circuit 27, a memory 28, an operation member 29, and a liquid crystal display unit 30.
  • the CPU 21 includes a CPU, a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and controls each unit of the camera 1 based on a control program.
  • the CPU 21 includes a shake correction unit (correction amount calculation unit) 21a.
  • the blur correction unit 21a calculates the image motion accompanying the rotational movement of the camera 1 and the translational movement of the camera 1.
  • based on the calculation result by the blur correction unit 21a, the CPU 21 moves the blur correction optical system 33 by the blur correction drive mechanism (blur correction drive unit) 37, or moves the imaging element 22 by the blur correction drive mechanism (blur correction drive unit) 26.
  • image blurring is suppressed by moving the blur correction optical system 33 of the interchangeable lens 3 constituting the imaging optical system or the imaging element 22.
  • This suppression of image blur is also referred to as image blur correction. Details of the image blur correction will be described later.
  • the imaging element 22 receives the light beam that has passed through the imaging optical system at the imaging surface, and photoelectrically converts (captures) the subject image. Electric charges are generated according to the amount of received light in each of the plurality of pixels arranged on the imaging surface of the imaging element 22 by photoelectric conversion. A signal based on the generated charge is read from the image sensor 22 and sent to the signal processing circuit 27.
  • the shutter 23 controls the exposure time of the image sensor 22.
  • the exposure time of the image sensor 22 can be controlled by a method for controlling the charge accumulation time in the image sensor 22 (so-called electronic shutter control).
  • the shutter 23 is opened and closed by a shutter drive unit (not shown).
  • a semi-transmissive quick return mirror (hereinafter referred to as a mirror) 24 is driven by a mirror driving unit (not shown) and moves between a down position on the optical path (illustrated in FIG. 1) and an up position retracted outside the optical path. For example, before release, the subject light is reflected by the mirror 24 in the down position toward a finder unit (not shown) provided above (Y-axis plus direction). Part of the subject light transmitted through the mirror 24 is bent downward (Y-axis minus direction) by the sub mirror 24a and guided to the AF sensor 25. Immediately after the release switch is pressed, the mirror 24 is rotated to the up position, so that the subject light is guided to the image sensor 22 via the shutter 23.
  • the AF sensor 25 detects the focus adjustment state of the interchangeable lens 3 by the imaging optical system.
  • the CPU 21 performs a known phase difference type focus detection calculation using a detection signal from the AF sensor 25.
  • the CPU 21 obtains the defocus amount by the imaging optical system by this calculation, and calculates the movement amount of the focus optical system 32 based on the defocus amount.
  • the CPU 21 transmits the calculated movement amount of the focus optical system 32 to the focus drive mechanism 36 together with the movement direction and movement speed.
  • the blur correction drive mechanism 26 moves the image sensor 22 back and forth, within a plane that intersects the optical axis L1, in a direction that cancels image blur, thereby reducing the image blur.
  • the signal output from the CPU 21 includes information indicating the moving direction, moving amount, moving speed, and the like of the image sensor 22.
  • the signal processing circuit 27 generates image data related to the subject image based on the image signal read from the image sensor 22.
  • the signal processing circuit 27 performs predetermined image processing on the generated image data.
  • the image processing includes known image processing such as gradation conversion processing, color interpolation processing, contour enhancement processing, and white balance processing.
  • the memory 28 includes, for example, an EEPROM (Electrically-Erasable-Programmable-Read-Only Memory), a flash memory, or the like.
  • In the memory 28, adjustment value information such as a detection gain set in the shake sensor 39 is recorded. Data recording to and data reading from the memory 28 are performed by the CPU 21.
  • the operation member 29 includes a release button, a recording button, a live view button, various setting switches, and the like, and outputs an operation signal corresponding to each operation to the CPU 21.
  • the liquid crystal display unit 30 displays an image based on image data, information relating to shooting such as a shutter speed and an aperture value, a menu operation screen, and the like.
  • the recording medium 50 is composed of, for example, a memory card that can be attached to and detached from the camera body 2. Image data, audio data, and the like are recorded on the recording medium 50. Recording of data on the recording medium 50 and reading of data from the recording medium 50 are performed by the CPU 21.
  • the camera 1 is configured to be able to perform image blur correction by operating the blur correction drive mechanism 37 of the interchangeable lens 3 and image blur correction by operating the blur correction drive mechanism 26 of the camera body 2.
  • the CPU 21 operates one of the blur correction drive mechanisms. For example, when an interchangeable lens 3 having the blur correction drive mechanism 37 is attached to the camera body 2, the CPU 21 operates the blur correction drive mechanism 37 of the interchangeable lens 3 to perform image blur correction; when an interchangeable lens 3 that does not include the blur correction drive mechanism 37 is attached to the camera body 2, the CPU 21 operates the blur correction drive mechanism 26 of the camera body 2 to perform image blur correction. Note that, as in a third embodiment to be described later, the shake correction drive mechanisms of the interchangeable lens 3 and the camera body 2 may be operated simultaneously.
  • image blur generated by the camera 1 is classified into image blur (also referred to as angle blur) accompanying the rotational movement of the camera 1 and image blur accompanying translational movement of the camera 1 (also referred to as translation blur).
  • the blur correction unit 21 a calculates an image blur due to the rotational movement of the camera 1 and an image blur due to the translational movement of the camera 1.
  • FIG. 2 is a diagram illustrating the blur correction unit 21a.
  • the shake correction unit 21a includes an angle shake calculation unit 201, a translational shake calculation unit 202, and a shake correction optical system target position calculation unit (selection unit) 203.
  • the angle blur calculation unit 201 calculates the image blur in the Y-axis direction due to the rotational motion using the detection signal around the axis parallel to the X-axis (Pitch direction) by the angular velocity sensor 39a. Further, the angle blur calculation unit 201 calculates an image blur in the X-axis direction due to the rotational motion using a detection signal around the axis parallel to the Y-axis (Yaw direction) by the angular velocity sensor 39a.
  • the translation blur calculation unit 202 calculates the image blur in the X-axis direction due to the translational motion using the detection signal in the X-axis direction by the acceleration sensor 39b.
  • the translation blur calculation unit 202 calculates the image blur in the Y-axis direction due to the translational motion using the detection signal in the Y-axis direction from the acceleration sensor 39b.
  • the blur correction optical system target position calculation unit 203 adds, for each axis, the image blur in the X-axis and Y-axis directions calculated by the angle blur calculation unit 201 and the image blur in the X-axis and Y-axis directions calculated by the translation blur calculation unit 202, thereby obtaining the combined image blur in the X-axis and Y-axis directions. For example, when the direction of the image blur calculated by the angle blur calculation unit 201 and the direction of the image blur calculated by the translation blur calculation unit 202 are the same along a certain axis, the image blur increases due to the addition; when the directions of the two calculated image blurs differ, the image blur is reduced by the addition. In this way, the addition is performed with positive and negative signs according to the image blur direction on each axis (see the sketch below).
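  • As an illustration of the signed addition just described (not part of the patent text), the following minimal sketch combines hypothetical per-axis blur values given in millimetres; the opposite signs on the X axis partially cancel, while the matching signs on the Y axis reinforce.

```python
def combine_per_axis(angle_blur, translation_blur):
    """Signed per-axis addition of the two blur components: each value carries
    a positive or negative sign according to the blur direction on its axis,
    so same-direction blurs reinforce and opposite-direction blurs cancel."""
    ax, ay = angle_blur          # (X, Y) blur from the angle blur calculation
    tx, ty = translation_blur    # (X, Y) blur from the translation blur calculation
    return ax + tx, ay + ty

print(combine_per_axis((0.20, -0.05), (-0.08, -0.02)))  # -> (0.12, -0.07)
```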
  • the blur correction optical system target position calculation unit 203 then calculates the image blur amount at a predetermined position on the image plane (the imaging plane of the image sensor 22) based on the added image blur in the X-axis and Y-axis directions, the photographing magnification (calculated from the position of the zoom optical system 31), and the distance from the camera 1 to the subject 80 (calculated from the position of the focus optical system 32).
  • when the blur correction drive mechanism 37 of the interchangeable lens 3 is operated to perform image blur correction, the blur correction optical system target position calculation unit 203 calculates the target position of the blur correction optical system 33 for moving it in a direction that cancels the calculated image blur amount.
  • when the blur correction drive mechanism 26 of the camera body 2 is operated to perform image blur correction, the blur correction optical system target position calculation unit 203 calculates the target position of the imaging element 22 for moving it in a direction that cancels the calculated image blur amount.
  • the shake correction optical system target position calculation unit 203 sends a signal indicating the target position to the shake correction drive mechanism 37 of the interchangeable lens 3 or the shake correction drive mechanism 26 of the camera body 2.
  • the shake correction optical system target position calculation unit 203 can also send a signal indicating the target position to the interchangeable lens 3 and the shake correction drive mechanism of the camera body 2, respectively.
  • when the allowable range of image blur correction is exceeded, the shake correction drive mechanism 37 of the interchangeable lens 3 may notify the CPU 21 of the camera body 2 to that effect.
  • In that case, the CPU 21 can take measures such as issuing an alarm notifying that the allowable range of image blur correction has been exceeded.
  • FIG. 3 is a schematic diagram for explaining the angular velocity detection direction by the angular velocity sensor 39a and the image blur on the image plane 70 (the imaging plane of the imaging device 22).
  • In FIG. 3, the point where the image plane 70 and the optical axis L1 of the interchangeable lens 3 intersect is the origin of the coordinates, the optical axis L1 of the interchangeable lens 3 is the Z axis, and the image plane 70 is represented as the XY plane.
  • the optical axis L1 intersects the center of the imaging surface.
  • the interchangeable lens 3 and the subject 80 are positioned in the Z axis plus direction with respect to the image plane 70.
  • the angular velocity sensor 39a detects, for example, the rotation angle θ around the axis (small-x axis) parallel to the X axis (Pitch direction).
  • the symbol f in FIGS. 3 and 4 represents the focal length.
  • FIG. 4 is a diagram for explaining the image blur Δy2 in FIG. 3, and represents the YZ plane of FIG. 3.
  • the image blur Δy2 is expressed by the following formula (1).
  • Δy2 = f·tan(θ + tan⁻¹(yp/f)) − yp … (1)
  • the rotation angle in the pitch direction (which represents a camera shake angle and is generally about 0.5 degrees) is θ.
  • the symbol f in FIGS. 3 and 4 represents the focal length of the interchangeable lens 3.
  • Next, the image blur Δy1 of the image of the subject 80 positioned at the coordinates (0, 0), the center of the image plane 70, before the camera 1 shakes will be described. It is assumed that the rotation angle of the interchangeable lens 3 in the pitch direction is the same as above.
  • the image of the subject 80 located at the coordinate (0, 0) on the image plane 70 before the shake moves in the Y-axis minus direction after the shake.
  • the position of the image of the moved subject 80 is the coordinates (0, −Δy1).
  • the image blur Δy1 is expressed by the following formula (2).
  • Δy1 = f·tanθ … (2)
  • If the focal length f is sufficiently larger than yp, since the rotation angle θ is generally about 0.5 degrees, it can be considered that Δy1 ≈ Δy2.
  • In that case, whether the position of the image of the subject 80 on the image plane 70 is at the center of the image plane 70 (in this example, the origin) or at a position away from the center, in other words, even if the distance from the optical axis L1 differs, the image blur can be regarded as almost the same. This means that the position on the image plane 70 at which the image blur is calculated can be chosen anywhere.
  • Therefore, image blur can be suppressed both for the image of the subject 80 positioned at the center of the image plane 70 and for the image of a subject 80 at a position away from the center of the image plane 70.
  • However, when the focal length f is not sufficiently larger than yp, as in the case where the interchangeable lens 3 is a wide-angle lens, Δy1 ≠ Δy2. It is therefore necessary to choose a specific position on the image plane 70 at which to calculate the image blur. For example, if image blur correction is performed based on the image blur calculated at the center of the image plane 70, the image blur of the subject 80 located at the center of the image plane 70 can be suppressed, but for the image of a subject 80 at a position far from the center of the image plane 70, the image blur corresponding to the difference between Δy2 and Δy1 remains unsuppressed. The difference between Δy2 and Δy1 increases as the position at which the image blur is calculated moves toward the periphery of the image plane 70, that is, as the image height increases. A numerical sketch of this behavior is given below.
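  • The following sketch (not part of the patent text) evaluates formulas (1) and (2) for an assumed camera-shake angle of 0.5 degrees and an assumed image height yp of 12 mm; the focal lengths of 200 mm and 20 mm are likewise only illustrative. For the long focal length, Δy1 and Δy2 are nearly equal, whereas for the wide-angle case the off-axis blur Δy2 is noticeably larger than the on-axis blur Δy1.

```python
import math

def center_blur(f_mm: float, theta_rad: float) -> float:
    """On-axis image blur, formula (2): dy1 = f * tan(theta)."""
    return f_mm * math.tan(theta_rad)

def offaxis_blur(f_mm: float, theta_rad: float, yp_mm: float) -> float:
    """Off-axis image blur at image height yp, formula (1):
    dy2 = f * tan(theta + atan(yp / f)) - yp."""
    return f_mm * math.tan(theta_rad + math.atan(yp_mm / f_mm)) - yp_mm

theta = math.radians(0.5)   # typical camera-shake angle quoted in the text
yp = 12.0                   # assumed off-axis image height in mm (illustrative)

for f in (200.0, 20.0):     # assumed telephoto and wide-angle focal lengths
    dy1 = center_blur(f, theta)
    dy2 = offaxis_blur(f, theta, yp)
    print(f"f={f:5.0f} mm: dy1={dy1:.3f} mm, dy2={dy2:.3f} mm, "
          f"difference={dy2 - dy1:.3f} mm")
```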
  • the CPU 21 in the first embodiment determines a position where the image of the main subject is highly likely to exist on the image plane 70, as will be described later.
  • the angle blur calculation unit 201 calculates an image blur at a position determined by the CPU 21 and performs an image blur correction based on the image blur.
  • the CPU 21 selects one of the following methods (1) to (3) in order to determine a position where image blur is calculated.
  • the CPU 21 resets (updates) the position for calculating the image blur, for example, when the amount of motion of the camera 1 associated with the composition change is detected.
  • the shake sensor 39 also functions as a motion amount detection unit.
  • FIG. 5 is a diagram illustrating a focus area formed on the imaging screen 90.
  • the focus area is an area where the AF sensor 25 detects the focus adjustment state, and is also referred to as a focus detection area, a distance measuring point, and an autofocus (AF) point.
  • eleven focus areas 25P-1 to 25P-11 are provided in the imaging screen 90 in advance.
  • the CPU 21 can obtain the defocus amount in 11 focus areas.
  • the number of focus areas 25P-1 to 25P-11 is an example, and the number may be larger or smaller than 11.
  • In the first method, the CPU 21 determines the position for calculating the image blur on the image plane 70 as the position corresponding to the selected focus area. Then, the angle blur calculation unit 201 calculates the image blur at the position determined by the CPU 21, and image blur correction is performed based on that image blur.
  • The reason why the image blur calculation position on the image plane 70 is set to the position corresponding to the selected focus area is that there is a high possibility that the main subject is present at the position for which the defocus amount for focus adjustment is obtained.
  • the CPU 21 may select the focus area based on the operation signal from the operation member 29, or the CPU 21 may select the focus area corresponding to the subject 80 close to the camera 1.
  • the CPU 21 can select a focus area corresponding to the subject 80 close to the camera 1 based on the position of the focus optical system 32. Further, the CPU 21 may select a focus area corresponding to the subject 80 having a high contrast among the images of the subject 80, or may select a focus area corresponding to the subject 80 having a high luminance value among the images of the subject 80.
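  • As a rough illustration of these selection criteria (assumed data layout, not part of the patent text), a focus area could be picked from several candidates by nearest subject distance, highest contrast, or highest luminance:

```python
def select_focus_area(candidates, criterion="nearest"):
    """Pick one focus area from several candidates. Each candidate is assumed
    to be a dict with 'distance_m', 'contrast', and 'luminance' values."""
    key = {
        "nearest":   lambda c: c["distance_m"],   # subject closest to the camera
        "contrast":  lambda c: -c["contrast"],    # subject with highest contrast
        "luminance": lambda c: -c["luminance"],   # subject with highest luminance
    }[criterion]
    return min(candidates, key=key)

areas = [
    {"id": "25P-1", "distance_m": 2.0, "contrast": 0.4, "luminance": 120},
    {"id": "25P-2", "distance_m": 5.5, "contrast": 0.7, "luminance": 200},
]
print(select_focus_area(areas)["id"])              # -> 25P-1 (nearest subject)
print(select_focus_area(areas, "contrast")["id"])  # -> 25P-2
```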
  • the second method is to calculate image blur at the position of the object (subject 80).
  • the CPU 21 recognizes an object appearing as the subject 80 in the live view image by a known object recognition process, and sets the position of the object (subject 80) in the live view image as the position of the main subject. Then, the position at which image blur is calculated on the image plane 70 is determined as a position corresponding to the main subject.
  • the angle blur calculation unit 201 calculates an image blur at a position determined by the CPU 21 and performs an image blur correction based on the image blur.
  • the live view image is a monitor image acquired at a predetermined interval (for example, 60 fps) by the image sensor 22 before the main imaging is performed.
  • the CPU 21 maintains the state where the mirror 24 is rotated to the up position, and starts the acquisition of the live view image by the image sensor 22.
  • the CPU 21 can also display the live view image on the liquid crystal display unit 30.
  • the CPU 21 can also track the moving object (subject 80) by sequentially updating the position of the main subject based on each frame of the live view image.
  • the angle blur calculation unit 201 sequentially calculates the image blur at the positions sequentially updated by the CPU 21, so that image blur correction is performed on the moving object (subject 80) while the live view image is being acquired. Further, even when the camera 1 is panned, the CPU 21 can track the moving object (subject 80) by sequentially updating the position of the main subject in each frame of the live view image.
  • the CPU 21 may select the second method and start the object recognition processing.
  • the object recognition target may be switched according to an imaging scene mode such as “landscape”, “cooking”, “flower”, “animal” set in the camera 1.
  • the third method is to calculate image blur at the position of the face (subject 80).
  • the CPU 21 recognizes the face shown as the subject 80 in the live view image by a known face recognition process, and sets the position of the face in the live view image as the position of the main subject. Then, the position at which image blur is calculated on the image plane 70 is determined as a position corresponding to the main subject.
  • the angle blur calculation unit 201 calculates an image blur at a position determined by the CPU 21 and performs an image blur correction based on the image blur.
  • For example, when the live view button of the operation member 29 is operated, the CPU 21 keeps the mirror 24 rotated to the up position and starts acquisition of the live view image with the image sensor 22.
  • the CPU 21 can also track the moving face (subject 80) by sequentially updating the position of the main subject based on each frame of the live view image, as in (2) above.
  • the angle blur calculation unit 201 performs image blur correction on the moving face (subject 80) when acquiring the live view image by sequentially calculating the image blur at the position sequentially updated by the CPU 21.
  • the CPU 21 may select the third method and start the face recognition process when the imaging scene mode of the camera 1 is set to “portrait”, for example.
  • <When there are multiple positions at which to calculate image blur> The above description dealt with the case where only one position at which the image blur is calculated on the image plane 70 is determined.
  • However, a plurality of positions may be candidates for the position at which the image blur is calculated: specifically, when a plurality of focus areas are selected in (1) above, when a plurality of objects (subjects 80) are recognized in (2) above, or when a plurality of faces are recognized in (3) above. In such a case, the CPU 21 selects the following method (4) or (5).
  • The fourth method determines one representative position from a plurality of candidates. FIG. 6 is a diagram illustrating an example in which one representative position is determined from a plurality of candidates. For example, on the image plane 70, three points are candidates: a position P-1 corresponding to the focus area 25P-1 in FIG. 5, a position P-2 corresponding to the focus area 25P-2, and a position P-4 corresponding to the focus area 25P-4.
  • the CPU 21 averages the absolute values of the distances between the plurality of candidate positions and the X axis (FIG. 3) and the absolute values of the distances between the plurality of candidate positions and the Y axis (FIG. 3) to obtain an average position P, and sets this position P as the representative position. The position at which the image blur is calculated on the image plane 70 is then determined as the representative position P. In this way, the representative position P is obtained by averaging the absolute values of the distances along the axes (X axis, Y axis) of the image plane 70.
  • the angle blur calculation unit 201 calculates image blur at the representative position P, and performs image blur correction based on the image blur.
  • the case where a plurality of focus areas are selected is exemplified, but the same applies to the case where a plurality of objects (subject 80) are recognized or a plurality of faces are recognized.
  • the CPU 21 determines the representative position P as described above based on the positions of the recognized objects and the positions of the recognized faces.
  • the angle blur calculation unit 201 calculates image blur for the representative position P determined by the CPU 21, and performs image blur correction based on the image blur.
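  • A minimal sketch of this averaging of absolute distances (candidate coordinates are assumed values in millimetres, not taken from the patent): the representative position P is formed from the mean of |x| over the candidates and the mean of |y| over the candidates.

```python
from statistics import mean

def representative_position(candidates):
    """Method (4): average the absolute distances of the candidate positions
    from the Y axis (|x|) and from the X axis (|y|) on the image plane."""
    return (mean(abs(x) for x, _ in candidates),
            mean(abs(y) for _, y in candidates))

# hypothetical candidate positions for focus areas 25P-1, 25P-2 and 25P-4
candidates = [(-10.0, 5.0), (0.0, 5.0), (-5.0, -3.0)]
print(representative_position(candidates))  # -> (5.0, 4.333...)
```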
  • the fifth method is to calculate one image blur based on a plurality of image blurs. Referring to the example of FIG. 6, on the image plane 70, three points are candidates: the position P-1 corresponding to the focus area 25P-1 in FIG. 5, the position P-2 corresponding to the focus area 25P-2, and the position P-4 corresponding to the focus area 25P-4.
  • the CPU 21 determines a plurality of positions as positions where image blur is calculated on the image plane 70.
  • the angle blur calculation unit 201 calculates image blur at the position P-1, the position P-2, and the position P-4 on the image plane 70, respectively.
  • the angle blur calculation unit 201 further obtains an average of the plurality of calculated image blurs, and performs image blur correction based on the average value of the image blurs.
  • the average value of image blur is obtained by, for example, a simple average, but may be obtained by a weighted average.
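  • A minimal sketch of method (5) under the same assumptions as the earlier sketches (formula (1) for the per-position blur, illustrative coordinates and focal length): the blur is computed at each candidate position and then averaged, with an optional weight vector for the weighted-average variant.

```python
import math

def blur_at(f_mm, theta_rad, yp_mm):
    """Off-axis Y-direction blur at image height yp, formula (1)."""
    return f_mm * math.tan(theta_rad + math.atan(yp_mm / f_mm)) - yp_mm

def averaged_blur(f_mm, theta_rad, positions, weights=None):
    """Method (5): average the image blurs computed at several candidate
    positions; weights=None gives the simple average, otherwise a
    weighted average is used."""
    blurs = [blur_at(f_mm, theta_rad, yp) for _, yp in positions]
    if weights is None:
        return sum(blurs) / len(blurs)
    return sum(w * b for w, b in zip(weights, blurs)) / sum(weights)

positions = [(-10.0, 5.0), (0.0, 5.0), (-5.0, -3.0)]  # hypothetical P-1, P-2, P-4
print(averaged_blur(20.0, math.radians(0.5), positions))
print(averaged_blur(20.0, math.radians(0.5), positions, weights=[2, 1, 1]))
```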
  • the case where a plurality of focus areas are selected is illustrated, but the same applies to the case where a plurality of objects (subject 80) are recognized or a plurality of faces are recognized.
  • the CPU 21 determines the positions of the plurality of recognized objects and the positions of the plurality of recognized faces as positions at which image blur is calculated on the image plane 70, respectively.
  • the angle blur calculation unit 201 calculates an image blur for each position on the image plane 70.
  • the angle blur calculation unit 201 further obtains an average of the plurality of calculated image blurs, and performs image blur correction based on the average value of the image blurs.
  • Alternatively, one subject may be selected from a plurality of subjects.
  • For example, a subject with a large image blur may be selected from the plurality of subjects.
  • A subject close to the camera 1, whose image blur tends to be large, may be selected from the plurality of subjects.
  • Or a subject having a high image height from the optical axis L1 of the interchangeable lens 3 may be selected from the plurality of subjects.
  • image blur correction in the first embodiment includes correction in the Y-axis direction when the camera 1 rotates in the pitch direction and correction in the X-axis direction when the camera 1 rotates in the Yaw direction.
  • the description of the first embodiment described above is representative of the correction in the Y-axis direction when the camera 1 rotates in the pitch direction.
  • A correction similar to that described above is also necessary for the X-axis direction. However, since the correction in the Y-axis direction when the camera 1 rotates in the pitch direction and the correction in the X-axis direction when the camera 1 rotates in the Yaw direction are the same except for the direction, the description of the correction in the X-axis direction is omitted.
  • image blur calculated by the translation blur calculation unit 202 is treated as substantially constant even if the position on the image plane 70 (imaging plane of the image sensor 22) is different.
  • the outline of the first embodiment is as follows.
  • the angle blur calculation unit 201 calculates the image blur by determining the position where the image blur is calculated as any position on the image plane 70.
  • the translation blur calculation unit 202 calculates the image blur by determining the position where the image blur is calculated, for example, at the center of the image plane 70.
  • the blur correction optical system target position calculation unit 203 adds, with positive and negative signs according to the blur direction along each of the X and Y axes, the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202. The amount of image blur at the determined position on the image plane 70 is then calculated from the added image blur in the X-axis and Y-axis directions.
  • the shake correction device of the camera 1 includes the shake sensor 39 that detects shake of the camera 1, the blur correction unit 21a that calculates, based on the output of the shake sensor 39, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system, and the CPU 21 that determines a position on the image plane 70.
  • the blur correction unit 21a calculates the image blur ⁇ y2 in the Y-axis direction based on the position determined by the CPU 21 and the shake in the Y-axis direction detected by the shake sensor 39, for example.
  • the image blur can be appropriately suppressed.
  • This is particularly effective when the focal length f of the interchangeable lens 3 is short (or when the angle of view becomes wide due to the relationship between the size of the image sensor 22 and the focal length f).
  • the blur correction unit 21a calculates a larger blur amount on the image plane 70 as the distance from the X-axis direction axis intersecting the Y-axis direction to the determined position is longer. Therefore, image blur can be suppressed appropriately even at a position where the image height is high.
  • the shake correction unit 21a calculates the shake amount based on the output of the shake sensor 39, the distance, and the focal length of the imaging optical system, so that even when the interchangeable lens 3 is replaced with one having a different focal length f, image blur can be appropriately suppressed.
  • the CPU 21 uses, as the determined position, the position on the image plane 70 of the focus area that is the target of focus adjustment of the imaging optical system. Therefore, it is possible to appropriately suppress image blurring at a position where there is a high possibility that the main subject exists.
  • the CPU 21 determines the determined position based on the contrast information of the subject image, so that the image is appropriately displayed at a position where there is a high possibility that the main subject exists. Blur can be suppressed.
  • the CPU 21 determines the determined position based on the luminance value information of the image of the subject 80, so that the main subject is highly likely to exist. Image blur can be suppressed appropriately.
  • the CPU 21 determines the determined position based on the subject recognition information based on the image of the subject 80, so that the main subject is highly likely to exist. , Image blur can be suppressed appropriately.
  • the CPU 21 determines the determined position based on the face recognition information based on the image of the subject 80, so that the main subject is highly likely to exist. , Image blur can be suppressed appropriately.
  • the CPU 21 determines the determined position according to the set imaging scene mode, so that an image is appropriately displayed at a position where there is a high possibility that the main subject exists. Blur can be suppressed.
  • the CPU 21 sets the position designated by the user operation on the image plane 70 as the determined position, so that the image blur is appropriately performed at the position desired by the user. Can be suppressed.
  • the CPU 21 sets, based on shooting distance information for example, the position corresponding to the subject 80 close to the camera 1 as the determined position, so that image blur can be appropriately suppressed at a position corresponding to the main subject.
  • the blur correction device includes the CPU 21, which detects the amount of motion due to a composition change based on the output of the shake sensor 39, and after the determined position has been set, the shake correction unit 21b calculates the shake amount based on a position obtained by changing the determined position according to that amount of motion. Accordingly, it is possible to appropriately suppress image blur at a position where there is a high possibility that the main subject exists after the composition change.
  • when a plurality of focus areas are selected, the CPU 21 sets as the determined position the center of gravity (representative position P) of the absolute values of the distances along the axes (X axis, Y axis) of the image plane 70, obtained from the positions of the plurality of focus areas, so that image blur can be appropriately suppressed such that the image blur at the positions of the plurality of focus areas becomes approximately the same.
  • Similarly, when there are a plurality of subjects, the CPU 21 sets as the determined position the center of gravity (representative position P) of the absolute values of the distances along the axes (X axis, Y axis) of the image plane 70, obtained from the positions of the plurality of subjects, so that image blur can be appropriately suppressed such that the image blur at the positions of the plurality of subjects becomes approximately the same.
  • Likewise, when there are a plurality of faces, the CPU 21 sets as the determined position the center of gravity (representative position P) of the absolute values of the distances along the axes (X axis, Y axis) of the image plane 70, obtained from the positions of the plurality of faces, so that image blur can be appropriately suppressed such that the image blur at the positions of the plurality of faces becomes approximately the same.
  • the CPU 21 sets the positions of the plurality of focus areas as the determined positions, and the blur correction unit 21b calculates an average value of a plurality of shake amounts calculated based on a plurality of determined positions. Thereby, it is possible to appropriately suppress the image blur so that the image blur at the positions of the plurality of focus areas becomes approximately the same.
  • the CPU 21 sets the positions of the plurality of main subjects as the determined positions, and the blur correction unit 21b calculates an average value of the plurality of shake amounts calculated based on the plurality of determined positions. Thereby, it is possible to appropriately suppress the image blur so that the image blur at the positions of the plurality of main subjects becomes approximately the same.
  • In the shake correction apparatus of (8), when there are a plurality of faces based on the face recognition information, the CPU 21 sets the positions of the plurality of faces as the determined positions, and the shake correction unit 21b calculates an average value of the plurality of blur amounts calculated based on the plurality of determined positions. Thereby, it is possible to appropriately suppress the image blur so that the image blur at the positions of the plurality of faces becomes approximately the same.
  • Modification 1 In the first embodiment, the image blur correction performed by the camera 1 by operating the blur correction drive mechanism 37 of the interchangeable lens 3 has been described as an example. Instead, in the first modification of the first embodiment, the camera 1 operates the blur correction drive mechanism 26 of the camera body 2 to perform image blur correction. Image blur correction according to the first modification of the first embodiment can be performed in the same manner as in the first embodiment, and the same effects as those in the first embodiment can be obtained.
  • Modification 2: In the third method (3) described in the first embodiment, that is, when the image blur at the position of the face (subject 80) is calculated, the CPU 21 may select the fifth method (5) above when, for example, the face appears large on the screen.
  • FIG. 7 is a diagram for explaining Modification 2 of the first embodiment. An example in which one representative position is determined from a plurality of candidates will be described with reference to FIG. 7. In FIG. 7, a large face (subject) is shown on the image plane 70. On the image plane 70, the CPU 21 sets as candidates, for example, two points: the detected left edge position Pa and right edge position Pb of the face.
  • the CPU 21 determines the two candidate positions as positions where image blur is calculated on the image plane 70.
  • the angle blur calculation unit 201 calculates image blur at a position Pa and a position Pb on the image plane 70, respectively.
  • the angle blur calculation unit 201 further obtains an average of the plurality of calculated image blurs, and performs image blur correction based on the average value of the image blurs.
  • the average value of image blur is obtained by, for example, a simple average, but may be obtained by a weighted average.
  • image blur correction can be performed so that the image blur at both ends of the face becomes the same.
  • Compared with a case where the size of the image blur differs between the left and right of the face, the discomfort perceived by the user can be suppressed.
  • the camera 1 may be a single lens reflex type illustrated in FIG. 1 or a mirrorless type without the mirror 24.
  • the camera 1 may be configured as a lens integrated type in which the interchangeable lens 3 and the camera body 2 are integrated.
  • the imaging apparatus is not limited to the camera 1 and may be a lens barrel provided with an imaging sensor, a smartphone provided with an imaging function, or the like.
  • FIG. 8 is a schematic diagram for explaining the angular velocity detection direction by the angular velocity sensor 39a and the image blur on the image plane 70 (the imaging plane of the imaging element 22).
  • In FIG. 8, the point where the image plane 70 and the optical axis L1 of the interchangeable lens 3 intersect is the origin of the coordinates, the optical axis L1 of the interchangeable lens 3 is the Z axis, and the image plane 70 is represented as the XY plane.
  • the optical axis L1 intersects the center of the imaging surface.
  • the interchangeable lens 3 and the subject 80 are positioned in the Z axis plus direction with respect to the image plane 70.
  • the angular velocity sensor 39a detects, for example, the rotation angle θ around the axis (small-x axis) parallel to the X axis (Pitch direction).
  • the symbol f in FIGS. 3 and 4 represents the focal length.
  • the image of the subject 80 located at the coordinates (xp, yp) on the image plane 70 before the shake moves in the Y-axis minus direction and the X-axis plus direction after the shake. Accordingly, the coordinates of the image of the subject 80 become (xp + Δx2, yp − Δy2).
  • the mathematical expression representing the image blur ⁇ y2 in the Y-axis direction is the above expression (1), as in the case described in the first embodiment.
  • the image blur Δx2 in the X-axis direction is expressed by the following formula (3).
  • Δx2 = f·xp / [(f² + yp²)^(1/2)·cos(θ + tan⁻¹(yp/f))] − xp … (3)
  • the rotation angle in the pitch direction (which represents a camera shake angle and is generally about 0.5 degrees) is θ.
  • the symbol f in FIGS. 3 and 4 represents the focal length of the interchangeable lens 3.
  • According to the above equation (3), since the rotation angle θ (camera shake angle) is generally about 0.5 degrees, it can be considered that Δx2 ≈ 0 when the focal length f is sufficiently larger than yp. In other words, whether the position of the image of the subject 80 on the image plane 70 is at the center of the image plane 70 (in this example, the origin) or at a position away from the center, that is, even if the distance from the optical axis L1 differs, the image blur when the rotation angle θ in the pitch direction is detected needs to be considered only in the Y-axis direction and can be ignored in the X-axis direction.
  • Therefore, image blur can be suppressed both for the image of the subject 80 positioned at the center of the image plane 70 and for the image of a subject 80 at a position away from the center of the image plane 70.
  • However, when the interchangeable lens 3 is a wide-angle lens and the focal length f cannot be said to be sufficiently larger than yp, Δx2 ≠ 0 according to the above equation (3). Therefore, when the rotation angle θ in the pitch direction is detected, it is necessary not only to calculate the image blur in the Y-axis direction by the above equation (1) but also to calculate the image blur in the X-axis direction by the above equation (3). Otherwise, the image blur in the X-axis direction corresponding to the image blur Δx2 of the above equation (3) cannot be suppressed and remains.
  • the image blur ⁇ x2 increases as the position at which the image blur is calculated moves toward the periphery of the image plane 70, that is, as the image height increases.
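  • The following sketch (not part of the patent text) evaluates formulas (1) and (3) at an assumed off-axis position (xp, yp) = (12 mm, 8 mm) for an assumed shake angle of 0.5 degrees; the focal lengths are again illustrative. In the wide-angle case the X-direction blur Δx2 is roughly an order of magnitude larger than in the telephoto case and is no longer negligible.

```python
import math

def pitch_blur_xy(f_mm, theta_rad, xp_mm, yp_mm):
    """Blur caused by a pitch rotation at off-axis position (xp, yp):
    formula (1) for the Y direction, formula (3) for the X direction."""
    phi = theta_rad + math.atan(yp_mm / f_mm)
    dy2 = f_mm * math.tan(phi) - yp_mm
    dx2 = f_mm * xp_mm / (math.hypot(f_mm, yp_mm) * math.cos(phi)) - xp_mm
    return dx2, dy2

theta = math.radians(0.5)      # typical camera-shake angle quoted in the text
xp, yp = 12.0, 8.0             # assumed off-axis position in mm (illustrative)

for f in (200.0, 20.0):        # assumed telephoto and wide-angle focal lengths
    dx2, dy2 = pitch_blur_xy(f, theta, xp, yp)
    print(f"f={f:5.0f} mm: dx2={dx2 * 1000:6.1f} um, dy2={dy2 * 1000:6.1f} um")
```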
  • the CPU 21 determines the position at which image blur is calculated on the image plane 70 in the same manner as in the first embodiment. That is, the CPU 21 selects one of the methods (1) to (4) and determines the position at which the image blur is calculated on the image plane 70. Then, the angle blur calculation unit 201 calculates the image blur at the position determined by the CPU 21. The blur correction optical system target position calculation unit 203 calculates the image blur amount based on the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202.
  • the image blur correction in the second embodiment includes correction in the Y-axis direction when the camera 1 rotates in the pitch direction and correction in the X-axis direction when the camera 1 rotates in the Yaw direction.
  • As in the first embodiment, the image blur calculated by the translation blur calculation unit 202 is treated as substantially constant even if the position on the image plane 70 differs.
  • the outline of the second embodiment is as follows.
  • the angle blur calculation unit 201 calculates the image blur by determining the position at which the image blur is calculated as any position on the image plane 70. At this time, for example, when the rotation angle θ in the pitch direction is detected, not only the image blur in the Y-axis direction is calculated by the above equation (1), but also the image blur in the X-axis direction is calculated by the above equation (3).
  • the translation blur calculation unit 202 calculates the image blur by determining the position where the image blur is calculated, for example, at the center of the image plane 70.
  • the blur correction optical system target position calculation unit 203 adds, with positive and negative signs according to the blur direction along each of the X and Y axes, the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202. The amount of image blur at the determined position on the image plane 70 is then calculated from the added image blur in the X-axis and Y-axis directions.
  • the shake correction device of the camera 1 includes the shake sensor 39 that detects shake of the device in the Y-axis direction, and the blur correction unit 21a that calculates, based on the output of the shake sensor 39, the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system.
  • the blur correction unit 21a calculates image blur in the X-axis direction that intersects the Y-axis direction. As a result, it is possible to suppress image blurring in the direction of the X-axis that intersects the Y-axis in which the shake is detected.
  • the blur correction unit 21a calculates the image blur in the Y-axis direction, and thus can suppress the image blur in the Y-axis direction in which the shake is detected.
  • the blur correction apparatus further includes a CPU 21 that determines a position on the image plane 70.
  • the blur correction unit 21a calculates the image blur amounts in the X-axis direction and the Y-axis direction based on the position determined by the CPU 21 and the rotation angle in the Y-axis direction detected by the shake sensor 39. Thereby, even when the position on the image plane 70 determined by the CPU 21 is a position other than the center of the image plane 70, the image blur can be appropriately suppressed.
  • FIG. 9 is a diagram illustrating an example in which distortion (for example, barrel shape) is generated by the interchangeable lens 3.
  • a large number of solid circles represent images of the subject 80 when it is assumed that the interchangeable lens 3 has no distortion.
  • a large number of hatched circles indicate images of the subject 80 that are distorted by the influence of barrel distortion based on the optical characteristics of the interchangeable lens 3.
  • the distortion aberration of the interchangeable lens 3 varies depending on the design, but is often large in a wide-angle lens having a short focal length.
  • the amount of distortion increases as the distance from the optical axis L1 of the imaging optical system (when the center O of the image plane 70 is aligned with the optical axis L1, the distance from the center O of the image plane 70) increases.
  • the distortion amount appears as a positional deviation between the solid line circle and the hatched circle shown in FIG.
  • the positional deviation between the solid circle and the hatched circle becomes the largest at a position where the distance from the center O of the image plane 70 is long (in other words, the image height is high).
  • the positional deviation is Δx in the X-axis direction and Δy in the Y-axis direction.
  • the schematic diagram illustrated in FIG. 8 is represented as having no distortion due to the imaging optical system, as indicated by the solid circles in FIG. 9. Therefore, for example, when the position at which image blur is calculated on the image plane 70 is set at a position away from the center O of the image plane 70 and the image blur correction described in the second embodiment is performed as it is while distortion aberration exists, image blur that cannot be corrected occurs.
  • Therefore, in this modification, image blur correction is performed on the assumption that there is distortion due to the imaging optical system, as indicated by the hatched circles in FIG. 9.
  • distortion aberration information indicating in which direction and by what amount each position on the image plane 70 is displaced by distortion is known as design information of the interchangeable lens 3. Therefore, the distortion aberration information of the interchangeable lens 3 attached to the camera body 2 is recorded in the memory 28 in advance.
  • When the CPU 21 detects that an interchangeable lens 3 having large distortion aberration is attached, it reads out the corresponding distortion aberration information from the memory 28 and uses it in the above-described calculation of the image blur.
  • the blur correction optical system target position calculation unit 203 of the blur correction unit 21a adds, for each of the X axis and the Y axis, the image blur calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, applying positive and negative signs according to the blur direction and to the distortion aberration information read from the memory 28. The amount of image blur at the determined position on the image plane 70 is then calculated from the added image blur in the X-axis and Y-axis directions, as sketched below.
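  • The patent text does not specify the format of the stored distortion data or the exact way it enters the addition, so the following is only a rough sketch under assumed interfaces: a hypothetical distortion_displacement() lookup (here a simple barrel model with an assumed coefficient k1) stands in for the data read from the memory 28, and the angle blur of formulas (1) and (3) is evaluated at the distorted position of the determined point before the signed per-axis addition.

```python
import math

def distortion_displacement(x_mm, y_mm, k1=-2.0e-4):
    """Hypothetical barrel-distortion lookup standing in for the lens data
    stored in memory 28; the real data format is not given in the text."""
    r2 = x_mm * x_mm + y_mm * y_mm
    return k1 * r2 * x_mm, k1 * r2 * y_mm

def angle_blur_xy(f_mm, theta_rad, xp_mm, yp_mm):
    """Pitch-rotation blur at (xp, yp): formula (1) for Y, formula (3) for X."""
    phi = theta_rad + math.atan(yp_mm / f_mm)
    dy = f_mm * math.tan(phi) - yp_mm
    dx = f_mm * xp_mm / (math.hypot(f_mm, yp_mm) * math.cos(phi)) - xp_mm
    return dx, dy

def blur_with_distortion(f_mm, theta_rad, pos, translation_blur=(0.0, 0.0)):
    """Evaluate the angle blur at the distorted image position of the
    determined point, then do the signed per-axis addition with the
    translation blur."""
    ddx, ddy = distortion_displacement(*pos)
    ax, ay = angle_blur_xy(f_mm, theta_rad, pos[0] + ddx, pos[1] + ddy)
    return ax + translation_blur[0], ay + translation_blur[1]

print(blur_with_distortion(20.0, math.radians(0.5), (12.0, 8.0)))
```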
  • the interchangeable lens 3A is attached to the camera body 2A.
  • the interchangeable lens 3A is different from the interchangeable lens 3 in that a shake correction unit 40 is added.
  • a detection signal from the shake sensor 39 is sent to the shake correction unit 40.
  • the camera body 2A is different from the camera body 2 in that a shake sensor (motion detection unit, shake detection unit) 31 is added.
  • a detection signal from the shake sensor 31 is sent to the CPU 21 (blur correction unit 21a).
  • the shake sensor 31 has the same function as the shake sensor 39.
  • When the interchangeable lens 3A including the blur correction drive mechanism 37 is attached to the camera body 2A, image blur correction performed by operating the blur correction drive mechanism 37 of the interchangeable lens 3A and image blur correction performed by operating the blur correction drive mechanism 26 of the camera body 2A are used together.
  • When an interchangeable lens 3A that does not include the shake correction drive mechanism 37 is attached to the camera body 2A, the shake correction drive mechanism 26 of the camera body 2A is operated to perform image blur correction similar to that of Modification 1 of the first embodiment.
  • FIG. 10 is a diagram showing a main configuration of a camera 1A according to the third embodiment.
  • the camera 1A includes a camera body 2A and an interchangeable lens 3A.
  • the interchangeable lens 3A is attached to the camera body 2A via a mount portion (not shown).
  • the camera body 2A and the interchangeable lens 3A are electrically connected, and communication is possible between the camera body 2A and the interchangeable lens 3A. Communication between the camera body 2A and the interchangeable lens 3A may be performed by wireless communication.
  • in FIG. 10, the same components as those in FIG. 1 are denoted by the same reference numerals as in FIG. 1.
  • FIG. 11 is a diagram illustrating the blur correction unit 40 of the interchangeable lens 3A.
  • the shake correction unit 40 includes an angle shake calculation unit 401, a translational shake calculation unit 402, and a shake correction optical system target position calculation unit 403.
  • the angle blur calculation unit 401 uses the detection signal about the axis parallel to the X axis (Pitch direction) detected by the angular velocity sensor 39a to calculate the image blur in the Y-axis direction due to the rotational motion and, if necessary, the image blur in the X-axis direction.
  • the angle blur calculation unit 401 also uses the detection signal about the axis parallel to the Y axis (Yaw direction) detected by the angular velocity sensor 39a to calculate the image blur in the X-axis direction due to the rotational motion and, if necessary, the image blur in the Y-axis direction.
  • the translation blur calculation unit 402 calculates the image blur in the X-axis direction due to the translational motion using the detection signal in the X-axis direction by the acceleration sensor 39b.
  • the translation blur calculation unit 402 calculates the image blur in the Y-axis direction due to the translational motion using the detection signal in the Y-axis direction from the acceleration sensor 39b.
  • the blur correction optical system target position calculation unit 403 calculates the image blur in the X-axis direction and the Y-axis direction by adding the image blur in the X-axis and Y-axis directions calculated by the angle blur calculation unit 401 and the image blur in the X-axis and Y-axis directions calculated by the translation blur calculation unit 402.
  • the blur correction optical system target position calculation unit 403 then calculates the image blur amount at a position (described later) on the image plane 70, based on the image blur in the X-axis and Y-axis directions after the addition, the photographing magnification (calculated based on the position of the zoom optical system 31), and the distance from the camera 1A to the subject 80 (calculated based on the position of the focus optical system 32).
  • the blur correction optical system target position calculation unit 403 calculates the target position of the blur correction optical system 33 based on the calculated image blur amount in order to operate the blur correction drive mechanism 37 of the interchangeable lens 3A. Then, the shake correction optical system target position calculation unit 403 sends a signal indicating the target position to the shake correction drive mechanism 37 of the interchangeable lens 3A.
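  • A structural sketch of the pipeline just described is given below. The conversion from the image blur amount to a target shift of the blur correction optical system 33 is reduced to a single assumed sensitivity factor; the exact relation (including the dependence on photographing magnification and subject distance) is not reproduced here, and all names are illustrative.

```python
def lens_target_position(angle_blur_xy, trans_blur_xy, sensitivity=1.0):
    """Sketch of the blur correction optical system target position
    calculation unit 403: add angular and translational image blur per axis,
    then convert the blur to be cancelled into a shift of the blur correction
    optical system 33. 'sensitivity' (image shift per unit lens shift) is an
    assumed stand-in for the omitted optical relation."""
    blur_x = angle_blur_xy[0] + trans_blur_xy[0]
    blur_y = angle_blur_xy[1] + trans_blur_xy[1]
    # Move the correction optical system so as to cancel the image blur.
    return -blur_x / sensitivity, -blur_y / sensitivity
```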
  • the camera 1A may be a single-lens reflex type illustrated in FIG. 10 or a mirrorless type without the mirror 24.
  • the interchangeable lens 3A and the camera body 2A may also be configured integrally as a lens-integrated camera.
  • the image blur calculation by the angle blur calculation unit 201 and the image blur calculation by the translation blur calculation unit 202 are the same as those in the first embodiment and the second embodiment. However, it differs from the first embodiment and the second embodiment in the following points.
  • one of the differences is that the center of the image plane 70 is selected as the position for calculating image blur in the image blur correction using the interchangeable lens 3A, whereas an arbitrary position on the image plane 70 is selected as the position for calculating image blur in the image blur correction using the camera body 2A.
  • another difference is that the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A are performed based on a sharing ratio determined by the CPU 21 of the camera body 2A. The sharing ratio is described later.
  • the CPU 21 sets the position at which the blur correction unit 40 of the interchangeable lens 3A calculates image blur to, for example, the center of the image plane 70, and sets the position at which the blur correction unit 21a of the camera body 2A calculates image blur to an arbitrary position on the image plane 70. Accordingly, the angle blur calculation unit 401 of the interchangeable lens 3A calculates the blur correction amount V(L) based on the image blur at the center position of the image plane 70 and the sharing ratio of the interchangeable lens 3A determined by the CPU 21.
  • the angle blur calculation unit 201 of the camera body 2A calculates the blur correction amount V(B) based on the image blur at a position different from the center of the image plane 70 determined by the CPU 21 and the sharing ratio of the camera body 2A determined by the CPU 21.
  • the CPU 21 determines the position by any one of methods (1) to (4) in the first embodiment.
  • the mathematical expression representing the image blur Δy1 in the Y-axis direction is the above formula (2), as described in the first embodiment.
  • the mathematical expression representing the image blur Δy2 in the Y-axis direction is the above formula (1), as described in the first embodiment.
  • the CPU 21 determines a sharing ratio between the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A. For example, the CPU 21 of this example sets the sharing ratio to 50:50. This ratio may be 70:30 or 40:60.
  • the angle blur calculation unit 401 of the interchangeable lens 3A obtains the image blur V(L) to be shared by the interchangeable lens 3A as shown in the following equation (4). V(L) = Δy1 / 2 … (4)
  • the right side is halved because the sharing ratio is set to 50%.
  • Δy1 is the image blur in the Y-axis direction at the center of the image plane 70.
  • the rotation angle in the pitch direction (which represents a camera shake angle and is generally about 0.5 degrees) is defined as θ.
  • the symbol f represents the focal length of the interchangeable lens 3A.
  • the angle blur calculation unit 201 of the camera body 2A obtains the image blur V(B) to be shared by the camera body 2A as shown in the following equation (5). V(B) = Δy1 / 2 + d … (5)
  • d = Δy2 − Δy1.
  • Δy2 is the image blur in the Y-axis direction at a position different from the center of the image plane 70.
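  • A short numerical sketch of this sharing calculation follows; it takes Δy1 (formula (2)) and Δy2 (formula (1)) as already-computed inputs and treats the sharing ratio as a parameter, so that the 50:50 case of equations (4) and (5) and the 100:0 variant described later are both covered. Function names and the sample values are assumptions for illustration.

```python
def share_image_blur(dy1, dy2, lens_ratio=0.5):
    """Split the Y-axis image blur between interchangeable lens 3A and camera
    body 2A in the manner of equations (4) and (5).

    dy1: image blur at the center of image plane 70 (formula (2))
    dy2: image blur at the position determined by CPU 21 (formula (1))
    lens_ratio: sharing ratio assigned to the interchangeable lens 3A
    """
    d = dy2 - dy1                          # difference d = Δy2 - Δy1
    v_lens = lens_ratio * dy1              # V(L): corrected via optical system 33
    v_body = (1.0 - lens_ratio) * dy1 + d  # V(B): corrected via image sensor 22
    return v_lens, v_body

# Example with a 50:50 sharing ratio (values in millimetres, assumed):
v_l, v_b = share_image_blur(dy1=0.030, dy2=0.036, lens_ratio=0.5)
print(v_l, v_b)  # approximately 0.015 and 0.021, i.e. V(B) = V(L) + d
```
  • With lens_ratio set to 1.0 the same function reproduces the 100:0 variant mentioned later (V(L) = Δy1 and V(B) = d).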
  • the blur correction optical system target position calculation unit 403 of the interchangeable lens 3A calculates, based on the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402, the target position of the blur correction optical system 33 for the image blur correction performed by operating the blur correction drive mechanism 37.
  • the blur correction optical system target position calculation unit 203 of the camera body 2A calculates, based on the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, the target position of the image sensor 22 for the image blur correction performed by operating the blur correction drive mechanism 26 of the camera body 2A.
  • the shake correction optical system target position calculation unit 403 of the interchangeable lens 3A further sends a signal indicating the target position to the shake correction drive mechanism 37 of the interchangeable lens 3A.
  • the shake correction optical system target position calculation unit 203 of the camera body 2A further sends a signal indicating the target position to the shake correction drive mechanism 26 of the camera body 2A.
  • in this way, the image blur correction based on the image blur calculated by the angle blur calculation unit 401 at the center position of the image plane 70 is performed as the image blur correction by the interchangeable lens 3A, and the image blur correction based on the image blur calculated by the angle blur calculation unit 201 at a position different from the center of the image plane 70 is performed as the image blur correction by the camera body 2A.
  • the image blur correction in the third embodiment includes correction in the Y-axis direction when the camera 1A rotates in the pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction.
  • the description of the third embodiment above is representative of the correction in the Y-axis direction when the camera 1A rotates in the Pitch direction. When the camera 1A rotates in the Yaw direction, correction similar to that described above is necessary for the X-axis direction. Since the correction in the Y-axis direction when the camera 1A rotates in the Pitch direction and the correction in the X-axis direction when the camera 1A rotates in the Yaw direction are the same except for the direction, the description of the X-axis direction is omitted.
  • as in the first and second embodiments, the image blur calculated by the translation blur calculation unit 202 and the translation blur calculation unit 402 is treated as substantially constant even if the position on the image plane 70 differs.
  • the outline of the third embodiment is as follows.
  • the angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates the image blur at the center position of the image plane 70.
  • the angle blur calculation unit 201 of the blur correction unit 21a of the camera body 2A calculates the image blur at a position different from the center of the image plane 70.
  • the angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) assigned to the interchangeable lens 3A (for a sharing ratio of, for example, 50%) to 1/2 of the image blur Δy1 at the center of the image plane 70, and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) shared by the camera body 2A to V(L) + d, where d is the difference between the image blur Δy2 at a position different from the center of the image plane 70 and Δy1.
  • the translation blur calculation unit 402 of the interchangeable lens 3A sets the image blur to be assigned to the interchangeable lens 3A (for example, a sharing ratio of 50%) to, for example, 1/2 of the image blur at the center of the image plane 70.
  • the translation blur calculation unit 202 of the camera body 2A sets the image blur shared by the camera body 2A to, for example, half of the image blur at the center of the image plane 70.
  • the blur correction optical system target position calculation unit 403 of the interchangeable lens 3A adds the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402 for each of the X axis and the Y axis, attaching a positive or negative sign according to the direction along each axis. Then, the image blur amount at the center position of the image plane 70 is calculated based on the image blur in the X-axis direction and the Y-axis direction after the addition.
  • the blur correction optical system target position calculation unit 203 of the camera body 2A adds the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202 for each of the X axis and the Y axis, attaching a positive or negative sign according to the direction along each axis. Then, the image blur amount at a position different from the center of the image plane 70 is calculated based on the image blur in the X-axis direction and the Y-axis direction after the addition.
  • as described above, the blur correction apparatus of the camera 1A includes: the interchangeable lens 3A, which has the shake sensor 39 that detects the shake of the apparatus, the blur correction unit 40 that calculates the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system based on the output of the shake sensor 39, and the blur correction drive mechanism 37 that moves the blur correction optical system 33 in a direction that suppresses the blur amount based on the output of the blur correction unit 40; and the camera body 2A, which has the shake sensor 31 that detects the shake of the apparatus, the blur correction unit 21a that calculates the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system based on the output of the shake sensor 31, the blur correction drive mechanism 26 that moves the image sensor 22, which captures the image of the subject 80 on the image plane 70, in a direction that suppresses the blur amount based on the output of the blur correction unit 21a, and the CPU 21 that determines a position on the image plane 70.
  • the blur correction unit 40 of the interchangeable lens 3A calculates an image blur Δy1 based on a first position predetermined on the image plane 70 (the center of the image plane 70) and the shake detected by the shake sensor 39.
  • the blur correction unit 40 sets the image blur V(L) to be assigned to the interchangeable lens 3A (for example, a sharing ratio of 50%) to 1/2 of the image blur Δy1.
  • the blur correction unit 21a of the camera body 2A calculates an image blur Δy2 based on the second position determined by the CPU 21 (a position different from the center) and the shake detected by the shake sensor 31, and an image blur Δy1 based on the first position predetermined on the image plane 70 (the center of the image plane 70) and the shake detected by the shake sensor 31.
  • the blur correction unit 21a further calculates the difference d between the image blur Δy2 and the image blur Δy1.
  • the angle blur calculation unit 201 sets the image blur V (B) shared by the camera body 2A to V (L) + d. Thereby, even when the position determined by the CPU 21 is other than the center of the image plane 70, the image blur can be appropriately suppressed.
  • the blur correction unit 40 of the interchangeable lens 3A outputs 50% of the image blur Δy1 to the blur correction drive mechanism 37, and the blur correction unit 21a of the camera body 2A outputs the remaining 50% of the image blur Δy1 plus the difference d to the blur correction drive mechanism 26.
  • the movement distances by the shake correction drive mechanism 26 and the shake correction drive mechanism 37 can be suppressed to be small.
  • the sharing ratio determined by the CPU 21 may be set to 100% for image blur correction by the interchangeable lens 3A and 0% for image blur correction by the camera body 2A.
  • in that case, the angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) shared by the interchangeable lens 3A to 100% (that is, the image blur Δy1), and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) shared by the camera body 2A to d, where d is the difference between the image blur Δy2 at a position different from the center of the image plane 70 and the image blur Δy1 at the center of the image plane 70.
  • the CPU 21 determines, for example, two positions (referred to as a first position and a second position) on the image plane 70 as positions for calculating the image blur.
  • the angle blur calculation unit 401 of the interchangeable lens 3A calculates image blur for the first position determined by the CPU 21.
  • the angle blur calculation unit 201 of the camera body 2A calculates image blur for the first position and the second position determined by the CPU 21.
  • the CPU 21 determines the first position and the second position for calculating the image blur by one of the methods (1) to (4) in the first embodiment.
  • Modification 4 of the third embodiment differs from the third embodiment in that both the first position and the second position may be different from the center of the image plane 70.
  • the CPU 21 determines the sharing ratio between the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A, as in the third embodiment.
  • the mathematical expression representing the image blur Δy1 in the Y-axis direction can be expressed by the above formula (2), as described in the first embodiment. Further, when the first position and the second position for calculating the image blur are positions different from the center of the image plane 70, the mathematical expression representing the image blur Δy2 in the Y-axis direction is the above formula (1), as described in the first embodiment.
  • the angle blur calculation unit 401 of the interchangeable lens 3A obtains the image blur V(L) to be shared by the interchangeable lens 3A as shown in the following formula (6). V(L) = Δy2a / 2 … (6)
  • Δy2a is the image blur in the Y-axis direction at the first position, which is different from the center of the image plane 70.
  • the angle blur calculation unit 201 of the camera body 2A obtains the image blur V(B) to be shared by the camera body 2A as shown in the following equation (7). V(B) = Δy2a / 2 + d2 … (7)
  • d2 = Δy2b − Δy2a.
  • Δy2b is the image blur in the Y-axis direction at the second position, which is different from the center of the image plane 70.
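  • The sharing of formulas (6) and (7) can be sketched in the same way as before; the function below generalizes the earlier sketch to two arbitrary positions and is illustrative only (names and the ratio handling are assumptions).

```python
def share_blur_two_positions(dy2a, dy2b, lens_ratio=0.5):
    """Sharing for Modification 4, where both positions may be off-center.

    dy2a: image blur at the first position on image plane 70
    dy2b: image blur at the second position on image plane 70
    lens_ratio: sharing ratio assigned to the interchangeable lens 3A
    """
    d2 = dy2b - dy2a                         # d2 = Δy2b - Δy2a
    v_lens = lens_ratio * dy2a               # V(L) of formula (6)
    v_body = (1.0 - lens_ratio) * dy2a + d2  # V(B) of formula (7)
    return v_lens, v_body
```
  • Setting lens_ratio to 1.0 reproduces the 100:0 variant described later in this modification (V(L) = Δy2a, V(B) = d2).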
  • the blur correction optical system target position calculation unit 403 of the interchangeable lens 3A calculates, based on the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402, the target position of the blur correction optical system 33 for the image blur correction performed by operating the blur correction drive mechanism 37.
  • the blur correction optical system target position calculation unit 203 of the camera body 2A calculates, based on the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202, the target position of the image sensor 22 for the image blur correction performed by operating the blur correction drive mechanism 26 of the camera body 2A.
  • the shake correction optical system target position calculation unit 403 of the interchangeable lens 3A further sends a signal indicating the target position to the shake correction drive mechanism 37 of the interchangeable lens 3A.
  • the shake correction optical system target position calculation unit 203 of the camera body 2A further sends a signal indicating the target position to the shake correction drive mechanism 26 of the camera body 2A.
  • in this way, the image blur correction based on the image blur calculated by the angle blur calculation unit 401 at the first position of the image plane 70 is performed as the image blur correction by the interchangeable lens 3A, and the image blur correction based on the image blur calculated by the angle blur calculation unit 201 at the second position of the image plane 70 is performed as the image blur correction by the camera body 2A.
  • the image blur correction in the fourth modification of the third embodiment includes correction in the Y-axis direction when the camera 1A rotates in the pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction. Including.
  • the description of Modification 4 of the above-described third embodiment is representative of the correction in the Y-axis direction when the camera 1A rotates in the pitch direction. For this reason, when the camera 1A rotates in the Yaw direction, correction similar to the correction described above is necessary for the X-axis direction.
  • as in the first to third embodiments, the image blur calculated by the translation blur calculation unit 202 and the translation blur calculation unit 402 is treated as substantially constant even if the position on the image plane 70 (the imaging plane of the image sensor 22) differs.
  • the outline of Modification 4 of the third embodiment is as follows.
  • the angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates the image blur at the first position on the image plane 70.
  • the angle blur calculation unit 201 of the blur correction unit 21a of the camera body 2A calculates the image blur at the second position on the image plane 70.
  • the angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) assigned to the interchangeable lens 3A (for a sharing ratio of, for example, 50%) to 1/2 of the image blur Δy2a at the first position of the image plane 70, and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) shared by the camera body 2A to V(L) + d2, where d2 is the difference between the image blur Δy2b at the second position of the image plane 70 and Δy2a.
  • the translation blur calculation unit 402 of the interchangeable lens 3A sets the image blur to be assigned to the interchangeable lens 3A (for example, a sharing ratio of 50%) to, for example, 1/2 of the image blur at the center of the image plane 70.
  • the translation blur calculation unit 202 of the camera body 2A sets the image blur shared by the camera body 2A to, for example, half of the image blur at the center of the image plane 70.
  • the blur correction optical system target position calculation unit 403 of the interchangeable lens 3A adds the image blur V(L) calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402 for each of the X axis and the Y axis, attaching a positive or negative sign according to the direction along each axis. Then, the image blur amount at the first position of the image plane 70 is calculated based on the image blur in the X-axis direction and the Y-axis direction after the addition.
  • the blur correction optical system target position calculation unit 203 of the camera body 2A adds the image blur V(B) calculated by the angle blur calculation unit 201 and the image blur calculated by the translation blur calculation unit 202 for each of the X axis and the Y axis, attaching a positive or negative sign according to the direction along each axis. Then, the image blur amount at the second position of the image plane 70 is calculated based on the image blur in the X-axis direction and the Y-axis direction after the addition.
  • as described above, the blur correction apparatus of the camera 1A includes: the interchangeable lens 3A, which has the shake sensor 39 that detects the shake of the apparatus, the blur correction unit 40 that calculates the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system based on the output of the shake sensor 39, and the blur correction drive mechanism 37 that moves the blur correction optical system 33 in a direction that suppresses the blur amount based on the output of the blur correction unit 40; and the camera body 2A, which has the shake sensor 31 that detects the shake of the apparatus, the blur correction unit 21a that calculates the blur amount of the image of the subject 80 formed on the image plane 70 by the imaging optical system based on the output of the shake sensor 31, the blur correction drive mechanism 26 that moves the image sensor 22, which captures the image of the subject 80 on the image plane 70, in a direction that suppresses the blur amount based on the output of the blur correction unit 21a, and the CPU 21 that determines the first position and the second position on the image plane 70.
  • the blur correction unit 40 of the interchangeable lens 3A calculates an image blur Δy2a based on the first position and the shake detected by the shake sensor 39.
  • the blur correction unit 40 sets the image blur V(L) to be assigned to the interchangeable lens 3A (for example, a sharing ratio of 50%) to 1/2 of the image blur Δy2a.
  • the blur correction unit 21a of the camera body 2A calculates an image blur Δy2a based on the first position and the shake detected by the shake sensor 31, and an image blur Δy2b based on the second position and the shake detected by the shake sensor 31.
  • the blur correction unit 21a further calculates the difference d2 between the image blur Δy2a and the image blur Δy2b.
  • the angle blur calculation unit 201 sets the image blur V (B) shared by the camera body 2A to V (L) + d2.
  • image blur can be appropriately suppressed at the second position determined by the CPU 21 other than the center of the image plane 70.
  • the blur correction unit 40 of the interchangeable lens 3A outputs 50% of the image blur Δy2a to the blur correction drive mechanism 37, and the blur correction unit 21a of the camera body 2A outputs the remaining 50% of Δy2a plus the difference d2 to the blur correction drive mechanism 26.
  • the movement distances by the shake correction drive mechanism 26 and the shake correction drive mechanism 37 can be suppressed to be small.
  • the sharing ratio determined by the CPU 21 may be set to 100% for image blur correction by the interchangeable lens 3A and 0% for image blur correction by the camera body 2A.
  • in that case, the angle blur calculation unit 401 of the interchangeable lens 3A sets the image blur V(L) shared by the interchangeable lens 3A to 100% (that is, the image blur Δy2a), and the angle blur calculation unit 201 of the camera body 2A sets the image blur V(B) shared by the camera body 2A to d2.
  • d2 is the difference between the image blur Δy2a at the first position, which is different from the center of the image plane 70, and the image blur Δy2b at the second position, which is also different from the center of the image plane 70.
  • the image blur correction calculation based on the image blur V(B) of the above formula (5) or the above formula (7) may be performed by the blur correction unit 40 of the interchangeable lens 3A, and the image blur correction calculation based on the image blur V(L) of the above formula (4) or the above formula (6) may be performed by the blur correction unit 21a of the camera body 2A.
  • in other words, the position on the image plane 70 at which the image blur for the image blur correction by the interchangeable lens 3A is calculated and the position on the image plane 70 at which the image blur for the image blur correction by the camera body 2A is calculated may be exchanged relative to the third embodiment or Modification 4 of the third embodiment.
  • the angle blur calculation unit 201 and the angle blur calculation unit 401 perform addition calculation by adding a positive / negative sign to the X axis and the Y axis depending on the blur direction.
  • in the fourth embodiment, image blur correction is performed exclusively by the interchangeable lens 3A using the camera 1A of FIG. 10.
  • the camera 1A may be a single-lens reflex type illustrated in FIG. 10 or a mirrorless type without the mirror 24.
  • the interchangeable lens 3A and the camera body 2A may be configured as a lens-integrated camera.
  • the CPU 21 of the camera body 2A in the fourth embodiment determines a position on the image plane 70 where the image of the main subject is highly likely to exist, by, for example, any one of methods (1) to (4) in the first embodiment. Then, the CPU 21 transmits information indicating the position determined on the image plane 70 to the blur correction unit 40 of the interchangeable lens 3A.
  • the timing at which the CPU 21 of the camera body 2A transmits the information on the position at which image blur is calculated on the image plane 70 to the blur correction unit 40 is, for example, when the CPU 21 newly determines or updates the position at which image blur is calculated on the image plane 70.
  • the CPU 21 includes the position information in the steady communication between the camera body 2A and the interchangeable lens 3A, or in the communication instructing the start of image blur correction from the camera body 2A to the interchangeable lens 3A, so that the position information is promptly notified to the blur correction unit 40.
  • the angle blur calculation unit 401 of the blur correction unit 40 calculates the image blur at the position indicated by the information received from the CPU 21, and performs the image blur correction based on the image blur.
  • the mathematical expression representing the image blur Δy1 in the Y-axis direction is the above formula (2), as described in the first embodiment.
  • the mathematical expression representing the image blur Δy2 in the Y-axis direction is the above formula (1), as described in the first embodiment.
  • the image blur correction in the fourth embodiment includes correction in the Y-axis direction when the camera 1A rotates in the pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction.
  • the above formula (1) and the above formula (2) represent correction in the Y-axis direction when the camera 1A rotates in the pitch direction.
  • when the camera 1A rotates in the Yaw direction, correction similar to that described above is necessary for the X-axis direction. Since the correction in the Y-axis direction when the camera 1A rotates in the Pitch direction and the correction in the X-axis direction when the camera 1A rotates in the Yaw direction are the same except for the direction, the description of the X-axis direction is omitted.
  • the image blur calculated by the translation blur calculation unit 402 is treated as substantially constant even if the position on the image plane 70 (the imaging plane of the image sensor 22) differs.
  • the outline of the fourth embodiment is as follows.
  • the angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates the image blur by setting the position where the image blur is calculated on the image plane 70 to the position notified from the CPU 21 of the camera body 2A.
  • the translation blur calculation unit 402 calculates an image blur at the center of the image plane 70, for example.
  • the blur correction optical system target position calculation unit 403 adds the image blur calculated by the angle blur calculation unit 401 and the image blur calculated by the translation blur calculation unit 402 for each of the X axis and the Y axis, attaching a positive or negative sign according to the direction along each axis. Then, the image blur amount at the position on the image plane 70 notified from the CPU 21 of the camera body 2A is calculated based on the image blur in the X-axis direction and the Y-axis direction after the addition, as sketched below.
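  • A sketch of this lens-side flow is given below, with the body-to-lens position notification reduced to a plain function argument. The callable standing in for the angle blur calculation (formulas (1) and (2)) and all names are assumptions for illustration; the exact formulas are not reproduced here.

```python
def lens_side_blur(notified_pos, angle_blur_at, trans_blur):
    """Fourth-embodiment sketch: blur correction unit 40 evaluates the angle
    blur at the position notified by CPU 21 and adds the (position-
    independent) translational blur, per axis, to get the blur to cancel.

    notified_pos: (x, y) on image plane 70, received from camera body 2A
    angle_blur_at: callable (x, y) -> (blur_x, blur_y); stand-in for the
                   angle blur calculation of unit 401
    trans_blur: (blur_x, blur_y) from translation blur calculation unit 402
    """
    ax, ay = angle_blur_at(*notified_pos)
    tx, ty = trans_blur
    return ax + tx, ay + ty  # signed addition per axis
```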
  • as described above, the blur correction apparatus includes: the camera body 2A, which has the image sensor 22 that captures the subject image formed on the image plane 70 by the interchangeable lens 3A and the CPU 21 that determines a position on the image plane 70 and transmits information on the determined position to the interchangeable lens 3A; and the interchangeable lens 3A, which has the blur correction optical system 33 for blur correction, the blur correction unit 40 that receives the position information from the camera body 2A and calculates an image blur Δy2 based on the position received from the camera body 2A and the shake detected by the shake sensor 39, and the blur correction drive mechanism 37 that moves the blur correction optical system 33 in a direction that suppresses the image blur Δy2.
  • image blur can be appropriately suppressed at a position other than the center of the image plane 70 determined by the CPU 21 of the camera body 2A.
  • the blur correction unit 40 of the interchangeable lens 3A calculates the image blur ⁇ y2 based on the output of the shake sensor 39 and the focal length f of the interchangeable lens 3A.
  • the image blur ⁇ y2 can be appropriately calculated at a position other than the center of the image plane 70, and the image blur can be appropriately suppressed based on the image blur ⁇ y2.
  • the camera 1A shown in FIG. 10 is used, as in the fourth embodiment.
  • the image blur correction according to the fifth embodiment is performed exclusively by operating the blur correction drive mechanism 37 of the interchangeable lens 3A.
  • however, the fifth embodiment differs from the fourth embodiment in that both the blur correction unit 21a of the CPU 21 of the camera body 2A and the blur correction unit 40 of the interchangeable lens 3A perform calculations.
  • the camera 1A may be a single-lens reflex type illustrated in FIG. 10 or a mirrorless type without the mirror 24.
  • the interchangeable lens 3A and the camera body 2A may be configured as a lens-integrated camera.
  • the CPU 21 of the camera body 2A determines a position where the image of the main subject is highly likely to exist on the image plane 70 by, for example, any one of methods (1) to (4) in the first embodiment. Then, the CPU 21 sets the center of the image plane 70 as the first position, and sets the position determined as described above as the second position.
  • the blur correction unit 21a of the CPU 21 calculates the image blur at the first position and the second position of the image plane 70.
  • the angle blur calculation unit 201 uses the detection signal about the axis parallel to the X axis (Pitch direction) from the angular velocity sensor of the shake sensor 31 to calculate the image blur in the Y-axis direction due to the rotational motion and, if necessary, the image blur in the X-axis direction.
  • the angle blur calculation unit 201 also uses the detection signal about the axis parallel to the Y axis (Yaw direction) from the angular velocity sensor of the shake sensor 31 to calculate the image blur in the X-axis direction due to the rotational motion and, if necessary, the image blur in the Y-axis direction.
  • the mathematical expression representing the image blur Δy1 in the Y-axis direction is the above formula (2), as described in the first embodiment.
  • the mathematical expression representing the image blur Δy2 in the Y-axis direction is the above formula (1), as described in the first embodiment.
  • the blur correction unit 21a of the CPU 21 further calculates the ratio g between the image blur Δy1 at the first position and the image blur Δy2 at the second position by the following equation (8). g = Δy2 / Δy1 … (8)
  • this g is referred to as the correction coefficient g.
  • the CPU 21 transmits information indicating the correction coefficient g to the blur correction unit 40 of the interchangeable lens 3A.
  • the CPU 21 may transmit information indicating the difference between Δy2 and Δy1 to the blur correction unit 40 of the interchangeable lens 3A instead of the information indicating the ratio between Δy2 and Δy1.
  • the timing at which the CPU 21 of the camera body 2A transmits the information indicating the correction coefficient g to the blur correction unit 40 is, for example, when the CPU 21 newly determines or updates the first position and the second position at which image blur is calculated on the image plane 70 and calculates the correction coefficient g.
  • the CPU 21 includes the information on the correction coefficient g in the steady communication between the camera body 2A and the interchangeable lens 3A, or in the communication instructing the start of image blur correction from the camera body 2A to the interchangeable lens 3A, so that the information on the correction coefficient g is promptly notified to the blur correction unit 40.
  • the angle blur calculation unit 401 of the blur correction unit 40 uses the detection signal about the axis parallel to the X axis (Pitch direction) from the angular velocity sensor 39a to calculate the image blur in the Y-axis direction due to the rotational motion and, if necessary, the image blur in the X-axis direction.
  • the angle blur calculation unit 401 also uses the detection signal about the axis parallel to the Y axis (Yaw direction) from the angular velocity sensor 39a to calculate the image blur in the X-axis direction due to the rotational motion and, if necessary, the image blur in the Y-axis direction.
  • the blur correction unit 40 in the fifth embodiment calculates the image blur at the same position as the first position defined by the CPU 21 of the camera body 2A, in this example the center of the image plane 70. Since the position at which the image blur is calculated is the center of the image plane 70, the mathematical expression representing the image blur Δy1 in the Y-axis direction is the above formula (2), as described in the first embodiment.
  • the angle blur calculation unit 401 multiplies the image blur Δy1 in the Y-axis direction by the correction coefficient g based on the information received from the camera body 2A by the receiving unit, thereby calculating the image blur Δy2 in the Y-axis direction at the second position of the image plane 70.
  • when information indicating the difference is received instead, the angle blur calculation unit 401 calculates the image blur Δy2 by adding the received difference to the image blur Δy1, as sketched below.
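  • The exchange just described can be sketched as two small functions, one for each side of the mount; whether a ratio or a difference is transmitted is reduced to a flag, and all names and values are illustrative assumptions.

```python
def body_side_info(dy1, dy2, use_ratio=True):
    """Camera body 2A side: compute the information transmitted to the lens,
    either the correction coefficient g of equation (8) or the difference."""
    return dy2 / dy1 if use_ratio else dy2 - dy1

def lens_side_apply(dy1_lens, received, use_ratio=True):
    """Interchangeable lens 3A side: recover the second-position blur Δy2
    from its own center-position blur Δy1 and the received information."""
    return dy1_lens * received if use_ratio else dy1_lens + received

# Example (values assumed): the body computes g from its own sensor, the lens
# applies g to the Δy1 it computed from shake sensor 39.
g = body_side_info(dy1=0.030, dy2=0.036)             # g = Δy2 / Δy1 ≈ 1.2
print(lens_side_apply(dy1_lens=0.030, received=g))   # ≈ 0.036
```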
  • the translation blur calculation unit 402 calculates the image blur in the X-axis direction due to the translational motion using the detection signal in the X-axis direction by the acceleration sensor 39b.
  • the translation blur calculation unit 402 calculates the image blur in the Y-axis direction due to the translational motion using the detection signal in the Y-axis direction from the acceleration sensor 39b.
  • the blur correction optical system target position calculation unit 403 calculates the image blur in the X-axis direction and the Y-axis direction by adding the image blur in the X-axis and Y-axis directions calculated by the angle blur calculation unit 401 and the image blur in the X-axis and Y-axis directions calculated by the translation blur calculation unit 402.
  • the blur correction optical system target position calculation unit 403 then calculates the image blur amount at the second position of the image plane 70, based on the image blur in the X-axis and Y-axis directions after the addition, the photographing magnification (calculated based on the position of the zoom optical system 31), and the distance from the camera 1A to the subject 80 (calculated based on the position of the focus optical system 32).
  • since the blur correction optical system target position calculation unit 403 performs image blur correction by operating the blur correction drive mechanism 37 of the interchangeable lens 3A, it calculates the target position of the blur correction optical system 33 for moving the blur correction optical system 33 in a direction that cancels the calculated image blur amount.
  • the shake correction optical system target position calculation unit 403 sends a signal indicating the target position to the shake correction drive mechanism 37 of the interchangeable lens 3A.
  • the image blur correction in the fifth embodiment includes correction in the Y-axis direction when the camera 1A rotates in the pitch direction and correction in the X-axis direction when the camera 1A rotates in the Yaw direction.
  • the above formula (1) and the above formula (2) represent correction in the Y-axis direction when the camera 1A rotates in the pitch direction.
  • when the camera 1A rotates in the Yaw direction, correction similar to that described above is necessary for the X-axis direction. Since the correction in the Y-axis direction when the camera 1A rotates in the Pitch direction and the correction in the X-axis direction when the camera 1A rotates in the Yaw direction are the same except for the direction, the description of the X-axis direction is omitted.
  • the image blur calculated by the translation blur calculation unit 402 is substantially constant even if the position on the image plane 70 (imaging plane of the image sensor 22) is different.
  • the outline of the fifth embodiment is as follows.
  • the angle blur calculation unit 201 of the blur correction unit 21a of the camera body 2A calculates the image blurs Δy1 and Δy2 at the first position (the center of the image plane 70) and the second position of the image plane 70.
  • the blur correction unit 21a calculates the correction coefficient g, which is the ratio between the image blur Δy1 at the first position and the image blur Δy2 at the second position, and transmits information indicating the correction coefficient g to the blur correction unit 40 of the interchangeable lens 3A.
  • the angle blur calculation unit 401 of the blur correction unit 40 of the interchangeable lens 3A calculates an image blur at the first position of the image plane 70 (the center of the image plane 70).
  • the angle blur calculation unit 401 further calculates the image blur at the second position of the image plane 70 by multiplying the image blur at the first position by the correction coefficient g based on the information received from the camera body 2A by the receiving unit.
  • the translation blur calculation unit 402 of the blur correction unit 40 calculates image blur at the first position, for example.
  • the blur correction optical system target position calculation unit 403 of the blur correction unit 40 adds the image blur at the second position and the image blur calculated by the translation blur calculation unit 402 for each of the X axis and the Y axis, attaching a positive or negative sign according to the direction along each axis. Then, the image blur amount at the second position of the image plane 70 is calculated based on the image blur in the X-axis direction and the Y-axis direction after the addition.
  • as described above, the blur correction apparatus includes: the camera body 2A, which has the image sensor 22 that captures the subject image formed on the image plane 70 by the interchangeable lens 3A, the CPU 21 that determines a position on the image plane 70, the blur correction unit 21a that calculates the image blur Δy1 at the first position predetermined on the image plane 70 (the center of the image plane 70) and the image blur Δy2 at the second position determined by the CPU 21, based on those positions and the shake detected by the shake sensor 31, and the CPU 21 that transmits information on the correction coefficient g, or on the difference between the image blur Δy1 and the image blur Δy2, to the interchangeable lens 3A; and the interchangeable lens 3A, which has the blur correction optical system 33 for blur correction, the blur correction unit 40 that calculates the image blur Δy1 at the first position (the center of the image plane 70) based on the first position and the shake detected by the shake sensor 39 and that receives the information from the camera body 2A, and the blur correction drive mechanism 37 that moves the blur correction optical system 33 in a direction that suppresses the image blur obtained by correcting the calculated image blur Δy1 with the received information.
  • the blur correction unit 40 of the interchangeable lens 3A can appropriately suppress the image blur at the second position determined by the CPU 21 of the camera body 2A, for example.
  • the blur correction unit 21a of the camera body 2A calculates the image blur Δy1 and the image blur Δy2 based on the output of the shake sensor 31 and the focal length f of the interchangeable lens 3A, and the blur correction unit 40 of the interchangeable lens 3A calculates the image blur Δy1 from the output of the shake sensor 39 and the focal length f. Accordingly, the blur correction unit 40 of the interchangeable lens 3A can appropriately calculate the image blur Δy2 at the second position other than the center of the image plane 70, and can appropriately suppress the image blur based on the image blur Δy2.
  • Image blur correction is performed by operating both the camera shake correction drive mechanism 26 of the camera body 2A and the camera shake correction drive mechanism 37 of the interchangeable lens 3A, which is the same as in the fourth modification of the third embodiment. Also, the point that both the blur correction unit 21a of the CPU 21 of the camera body 2A and the blur correction unit 40 of the interchangeable lens 3A perform the calculation is common to the fifth embodiment.
  • the CPU 21 of the camera body 2A transmits to the blur correction unit 40 of the interchangeable lens 3A (a) information on the first position at which image blur is calculated on the image plane 70, and (b) information indicating the sharing ratio between the image blur correction by the interchangeable lens 3A and the image blur correction by the camera body 2A.
  • the blur correction unit 40 of the interchangeable lens 3A calculates the image blur at the first position of the image plane 70, and obtains the image blur V (L) shared by the interchangeable lens 3A by the above equation (6).
  • the blur correction unit 21a of the camera body 2A calculates the image blur at the first position of the image plane 70 and the image blur at the second position of the image plane 70, and then obtains the image blur V(B) to be shared by the camera body 2A by the above equation (7).
  • the blur correction unit 40 of the interchangeable lens 3A calculates the target position of the blur correction optical system 33 based on the calculated image blur V(L) and the image blur calculated by the translation blur calculation unit 402, and thereby performs the image blur correction by operating the blur correction drive mechanism 37 of the interchangeable lens 3A.
  • the blur correction unit 21a of the camera body 2A calculates the target position of the image sensor 22 based on the calculated image blur V(B) and the image blur calculated by the translation blur calculation unit 202, and thereby performs the image blur correction by operating the blur correction drive mechanism 26 of the camera body 2A, as sketched below.
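  • The division of work just described mirrors formulas (6) and (7): the lens only needs the first position and the sharing ratio received from the body, while the body evaluates the blur at both positions from its own sensor. A minimal sketch, with all names assumed:

```python
def lens_share(dy2a_lens, lens_ratio):
    """Lens side: V(L) of formula (6), using the blur at the first position
    evaluated from shake sensor 39 and the ratio received from the body."""
    return lens_ratio * dy2a_lens

def body_share(dy2a_body, dy2b_body, lens_ratio):
    """Body side: V(B) of formula (7), using the blur at the first and second
    positions evaluated from shake sensor 31."""
    return (1.0 - lens_ratio) * dy2a_body + (dy2b_body - dy2a_body)
```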
  • in the embodiments and modifications described above, image blur is corrected at the position where correction is desired. For this reason, it may happen that image blur at the position determined by the CPU 21 on the image plane 70 is suppressed while image blur remains at other positions on the image plane 70. In such a case, image restoration by image processing may be combined.
  • the CPU 21 sends an instruction to the signal processing circuit 27 to execute image restoration processing that makes the remaining image blur inconspicuous, for example by strongly applying edge enhancement processing to the data corresponding to the other positions in the image data generated by the signal processing circuit 27, as sketched below.
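  • A toy sketch of this kind of position-selective sharpening is given below. The choice of a simple Laplacian-based enhancement, the circular region, and all parameters are assumptions for illustration; they are not taken from the embodiment, and a real signal processing circuit would use its own restoration filter.

```python
import numpy as np

def sharpen_residual_blur(image, corrected_center, radius, strength=0.8):
    """Strengthen edge enhancement only outside the circle around the position
    whose blur was optically corrected, leaving that region untouched.
    image: 2-D grayscale array with values in [0, 1] (assumed)."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = corrected_center
    outside = (yy - cy) ** 2 + (xx - cx) ** 2 > radius ** 2

    # 3x3 Laplacian-based edge enhancement (edge-padded borders).
    padded = np.pad(image, 1, mode="edge")
    lap = (4 * padded[1:-1, 1:-1]
           - padded[:-2, 1:-1] - padded[2:, 1:-1]
           - padded[1:-1, :-2] - padded[1:-1, 2:])
    enhanced = image + strength * lap

    out = image.copy()
    out[outside] = enhanced[outside]
    return np.clip(out, 0.0, 1.0)
```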

Abstract

The present invention relates to an interchangeable lens that can be attached to and detached from a camera body, the camera body being provided with an imaging element for capturing a subject image. The interchangeable lens is provided with: an imaging optical system that forms the subject image on an image surface; an input unit into which a blur amount detected by the interchangeable lens and/or the camera body is input; a receiving unit that receives information used to calculate an off-axis correction amount for correcting blur located off the optical axis on the image surface; and a drive unit that, on the basis of at least the information and the blur amount, drives a movable portion that is at least a part of the imaging optical system in a plane orthogonal to the optical axis.
PCT/JP2018/011477 2017-03-31 2018-03-22 Lentille interchangeable et corps de caméra WO2018180909A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-072590 2017-03-31
JP2017072590A JP2020095071A (ja) 2017-03-31 2017-03-31 交換レンズおよび撮像装置

Publications (1)

Publication Number Publication Date
WO2018180909A1 true WO2018180909A1 (fr) 2018-10-04

Family

ID=63677062

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/011477 WO2018180909A1 (fr) 2017-03-31 2018-03-22 Lentille interchangeable et corps de caméra

Country Status (2)

Country Link
JP (1) JP2020095071A (fr)
WO (1) WO2018180909A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7378613B2 (ja) 2020-05-29 2023-11-13 富士フイルム株式会社 機上現像型平版印刷版原版、平版印刷版の作製方法、及び平版印刷方法

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015233248A (ja) * 2014-06-10 2015-12-24 キヤノン株式会社 撮像装置及び撮像方法
JP2017044876A (ja) * 2015-08-27 2017-03-02 オリンパス株式会社 撮像装置及び像ブレ補正方法

Also Published As

Publication number Publication date
JP2020095071A (ja) 2020-06-18

Similar Documents

Publication Publication Date Title
WO2018180916A1 (fr) Dispositif de correction de flou, lentille de remplacement et dispositif d'imagerie
JP7484866B2 (ja) ブレ補正装置、交換レンズ、撮像装置及び像ブレ補正方法
US9602727B2 (en) Imaging apparatus and imaging method
WO2020088133A1 (fr) Procédé et appareil de traitement d'image, dispositif électronique et support de stockage lisible par ordinateur
JP6685843B2 (ja) 撮像装置
JP2020042078A (ja) 光学機器
WO2014156731A1 (fr) Dispositif de saisie d'images, élément transistorisé de saisie d'images, module de caméra, dispositif électronique et procédé de saisie d'images
WO2017120771A1 (fr) Procédé et appareil d'acquisition d'informations de profondeur, et dispositif de collecte d'image
US10616503B2 (en) Communication apparatus and optical device thereof
JP4832013B2 (ja) 像振れ補正装置
JP6543946B2 (ja) ブレ補正装置、カメラ及び電子機器
US20190222767A1 (en) Shake correction device, imaging apparatus, and shake correction method
JP2023041748A (ja) カメラ、レンズ装置、制御方法、およびコンピュータプログラム
JP2019145958A (ja) 撮像装置およびその制御方法ならびにプログラム
WO2019151030A1 (fr) Dispositif d'imagerie, élément d'imagerie à semi-conducteurs, module de caméra, unité de controle de commande et procédé d'imagerie
JP2019164338A (ja) カメラ、レンズ装置、制御方法、およびコンピュータプログラム
WO2018180909A1 (fr) Lentille interchangeable et corps de caméra
JP2017044876A (ja) 撮像装置及び像ブレ補正方法
WO2018180908A1 (fr) Dispositif de correction de flou, lentille de remplacement et dispositif d'imagerie
US20150294442A1 (en) Camera system and imaging method
JP2006259114A (ja) デジタルカメラ
US11770614B2 (en) Image processing device and method, and program
JP6943323B2 (ja) 交換レンズ
JP7210256B2 (ja) 撮像装置及び表示制御方法
JP2018197825A (ja) 制御装置及び方法、及び撮像装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18776246

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18776246

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP