JP6411829B2 - Imaging apparatus and image blur correction method - Google Patents

Imaging apparatus and image blur correction method Download PDF

Info

Publication number
JP6411829B2
JP6411829B2 JP2014189303A
Authority
JP
Japan
Prior art keywords
image
imaging
acceleration
speed
amount
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2014189303A
Other languages
Japanese (ja)
Other versions
JP2016061912A (en)
Inventor
仁司 土屋
Original Assignee
オリンパス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパス株式会社 filed Critical オリンパス株式会社
Priority to JP2014189303A priority Critical patent/JP6411829B2/en
Publication of JP2016061912A publication Critical patent/JP2016061912A/en
Application granted granted Critical
Publication of JP6411829B2 publication Critical patent/JP6411829B2/en

Description

  The present invention relates to an imaging apparatus including an image blur correction unit that corrects translational-direction image blur among the image blurs caused by camera shake and the like, and to an image blur correction method.

  Conventionally, imaging apparatuses such as digital cameras and video cameras have been widely put into practical use. In such apparatuses, an optical image formed by an imaging optical system is sequentially photoelectrically converted by a photoelectric conversion element such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor (hereinafter referred to as an image sensor), and the resulting image signal is stored in a storage medium as image data in a predetermined form (for example, digital image data representing a still image or a moving image). These apparatuses generally include an image display device, such as a liquid crystal display (LCD) or an organic electroluminescence (organic EL) display, that displays a still image or a moving image based on the digital image data, as well as image blur correction means for correcting image blur caused by camera shake and the like.

  In image pickup apparatuses put into practical use in recent years, the performance of camera shake correction means has improved remarkably. Even when the apparatus is held in the hand during an imaging operation, an environment is being established in which anyone can easily take a clear image with little image blur, without worrying about camera shake.

  An image blur correction unit in a conventional imaging apparatus generally detects the rotational motion of the apparatus main body and corrects the resulting image blur on the imaging surface, so-called angular blur. It is known that the degree of image blur caused by angular blur depends strongly on the focal length of the imaging optical system. In general, deterioration of image quality due to image blur starts to become noticeable at shutter speeds slower than 1/(focal length) seconds, where the focal length is converted to the angle of view of an imaging apparatus using 35 mm film. This shutter speed (1/focal length seconds) is called the camera-shake limit shutter speed.
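The camera-shake limit above is simple arithmetic; the following sketch (with a hypothetical helper name, not part of the patent) illustrates the rule of thumb:

```python
# Camera-shake limit: blur tends to become noticeable at shutter speeds
# slower than 1/(35 mm equivalent focal length) seconds.

def shake_limit_seconds(focal_length_35mm: float) -> float:
    """Return the camera-shake limit shutter speed in seconds."""
    return 1.0 / focal_length_35mm

# With a 200 mm (35 mm equivalent) lens, handheld blur tends to show
# at shutter speeds slower than 1/200 s.
print(shake_limit_seconds(200.0))  # 0.005
```

The "about four steps" improvement mentioned below would extend this limit by a factor of 2^4 = 16 in exposure time.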

  In recent years, with the improved performance of the angular velocity sensors that detect rotational motion, angular blur can be corrected to a level that poses almost no practical problem. Even when the shutter speed is lowered (for example, by about four exposure steps), an imaging result with little image blur can be obtained.

  On the other hand, in addition to angular blur due to rotational motion, the image blur that occurs in an imaging apparatus includes so-called translational blur, which occurs when the apparatus moves parallel to the imaging surface. Translational blur does not depend on the focal length of the imaging optical system, but it tends to increase as the imaging magnification increases. For this reason, even an imaging apparatus with excellent image blur correction performance is prone to image blurring when, for example, a close-up imaging operation is performed with the apparatus held in the hand.
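Why translational blur matters mainly at high magnification can be sketched with a first-order approximation (function and variable names are illustrative, not from the patent): the image-plane shift is roughly the camera's parallel shift multiplied by the imaging magnification.

```python
# First-order model: a parallel camera shift of d produces an image-plane
# blur of roughly beta * d, where beta is the imaging magnification.

def translational_blur_mm(camera_shift_mm: float, magnification: float) -> float:
    """Approximate image-plane blur caused by a parallel shift of the camera."""
    return camera_shift_mm * magnification

# The same 0.1 mm hand shift is negligible at beta = 0.01 (distant subject)
# but significant at beta = 1.0 (life-size macro).
print(translational_blur_mm(0.1, 0.01))  # ~0.001 mm
print(translational_blur_mm(0.1, 1.0))   # ~0.1 mm, easily visible on the sensor
```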

  In view of this, various proposals for suppressing translational blur in conventional imaging apparatuses have been made, for example, in Japanese Patent Laid-Open No. 7-225405.

  The imaging device disclosed in Japanese Patent Application Laid-Open No. 7-225405 includes an acceleration sensor and an angular velocity sensor that detect the triaxial acceleration and angular velocity acting on the camera, and detects the amount of movement in the translational direction from the outputs of these sensors.

JP 7-225405 A

  However, the means disclosed in Japanese Patent Application Laid-Open No. 7-225405 must calculate the gravitational acceleration component with attitude detecting means and subtract it from the acceleration detected by the acceleration sensor. It is technically difficult to accurately obtain the initial attitude, and there is a problem that the calculation load increases in order to follow attitude changes.

  Furthermore, the amount of movement is obtained by integrating the gravity-subtracted acceleration twice. If there is an error in the acceleration after gravity subtraction, the error is amplified by the integration, and there is a problem that the integrated value diverges and causes erroneous correction.
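The divergence described above can be demonstrated numerically: a constant residual bias in the acceleration produces a velocity error that grows linearly with time and a position error that grows quadratically. This is a minimal sketch (all names hypothetical), not the patent's own computation:

```python
# Double-integrating acceleration with a small constant bias: the position
# error grows as bias * t^2 / 2, so even a tiny sensor error diverges.

def integrate_position(accels, dt):
    """Naively double-integrate acceleration samples into a position."""
    velocity, position = 0.0, 0.0
    for a in accels:
        velocity += a * dt
        position += velocity * dt
    return position

dt = 0.001                          # 1 ms sampling period
bias = 0.01                         # constant 0.01 m/s^2 residual error
drift = integrate_position([bias] * 1000, dt)  # 1 second of samples
print(drift)  # roughly bias * t^2 / 2, i.e. about 0.005 m after only 1 s
```

This is why the invention periodically re-anchors the speed and the sensor's reference value instead of trusting open-loop integration.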

  To solve such problems, a countermeasure such as periodically correcting the speed obtained by integrating the acceleration to the correct speed is conceivable, but no such measure is disclosed in the above publication.

  In general, the output of an acceleration sensor includes a gravitational acceleration component, which can cause a detection error.

  The present invention has been made in view of the above points. Its object is to provide an imaging apparatus and an image blur correction method including an image blur correction unit that, by periodically updating the detection reference value of the acceleration sensor and the moving speed, can prevent erroneous correction and perform correction with relatively high accuracy, without requiring complicated processing such as gravity correction.

In order to achieve the above object, an imaging apparatus according to one aspect of the present invention includes an image blur correction unit that corrects image blur caused by a change in posture, and comprises: an imaging unit that captures a subject image at a predetermined imaging frame rate and obtains image data for each frame; a first image movement amount calculation unit that calculates a first movement amount of the subject image based on the correlation between a plurality of consecutive image data captured frame by frame; an acceleration sensor that detects acceleration accompanying a change in the posture of the imaging apparatus; a detection reference value calculation unit that calculates a detection reference value of the acceleration sensor based on the acceleration value detected by the acceleration sensor and the value of the first movement amount calculated by the first image movement amount calculation unit; an acceleration correction unit that corrects the acceleration value detected by the acceleration sensor based on the detection reference value and calculates a corrected acceleration value; and a second image movement amount calculation unit that, based on the corrected acceleration value calculated by the acceleration correction unit, calculates a second movement amount for correcting the translational image blur amount generated on the imaging surface when the subject image moves in the translational direction relative to the imaging device. The detection reference value calculation unit further includes: an integration unit that sequentially accumulates the corrected acceleration values; a first speed change amount calculation unit that calculates a first image moving speed from the detection result of the first movement amount by the first image movement amount calculation unit, and calculates a first speed change amount from the difference between the first image moving speed detected at the previous frame and the first image moving speed detected at the current frame; and a second speed change amount calculation unit that calculates a second speed change amount from the integrated value of the detection results of the acceleration sensor in a period corresponding to the period from the exposure center of one frame of the first image movement amount calculation unit to the exposure center of the next frame. The detection reference value calculation unit calculates the detection reference value of the acceleration sensor based on the first speed change amount and the second speed change amount.
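The core idea of the claim above is that the image-based speed change (first speed change amount) serves as a ground truth against which the acceleration-based speed change (second speed change amount) is compared, and the discrepancy is attributed to the sensor's zero-point offset. The following is a one-axis sketch under simplifying assumptions (image speeds already converted to the sensor's units; all names hypothetical), not the patent's actual implementation:

```python
# Sketch of the detection-reference-value (sensor zero-point) update:
# whatever speed change the integrated acceleration reports but the image
# motion does not confirm is treated as bias and folded into the reference.

def update_reference_value(prev_image_speed, curr_image_speed,
                           accel_samples, dt, ref_value):
    """Re-estimate the acceleration sensor's detection reference value."""
    # First speed change amount: from image correlation between frames.
    first_change = curr_image_speed - prev_image_speed
    # Second speed change amount: integral of (bias-corrected) acceleration
    # over the exposure-center-to-exposure-center period.
    second_change = sum((a - ref_value) * dt for a in accel_samples)
    period = dt * len(accel_samples)
    # Attribute the residual speed discrepancy to the reference value.
    return ref_value + (second_change - first_change) / period

# Stationary camera: the image sees no speed change, but the sensor reads
# a constant 0.02 offset. The estimate converges to that true bias.
ref = 0.0
for _ in range(5):
    ref = update_reference_value(0.0, 0.0, [0.02] * 10, 0.01, ref)
print(ref)  # converges to the true bias, 0.02
```

With the reference value tracked this way, the double integration that follows operates on bias-corrected acceleration and no longer diverges.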

An image blur correction method according to an aspect of the present invention is a method for correcting image blur caused by a change in the posture of an imaging apparatus, and comprises: an imaging step of capturing a subject image at a predetermined imaging frame rate and acquiring image data for each frame; a step of detecting a first movement amount of the subject image based on the correlation between a plurality of consecutive image data captured frame by frame; a step of detecting, with an acceleration sensor, an acceleration value caused by the posture change of the imaging apparatus; a step of calculating a detection reference value of the acceleration sensor based on the first movement amount and the detected acceleration; a step of correcting the acceleration value detected by the acceleration sensor based on the detection reference value to obtain a corrected acceleration value; and a step of calculating, based on the corrected acceleration value, a second movement amount for correcting the translational image blur amount generated on the imaging surface by movement in the translational direction. The step of calculating the detection reference value further comprises: sequentially accumulating the corrected acceleration values; calculating a first image moving speed from the detection result of the first movement amount obtained in the step of detecting the first movement amount, and calculating a first speed change amount from the difference between the first image moving speed detected at the previous frame and that detected at the current frame; and calculating a second speed change amount from the integrated value of the detection results of the acceleration sensor during the period from the exposure center of one frame to the exposure center of the next frame. The detection reference value of the acceleration sensor is then calculated based on the first speed change amount and the second speed change amount.

  According to the present invention, it is possible to provide an imaging apparatus and an image blur correction method including an image blur correction unit that does not require complicated processing such as gravity correction and that, by periodically updating the detection reference value of the acceleration sensor and the moving speed, can prevent erroneous correction and perform correction with relatively high accuracy.

FIG. 1 is an external perspective view of an imaging apparatus including an image blur correction unit according to the first embodiment of the present invention.
FIG. 2 is a block configuration diagram showing an outline of the internal configuration of the imaging apparatus of FIG. 1.
Display example of a finder image when performing an imaging operation using the imaging apparatus of the embodiment.
Display example of a finder image (enlarged image of the focus area) when performing an imaging operation using the imaging apparatus of the embodiment.
Conceptual diagram of consecutive image frames (Fn to Fn+3) in the live view display image in the imaging apparatus of the embodiment.
Time chart showing the temporal relationship between the exposure time of the image sensor, the acquisition timing of the live view image, the detected acceleration of the acceleration sensor, and the like, in the imaging apparatus of the embodiment.
Table showing the relationship between the acceleration detected by the acceleration sensor and time in the imaging apparatus of the embodiment.
Block configuration diagram showing the functions of the main signal processing part in the imaging apparatus of the embodiment.
Flowchart showing the flow of control of the system controller in the imaging apparatus of the embodiment.
Flowchart showing the flow of control of the blur correction microcomputer during imaging operation standby in the imaging apparatus of the embodiment.
Flowchart showing the flow of control of the blur correction microcomputer during the imaging operation in the imaging apparatus of the embodiment.
Timing chart showing the control timing during a still image capturing operation using the imaging apparatus of the embodiment.
Block configuration diagram showing the functions of the main signal processing part in the imaging apparatus of the second embodiment of the present invention.
Block configuration diagram showing the functions of the main signal processing part in the imaging apparatus of the third embodiment of the present invention.
Flowchart showing the control processing sequence of the system controller in the imaging apparatus of the third embodiment of the present invention.

  The present invention will be described below with reference to the illustrated embodiments. The drawings used in the following description are schematic; in order to show each component at a recognizable size, the dimensional relationships and scales may vary from component to component. Therefore, the present invention is not limited to the illustrated embodiments with respect to the number of components, the shapes of the components, the size ratios of the components, the relative positional relationships of the components, and the like described in the drawings.

  Each embodiment of the present invention exemplifies an imaging apparatus such as a digital camera or a video camera in which an optical image formed by an imaging optical system is sequentially photoelectrically converted by a photoelectric conversion element such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor (hereinafter referred to as an imaging device), the resulting image signals are stored in a storage medium as image data in a predetermined form (for example, digital image data representing a still image or a moving image), and a still image or moving image is reproduced, based on the digital image data stored in the storage medium, on an image display device such as a liquid crystal display (LCD) or an organic electroluminescence (organic EL: OEL) display.

[First Embodiment]
FIG. 1 is an external perspective view of an imaging apparatus including an image blur correction unit according to the first embodiment of the present invention, and is a diagram illustrating a coordinate system and a blur direction set in the imaging apparatus.

  First, the movement at the time of posture change caused by camera shake or the like in the imaging apparatus of the present embodiment will be described below.

  As shown in FIG. 1, the imaging apparatus exemplified in the present embodiment is mainly configured by a camera body 1 and a lens barrel 2 having an imaging optical system; it is a so-called interchangeable-lens imaging apparatus in which the lens barrel 2 is detachable from the camera body 1. The imaging apparatus to which the present invention can be applied is not limited to this type, and may be, for example, an imaging apparatus whose lens barrel is fixed to the camera body.

  A lens barrel 2 having an imaging optical system that receives light from a subject and forms a subject image is mounted on the front surface of the camera body 1. A direction along the optical axis O of the imaging optical system in the lens barrel 2 is defined as a Z axis. Here, the positive direction of the Z-axis is the direction from the camera body 1 to the subject (see arrow Z in FIG. 1). The negative direction of the Z axis is the reverse direction of the positive direction.

  In the standard posture of the camera body 1 (the so-called horizontal position; the state shown in FIG. 1), the horizontal direction of the camera body 1 is taken as the X axis. Here, the positive direction of the X axis is the right-hand direction as seen from the subject side toward the camera body 1 (that is, the left-hand direction when the camera body 1 is viewed from the user; see arrow X in FIG. 1). The negative direction of the X axis is the reverse of the positive direction.

  Further, in the standard posture of the camera body 1, the vertical direction of the camera body 1 is taken as the Y axis. Here, the positive direction of the Y axis is the upward direction in the standard posture (see arrow Y in FIG. 1). The negative direction of the Y axis is the reverse direction of the positive direction.

  In FIG. 1, the origin position of the coordinate system is shifted to prevent the XYZ coordinate system from being overlapped with the camera body 1 and becoming difficult to see. However, the origin of this coordinate system is actually at the center of the imaging surface of the imaging device 5 (described later; see FIG. 2), and generally the imaging surface and the optical axis O of the imaging optical system intersect. It is a point to do. This coordinate system is a coordinate system fixed to the camera body 1, and if the camera body 1 moves or rotates, the coordinate system also moves or rotates with respect to the earth. In this coordinate system, the XY plane is a surface that coincides with the imaging surface.

  In such a coordinate system, the rotational motion around the Z axis is called a roll, the rotational motion around the X axis is called a pitch, and the rotational motion around the Y axis is called a yaw.

  Furthermore, in the following description, the left rotation around the Z axis when the Z-axis positive direction is viewed from the origin is referred to as roll positive-direction rotation, the left rotation around the X axis when the X-axis positive direction is viewed from the origin is referred to as pitch positive-direction rotation, and the right rotation around the Y axis when the Y-axis positive direction is viewed from the origin is referred to as yaw positive-direction rotation.

  Note that the positive/negative directions of the coordinate axes and of the rotation directions described above are defined for convenience, depending on the mounting directions of an angular velocity sensor 9 and an acceleration sensor 10 (see FIG. 2 and the like) described later, and are not theoretically limited to the above.

  In the coordinate system described above, when the rotation center is at the origin (or within the camera body 1 including the origin), the rotation mainly causes angular blur; when the rotation center is outside the camera body 1, the rotation causes translational blur in addition to angular blur. Therefore, translational blur that requires correction can be considered to occur substantially when the rotation center is outside the camera body 1.

  Angular blur can be regarded as rotational movement around the origin. That is, it is well known that rotational movement in the yaw direction swings the optical axis O left and right, so that the subject image formed on the image sensor 5 moves left and right, and that rotational movement in the pitch direction swings the optical axis O up and down, so that the subject image formed on the image sensor 5 moves up and down. It is also well known that rotational movement in the roll direction rotates the subject image on the screen between the horizontal/vertical orientation and oblique orientations in between.

  Next, an outline of the internal configuration of the imaging apparatus of the present embodiment will be described below. FIG. 2 is a block configuration diagram illustrating an outline of an internal configuration of the imaging apparatus according to the present embodiment.

  The imaging apparatus of the present embodiment is mainly configured by a camera body 1 and a lens barrel 2. Among these, the lens barrel 2 mainly includes an imaging optical system 2a and an encoder 3 for detecting the position of the imaging optical system 2a on the optical axis O.

  The lens barrel 2 includes an imaging optical system composed of a plurality of optical lenses and the like that transmits light from the subject and forms a subject image on the imaging surface of the imaging device 5, a plurality of lens holding members that hold the optical lenses constituting the imaging optical system, and a driving member that moves predetermined lens holding members among them forward and backward along the optical axis O at predetermined timing to perform a focus adjustment operation (focusing) and a zooming operation. The lens barrel 2 also includes various other components, but since these are not directly related to the present invention, their illustration and description are omitted.

  Inside the camera body 1 are a shutter mechanism 4, an image sensor 5, a drive unit 6, a system controller 7, a shake correction microcomputer 8, an angular velocity sensor 9, an acceleration sensor 10, and a release SW (switch) 11. An electronic viewfinder (EVF) 12, a memory card 13 and the like are disposed.

  The shutter mechanism 4 is, for example, a component that is disposed on the front side (the imaging optical system 2a side) of the imaging element 5 and controls the exposure time by performing an opening / closing operation. When the shutter mechanism 4 is in an open state, the imaging surface of the image sensor 5 is exposed, and when the shutter mechanism 4 is in a closed state, the imaging surface of the image sensor 5 is light-shielded. As the shutter mechanism 4 in the imaging apparatus of the present embodiment, it is assumed that, for example, a focal plane type shutter mechanism is applied. However, other types of shutter mechanism 4 can be applied. For example, a lens shutter mechanism disposed inside the lens barrel 2 may be applied.

  The imaging element 5 is a photoelectric conversion element that converts a subject image formed on the imaging surface into an electrical signal based on a control instruction from the system controller 7. The converted electrical signal is output to the system controller 7, and predetermined signal processing is performed in the system controller 7 to generate a predetermined form of image signal. The imaging element 5 is supported by a drive unit 6 described later so as to be movable in a two-dimensional direction parallel to the imaging surface.

  In the present embodiment, the imaging element 5 is described as a photoelectric conversion element as above. In practice, however, the imaging element 5 is assumed to be a component unit that includes not only the photoelectric conversion element itself but also a plurality of electrical components constituting an electronic circuit for driving the photoelectric conversion element and handling the image signal output from it, and an electrical board on which these components are mounted. In brief, the imaging element 5 is a structural unit that functions as an imaging unit that captures a subject image at a predetermined imaging frame rate and acquires image data for each frame.

  As described above, the drive unit 6 movably supports the image sensor 5, and is a drive unit that drives the image sensor 5 to move in the X direction and the Y direction shown in FIG. 1 based on a control instruction from the blur correction microcomputer 8 described later.

  The system controller 7 is a control unit that receives an input signal from the image sensor 5 and generates a predetermined form of an image signal, and integrally performs various controls related to the overall functions of the image pickup apparatus. As will be described below, the system controller 7 controls the shake correction microcomputer 8 to perform shake detection and performs control to perform shake correction based on the shake detection result.

  The angular velocity sensor 9 is a sensor configured as an angular velocity detection unit that detects a rotational motion, detects an angular change per unit time as an angular velocity, and outputs it to the shake correction microcomputer 8. The angular velocity sensor 9 detects the angular velocities of the yaw rotational motion around the Y axis, the pitch rotational motion around the X axis, and the roll rotational motion around the Z axis.

  The acceleration sensor 10 is an acceleration detection unit that detects at least acceleration in the X-axis direction (X acceleration) and acceleration in the Y-axis direction (Y acceleration). Furthermore, the acceleration sensor 10 in the present embodiment can also detect acceleration in the Z-axis direction (Z acceleration). Then, the acceleration sensor 10 outputs the detected acceleration in each direction to the shake correction microcomputer 8.

  The angular velocity sensor 9 and the acceleration sensor 10 described above perform detection in time series at their respective predetermined time intervals and sequentially output the detection results to the shake correction microcomputer 8.

Based on the control instruction of the system controller 7, the shake correction microcomputer 8 receives the output of the angular velocity sensor 9 and the output of the acceleration sensor 10 and calculates the shake amount of the camera body 1. In addition, the blur correction microcomputer 8 outputs a control instruction for driving the image sensor 5 by the detected blur amount to the drive unit 6 in a direction opposite to the detected blur direction. In response to this, the driving unit 6 drives the image sensor 5 in order to cancel image blur on the imaging surface. As a result, it is possible to correct the image blur that occurs in the captured image obtained by the image sensor 5.

  Here, image blur correction is performed by driving the imaging element 5; instead of this, or in addition to this, image blur correction may be performed by driving a part of the optical lenses of the imaging optical system 2a in a direction perpendicular to the optical axis O or in a direction inclined with respect to the optical axis O.

  Note that the shake amount detection device is constituted by components including the shake correction microcomputer 8, the angular velocity sensor 9, the acceleration sensor 10, and the like. An image blur correction device is configured including the blur amount detection device and the drive unit 6. The image blur correction device is a structural unit that functions as an image blur correction unit for correcting image blur caused by a change in the attitude of the imaging device.

  The release SW (switch) 11 is a switch member that is interlocked with a predetermined operation member (not shown) and is connected to the system controller 7, to which it transmits a predetermined instruction signal, that is, a release instruction signal. More specifically, the release SW 11 is an imaging operation input unit that gives, for example, a still image exposure start instruction to the imaging unit.

  Specifically, a two-stage press switch, for example, is applied to the release SW 11. A first release signal generated by the first-stage pressing operation (half-press, or 1st release operation) causes the automatic focus adjustment operation (AF operation) and the automatic exposure operation (AE operation) to be executed, and a second release signal generated by the second-stage pressing operation (full press, or 2nd release operation) causes the exposure operation, including driving of the shutter mechanism, to be executed.

  The electronic viewfinder (EVF) 12 is a display unit including a display device such as a liquid crystal panel. It is a component that receives the image signal acquired by the image sensor 5 and generated by the system controller 7 or the like, converts it into displayable image data, and displays it so that the user can visually recognize it.

  The memory card 13 is a constituent unit including, for example, a nonvolatile storage medium configured to be detachable from the camera body 1 and a corresponding card drive mechanism. The card drive mechanism is a component that receives, for example, the image signal acquired by the imaging device 5 and generated by the system controller 7 or the like, converts it into a recordable data file, and reads and writes the data file thus generated from and to the storage medium.

  Although not shown in the figure, the camera body 1 also includes a nonvolatile storage medium, separate from the memory card 13, in which a control program executed by the system controller 7, various parameters used for various controls, and the like are stored in advance.

  In addition to the above-described members, the imaging apparatus includes various other components. Since the components whose illustration and description are omitted here are not directly related to the present invention, they are assumed to be equivalent to those of a conventional imaging apparatus.

  A schematic flow of the operation of the imaging apparatus of the present embodiment configured as described above will be briefly described below with reference to FIGS.

  The light flux from the subject passes through the imaging optical system 2a and forms an image of the subject on the imaging surface of the imaging element 5.

  The shutter mechanism 4 switches between an exposure state and a light shielding state of the image sensor 5 by performing an opening / closing operation based on a control instruction from the system controller 7.

  When the user pushes down a predetermined operation member (shutter release button; not shown), a corresponding instruction signal (release signal) is generated by the release SW 11 linked to it. In response to the release signal, the system controller 7 controls the drive of the shutter mechanism 4, based on a shutter speed value set in advance or set by the AE operation executed in response to the first release signal, so that the imaging surface of the imaging device 5 is held in the exposure (open) state for a predetermined time.

  When the imaging surface is in the exposure state, the imaging element 5 performs photoelectric conversion of the subject image formed on the imaging surface. The system controller 7 reads out the electric charge acquired by the image sensor 5 as an image signal, performs various signal processing on the read image signal, and then either outputs it to the EVF 12 for display as a live view image or outputs it to the memory card 13 for recording as image data.

  The angular velocity sensor 9 detects the angular velocity of rotational motion around each axis and notifies the blur correction microcomputer 8 of the detected angular velocity. Here, in the plane orthogonal to the optical axis O of the imaging optical system 2a, the angular velocity sensor 9 detects the angular velocity of rotational motion about the horizontal axis of the imaging apparatus, that is, the pitch direction, and similarly about the vertical axis, that is, the yaw direction.

  The shake correction microcomputer 8 calculates an angle change based on the acquired angular velocity value, calculates the angle shake amount on the imaging surface of the image sensor 5 based on the angle change amount and the focal length information of the imaging optical system 2a, and executes drive control of the drive unit 6 in a direction that cancels the image blur thus calculated. By this action, an image with little image blur can be acquired.

  The acceleration sensor 10 detects the acceleration accompanying parallel movement in the X direction and the Y direction that occurs in the camera body 1 due to posture changes of the imaging apparatus. Based on the detected acceleration, the blur correction microcomputer 8 calculates the amount of movement in the translation direction. Further, the blur correction microcomputer 8 converts this into a translational blur amount on the imaging surface of the image sensor 5 and, based on these data together with the angular blur amount, executes drive control of the drive unit 6 in a direction that cancels the image blur at that time.

  Here, the basic concept of the present invention will be described below with reference to FIGS. 3 and 4 show display examples of a finder image when performing an imaging operation using the imaging apparatus of the present embodiment.

  Here, FIG. 3 is an example of a finder image displayed on the EVF 12 when performing a proximity imaging operation using, for example, a flower as a subject. In FIG. 3, the rectangular frame displayed at the approximate center of the screen indicates the region targeted by the focus adjustment operation, that is, the so-called focus area.

  FIG. 4 shows an enlarged image of the focus area shown in FIG. 3. In this state, image blurring in the shift direction (translation direction) is likely to occur in the screen. To represent this situation, the subject image is drawn with both a solid line and a dotted line. That is, when image blurring occurs under such a proximity imaging situation, the display position of the subject image shifts between the previous display frame and the current display frame of the live view display image. In FIG. 4, if the image indicated by the broken line is the subject image of the previous display frame and the image indicated by the solid line is the subject image of the current display frame, the movement amount between the frames is that indicated by the arrow in FIG. 4.

  Assuming that angle blur correction has been performed on the live view display image, the angle blur is almost fully corrected, so the amount of movement detected here is considered to be either the amount of shake accompanying translational movement of the imaging apparatus or the amount of shake accompanying movement on the subject side. If the subject is a completely stationary object, the image movement speed between frames can be calculated from the detected movement amount, and the translational movement speed of the imaging apparatus can be obtained using the focal length information of the imaging optical system 2a and the image magnification information obtained from the focus position.

  FIG. 5 is a conceptual diagram of continuous image frames (Fn to Fn + 3) in the live view display image in the imaging apparatus.

  In FIG. 5, the average moving speed Vn between frames is obtained from the amount of image shift between the preceding frame Fn and the immediately following succeeding frame Fn + 1. Similarly, the average moving speed Vn + 1 between the frames is obtained from the image shift amount between the subsequent frame Fn + 1 and the subsequent frame Fn + 2 immediately after that. Hereinafter, the average moving speed between subsequent frames can be obtained in exactly the same manner.

  Then, the speed change between the frames is obtained from the preceding average moving speed Vn and the subsequent average moving speed Vn + 1 obtained as described above, and the average acceleration an between the frames can be calculated. Similarly, a change in speed between the frames is obtained from the subsequent average moving speed Vn + 1 and the subsequent average moving speed Vn + 2 immediately thereafter, and the average acceleration an + 1 between the frames can be calculated.
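As a rough sketch of the computation described above (function and variable names are hypothetical, not from the patent; a fixed frame period and a single-axis image shift in pixels are assumed), the average moving speeds Vn and the average accelerations an can be derived from successive image shift amounts:

```python
def interframe_speeds_and_accels(shifts_px, frame_period_s, pixel_pitch_mm):
    """From per-frame image shift amounts (pixels), compute the average
    image-plane moving speed Vn between each frame pair (mm/s) and the
    average acceleration an between successive speed values (mm/s^2)."""
    # Average moving speed Vn between frame Fn and Fn+1.
    speeds = [s * pixel_pitch_mm / frame_period_s for s in shifts_px]
    # Average acceleration an from the speed change between frame pairs.
    accels = [(v1 - v0) / frame_period_s for v0, v1 in zip(speeds, speeds[1:])]
    return speeds, accels

# Example: 60 fps, 3 um pixel pitch, image shifts of 2, 3, and 5 pixels.
speeds, accels = interframe_speeds_and_accels([2, 3, 5], 1 / 60, 0.003)
```

In practice the shift would be measured per axis (X and Y) on the imaging surface, but the per-axis arithmetic is identical.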

  FIG. 6 is a time chart showing the time relationship between the exposure time of the image sensor, the acquisition timing of the live view image, the detected acceleration of the acceleration sensor, and the like in the image sensor of this embodiment.

  FIG. 6A shows a vertical synchronization signal VD that synchronizes the readout timing of the image sensor 5. The vertical synchronization signal VD indicates the start point of each frame.

  FIG. 6B shows the exposure period of each line in the image sensor 5. The image sensor 5 is read out to the system controller 7 line by line by, for example, a rolling shutter system, so there is a time difference in the readout time between lines. For this reason, the exposure period for one frame is depicted as a substantially diamond shape, as shown in the figure.

  FIG. 6C shows the image-plane moving speed, which is obtained for each frame by comparing the image of the latest frame with that of the preceding frame and calculating the amount of image movement between them.

  In FIG. 6, (D) shows the analog signal (detection signal) of the acceleration detected by the acceleration sensor 10. This acceleration signal is input to the shake correction microcomputer 8, which converts it into a digital signal (AD conversion processing).

  In FIG. 6, (E) shows a change in the main body speed obtained by integrating the acceleration, and shows an integrated value for each predetermined period. In FIG. 6, integration from the exposure center to the exposure center of each successive frame is performed.

  The basic idea of the present invention is that the speed change obtained from the image-plane moving speed between frames and the speed change obtained by integrating the acceleration detected by the acceleration sensor 10 should match; the detection reference value of the acceleration sensor 10 is therefore adjusted so that the two match.

  FIG. 7 is a table showing the relationship between the detected acceleration of the acceleration sensor 10 and time in the imaging apparatus of the present embodiment.

  In FIG. 7, the average acceleration between frames calculated from the live view image is indicated by a broken line, and the average value of acceleration during the integration period (t1 to t2) is indicated by a solid line. The difference between the two is shown as a sensor offset generated in the detection result of the acceleration sensor.

  After the integration period t2, it is possible to detect an acceleration with little error by subtracting the detected offset. Although the offset detected here is caused by various factors such as temperature and posture change, there is an advantage that there is no fear of accumulation of errors because the reference value is corrected every period.
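A minimal numeric sketch of this offset correction (names are hypothetical, not from the patent): the mean of the raw sensor samples over the integration period is compared with the image-derived average acceleration, and the difference is treated as the sensor offset to subtract from subsequent readings.

```python
def estimate_offset(sensor_samples, image_avg_accel):
    """Sensor offset = mean of the raw acceleration samples over the
    integration period minus the image-derived average acceleration."""
    sensor_avg = sum(sensor_samples) / len(sensor_samples)
    return sensor_avg - image_avg_accel

def correct(sample, offset):
    # After the integration period, each raw sample is corrected by
    # subtracting the detected offset.
    return sample - offset

# True acceleration is about 0.5 m/s^2, but the sensor reads ~0.2 high
# (gravity component, temperature drift, etc.).
raw = [0.72, 0.68, 0.71, 0.69]
offset = estimate_offset(raw, image_avg_accel=0.5)
corrected = correct(raw[0], offset)
```

Because the offset is re-estimated every integration period, a slowly drifting bias is tracked rather than accumulated, which is the advantage noted above.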

  FIG. 8 is a block configuration diagram showing functions of a main part of signal processing in the imaging apparatus of the present embodiment.

  As shown in FIG. 8, the system controller 7 includes a movement vector extraction unit 71, an interframe parallel movement speed calculation unit 72, and an interframe speed change calculation unit 73. In addition, the blur correction microcomputer 8 includes an offset subtractor 81, a first integrator 82, a second integrator 83, an acceleration offset calculator 84, a speed calculator 85, a speed change adder 86, a third integrator 88, and an image magnification multiplier 89.

  Here, the image signal acquired by the image sensor 5 is input to the movement vector extraction unit 71 of the system controller 7. Then, the movement vector extraction unit 71 detects the amount of image movement between frames in the input image signal, and executes processing for extracting a movement vector based on the amount of image movement.

In this case, the frames to be processed may be two consecutive frames, or two frames extracted by thinning out a predetermined number of frames. Normally, the translational component of image blur is dominated by low-frequency motion, so processing thinned-out frames causes no problem.

  Subsequently, the inter-frame parallel movement speed calculation unit 72 separates the movement vector extracted by the movement vector extraction unit 71 into components in the X direction and the Y direction on the imaging surface of the image sensor 5, and calculates the corresponding moving speeds. Here, the image-plane moving speed between frames can be calculated from the amount of image movement on the imaging surface and the time between the compared frames. Further, the inter-frame speed change calculation unit 73 calculates the speed change between frames from the continuously calculated inter-frame parallel movement speed data, and converts it into an average acceleration.

  In other words, the movement vector extraction unit 71, the inter-frame parallel movement speed calculation unit 72, and the inter-frame speed change calculation unit 73 in the system controller 7 together function as a first image movement amount calculation unit that calculates a first movement amount of the subject image based on the correlation between a plurality of consecutive frames of image data captured frame by frame.

  The average acceleration obtained by the inter-frame speed change calculation unit 73 is a value on the image plane, but it can be converted into the average acceleration of the imaging apparatus by dividing it by the image magnification.

  Specifically, for example, when the image movement amount on the imaging surface is 10 μm (= 0.01 mm) and the image magnification is 0.1, the movement amount of the imaging device converts to 0.1 mm.
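The worked example above can be sketched as a one-line conversion (the helper name is hypothetical): dividing the image-plane movement by the image magnification yields the movement of the imaging apparatus itself, and the same division applies to image-plane acceleration.

```python
def device_movement_from_image(image_movement_mm, image_magnification):
    """Convert a movement measured on the imaging surface into the
    corresponding movement of the imaging apparatus by dividing by
    the image magnification."""
    return image_movement_mm / image_magnification

# 10 um (0.01 mm) on the imaging surface at 0.1x magnification
# corresponds to a 0.1 mm movement of the imaging device.
device_mm = device_movement_from_image(0.01, 0.1)
```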

  As described above, the average acceleration value calculated by the system controller 7 and data such as the inter-frame parallel movement speed value are notified to the shake correction microcomputer 8 via, for example, a communication interface.

  The shake correction microcomputer 8 acquires acceleration data from the acceleration sensor 10 at a period of, for example, 1 ms, subtracts the offset correction value from the acceleration data, and then performs integration processing in the first integrator 82.

  The system controller 7 notifies the blur correction microcomputer 8 of the exposure center timing of each frame, and the first integrator 82 clears its integration result at this timing. Accordingly, the result of integration by the first integrator 82 is the cumulative addition value of the acceleration from one exposure center timing to the next.

  The acceleration offset calculation unit 84 calculates the difference between the average acceleration value notified from the system controller 7 and the average acceleration value obtained by dividing the acceleration integration value calculated by the first integrator 82 by the number of integrations; this difference is used as the acceleration offset value subtracted in the offset subtractor 81.

  That is, the acceleration offset calculation unit 84 is based on the acceleration value detected by the acceleration sensor 10 and the value of the first image movement amount calculated by the first image movement amount calculation unit (71, 72, 73). This is a component that functions as a detection reference value calculation unit that calculates the detection reference value of the acceleration sensor 10.

  The offset subtractor 81 corrects the acceleration value detected by the acceleration sensor 10 based on the detection reference value calculated by the acceleration offset calculation unit 84, and calculates the corrected acceleration value; it is a component that functions as an acceleration correction unit.

  The above is the flow of control by the system controller 7 during standby of the imaging operation of the imaging apparatus of the present embodiment.

  Subsequently, when the imaging operation is started, the system controller 7 notifies the blur correction microcomputer 8 of the inter-frame parallel movement speed data (first image movement speed) obtained immediately before the start of the imaging operation. In response to this, the blur correction microcomputer 8 executes a process in the speed calculation unit 85 for obtaining the initial speed (pre-exposure translation speed) at the start of the imaging operation based on the inter-frame parallel movement speed notified from the system controller 7.

  On the other hand, from the start of the imaging operation the second integrator 83 integrates the offset-corrected acceleration from the acceleration sensor 10. This yields the speed change from the start of exposure, which is used to calculate the translation speed during still image exposure.

  In the speed change adder 86, the parallel movement speed can be obtained by adding the initial speed obtained by the speed calculator 85 and the speed change calculated by the second integrator 83.

  The third integrator 88 integrates the parallel movement speed during exposure to calculate the parallel movement amount of the imaging apparatus, and the image magnification multiplier 89 multiplies this by the image magnification, converting it into the amount of blur on the imaging surface of the image sensor 5. Using the blur amount calculated in this way, the drive unit 6 is driven and controlled in a direction that cancels the blur. As a result, translational blur can be corrected.

  Here, the third integrator 88 functions as a second image movement amount calculation unit that, based on the corrected acceleration value calculated by the offset subtractor 81 (acceleration correction unit), calculates a second movement amount for correcting the translational image blur that occurs when the subject image moves in the translation direction with respect to the imaging device.

  Further, in the second movement amount calculation unit, when an instruction to start taking a still image is given by the release SW (switch) 11 (imaging operation input unit), the speed calculation unit 85 calculates the pre-exposure translation speed of the imaging device based on the first image moving speed obtained immediately before the still image is taken; the second integrator 83 adds, to the calculated pre-exposure translation speed, the integral of the acceleration detected by the acceleration sensor 10, thereby calculating the translation speed during still image exposure; and the third integrator 88 calculates the translational blur correction amount generated on the imaging surface based on the calculated translation speed during still image exposure.

  Next, the control flow of the system controller 7 and the shake correction microcomputer 8 in the imaging apparatus of the present embodiment will be described below based on the flowcharts of FIGS.

  FIG. 9 is a flowchart showing the control flow of the system controller 7 in the imaging apparatus of the present embodiment during display of a live view image. Note that various control processes are performed during the actual imaging operation; FIG. 9 describes only the processing related to the translational blur correction directly relevant to the present invention, and illustration and description of the other operation flows are omitted.

  First, it is assumed that the imaging apparatus is in a state of being activated with the power turned on, and the operation mode is set to an imaging operation mode in which an imaging operation can be performed. In this state, the system controller 7 controls the shutter mechanism 4 to open, for example. As a result, the subject image generated by the imaging optical system 2 a of the lens barrel 2 is formed on the imaging surface of the imaging device 5. The system controller 7 drives and controls the image sensor 5 to execute a photoelectric conversion process of the subject image.

  In step S1 shown in FIG. 9, the system controller 7 receives the image signal output from the image sensor 5, performs predetermined image processing to generate live view image data that can be displayed on the EVF 12, and outputs it to the EVF 12 (live view image acquisition process).

  Next, in step S2, the system controller 7 controls the movement vector extraction unit 71 to execute a process of extracting a movement vector between frames for each focus area in two successive frames of the live view image data. In this case, the frames to be processed may be two consecutive frames, or two frames extracted by thinning out a predetermined number of frames.

  Subsequently, in step S3, the system controller 7 controls the inter-frame parallel movement speed calculation unit 72 to calculate, based on the movement vector extracted in step S2 described above, the movement amounts in the X direction and the Y direction on the imaging surface of the image sensor 5, and to calculate the inter-frame average moving speed from the frame period.

  In step S4, the system controller 7 controls the inter-frame speed change calculation unit 73 to obtain the speed change based on the inter-frame moving speeds continuously acquired in step S3 described above, and to calculate the inter-frame average acceleration.

  In step S5, the system controller 7 executes a process for calculating the image magnification. Here, the image magnification can be calculated from focal length information and subject distance information determined based on position information on the optical axis O of the imaging optical system 2a obtained by the encoder 3 (see FIG. 2).

  Next, in step S6, the system controller 7 executes arithmetic processing for converting the image-plane inter-frame average acceleration into the average acceleration of the imaging apparatus, based on the image magnification calculated in step S5 described above.

  Subsequently, in step S7, the system controller 7 notifies the blur correction microcomputer 8 of the average acceleration data acquired in step S6 described above. Thereby, the processing on the system controller 7 side ends.

  FIG. 10 is a flowchart showing the flow of control during standby of the imaging operation of the blur correction microcomputer 8 in the imaging apparatus of the present embodiment. When the process on the system controller 7 side is completed in the process of step S7 of FIG. 9, the process proceeds to the flowchart of FIG.

  In step S11 of FIG. 10, the shake correction microcomputer 8 acquires an acceleration value based on the output signal of the acceleration sensor 10.

  Next, in step S12, the blur correction microcomputer 8 executes a calculation process for accumulating acceleration.

  Subsequently, in step S13, the blur correction microcomputer 8 confirms whether or not the predetermined integration period has elapsed, based on the end notification of the acceleration integration period periodically sent from the system controller 7. If it is confirmed that the integration period has elapsed, the process proceeds to the next step S14. If the integration period has not elapsed, the process returns to step S11 described above.

  In step S14, the blur correction microcomputer 8 executes a calculation process for calculating the acceleration average value by dividing the acceleration integration value by the number of integrations.

  Next, in step S15, the blur correction microcomputer 8 executes a calculation process for calculating the offset value from the difference between the inter-frame average acceleration value notified from the system controller 7 and the acceleration average value calculated in step S14 described above.

  Subsequently, in step S16, the blur correction microcomputer 8 executes a process of updating the offset correction value with the offset value calculated in the process of step S15 described above.

  In step S17, the blur correction microcomputer 8 executes a process of clearing the acceleration integrated value.

  In step S18, the blur correction microcomputer 8 confirms whether or not an instruction to start the imaging operation is generated. Here, when the imaging start instruction is confirmed, the process proceeds to the imaging operation process (the flowchart in FIG. 11). If the imaging start instruction is not confirmed, the process returns to the above-described step S11 and the subsequent processes are repeated.
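The standby flow of FIG. 10 can be sketched as a small loop (a hypothetical sketch with invented names; the real firmware samples at a fixed period and receives the period-end notification asynchronously). One pass accumulates acceleration samples until the integration period ends, then derives a new offset from the image-based average acceleration:

```python
def standby_cycle(read_accel, integration_done, image_avg_accel):
    """One pass of the standby flow (steps S11-S17 of FIG. 10):
    accumulate sensor samples, then compute the offset correction value
    from the image-derived average acceleration."""
    total, count = 0.0, 0
    while True:
        total += read_accel()        # S11-S12: acquire and accumulate acceleration
        count += 1
        if integration_done(count):  # S13: has the integration period elapsed?
            break
    sensor_avg = total / count       # S14: acceleration average value
    # S15-S16: offset = sensor average minus image-based average;
    # the caller updates the offset correction value with this result.
    # S17: the accumulated integration value is discarded on return.
    return sensor_avg - image_avg_accel

# Example: four biased samples around a true average of 0.5.
samples = iter([0.72, 0.68, 0.71, 0.69])
new_offset = standby_cycle(lambda: next(samples), lambda n: n >= 4, 0.5)
```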

  FIG. 11 is a flowchart showing the flow of control during the imaging operation of the shake correction microcomputer 8 in the imaging apparatus of the present embodiment.

  First, in step S21 of FIG. 11, the blur correction microcomputer 8 executes a process of calculating the initial speed at the start of imaging based on the interframe average speed value notified from the system controller 7. This process is performed only once at the start of imaging.

  Next, in step S22, the blur correction microcomputer 8 acquires an acceleration value based on the output signal of the acceleration sensor 10.

  Next, in step S23, the blur correction microcomputer 8 executes a calculation process for subtracting, from the acquired acceleration value, the offset value (see step S15 in FIG. 10) obtained during imaging operation standby.

  Subsequently, in step S24, the blur correction microcomputer 8 executes a calculation process for calculating the speed change by integrating the acceleration. Since the integrated value is cleared to 0 at the start of the imaging operation, the speed change calculated here is the speed change from the start of the imaging operation.

  Next, in step S25, the blur correction microcomputer 8 adds the acceleration integration result and the initial speed, and executes a calculation process for calculating the translational movement speed of the imaging apparatus.

  In step S26, the blur correction microcomputer 8 executes a calculation process for calculating the translational movement amount by integrating the translational movement speed calculated in step S25 described above.

  In step S27, the blur correction microcomputer 8 calculates the blur amount on the imaging surface by multiplying the translational movement amount calculated in the process of step S26 described above by the image magnification.

  Next, in step S28, the blur correction microcomputer 8 drives and controls the drive unit 6 in a direction that cancels out the blur amount calculated in step S27 described above. This corrects translational blurring.

  In step S29, the blur correction microcomputer 8 checks whether or not the imaging operation has been completed. Here, when the end of the imaging operation is confirmed, the processing sequence ends, and for example, the process returns to the imaging standby state. If the end of the imaging operation is not confirmed, the process returns to the above-described step S22 and the subsequent processes are repeated.
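The per-sample loop of steps S22 through S27 can be written in discrete-time form (a hypothetical sketch with invented names; a fixed sample period dt and single-axis scalar values are assumed): the offset-corrected acceleration is integrated onto the initial speed, the speed is integrated into a translational movement amount, and the result is scaled by the image magnification to get the blur amount on the imaging surface.

```python
def exposure_blur_correction(initial_speed, accel_samples, offset,
                             image_magnification, dt):
    """Discrete-time form of steps S22-S27 of FIG. 11. Returns the blur
    amount on the imaging surface after each acceleration sample."""
    speed_change = 0.0   # integral cleared at the start of the imaging operation
    movement = 0.0
    blur = []
    for a in accel_samples:
        a_corr = a - offset                    # S23: subtract the offset value
        speed_change += a_corr * dt            # S24: integrate acceleration
        speed = initial_speed + speed_change   # S25: translational movement speed
        movement += speed * dt                 # S26: integrate into movement amount
        blur.append(movement * image_magnification)  # S27: scale by magnification
    return blur

# Two 1 ms samples of 1.0 m/s^2, zero initial speed and offset, 0.1x magnification.
blur = exposure_blur_correction(0.0, [1.0, 1.0], 0.0, 0.1, 0.001)
```

Each entry of `blur` is the amount the drive unit 6 would be commanded to counteract at that instant (step S28).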

  FIG. 12 is a timing chart showing the control timing at the time of still image capturing operation using the imaging apparatus of the present embodiment.

  FIG. 12A shows a vertical synchronization signal VD that synchronizes the readout timing of the image sensor 5. The vertical synchronization signal VD indicates the start timing of each frame.

  FIG. 12B shows a front curtain control signal and a rear curtain control signal as control signals for the shutter mechanism 4. While the front curtain control signal and the rear curtain control signal are each in the high state, the front curtain and the rear curtain of the shutter are respectively held attracted.

  FIG. 12C shows the exposure period of each line in the image sensor 5. Since the exposure period has a time difference between the lines, the shape of the exposure period for one frame is indicated by a substantially diamond shape as shown in the figure.

  In FIG. 12, (D) shows the image-plane moving speed, which is obtained for each frame by comparing the image of the latest frame with that of the preceding frame and calculating the amount of image movement between them.

  In FIG. 12, (E) shows the average acceleration obtained by averaging the time between frames based on the amount of change in the moving speed of successive images.

  In FIG. 12, (F) indicates an acceleration representing a detection signal from the acceleration sensor 10.

  In FIG. 12, (G) shows the main body speed change obtained by integrating the acceleration, and shows the integrated value for each predetermined period.

  In FIG. 12, (H) indicates a correction speed representing a translational movement speed calculated for correcting image blur occurring during a still image capturing operation.

  In FIG. 12, (I) shows the translation correction amount calculated based on the result of integrating the translational movement speed.

  Here, the flow of control in the imaging apparatus of the present embodiment will be briefly described with reference to FIG. 12. First, when reading of an image signal from the image sensor 5 is completed, the image-plane moving speed is calculated. For example, when the reading of the frame fn + 2 (see FIG. 12C) immediately before the start of the still image capturing operation is completed, the image movement amount is calculated by comparison with the previous frame fn + 1 (see FIG. 12C). The image-plane moving speed (see FIG. 12D) is obtained from the calculated image movement amount and the frame period. Here, the moving speed IVt2-t3 is the average moving speed from t2 to t3, the exposure centers of the respective frames.

  Next, a speed change is obtained from the difference between IVt1-t2 and IVt2-t3, and an average acceleration (see FIG. 12E) Iαt2-t3 is obtained from the frame period.

  In parallel with this, the speed change AVt2-t3 generated in the imaging apparatus is obtained by integrating the acceleration between the exposure center timings, and the average acceleration Aαt2-t3 can be calculated by averaging in the integration period.

  The average acceleration Iαt2-t3 obtained from the image and the average acceleration Aαt2-t3 obtained from the output of the acceleration sensor 10 both cover the period from t2 to t3 between exposure center timings, and should therefore match. Their difference is treated as offset noise generated in the acceleration sensor 10 and is used in the acceleration correction calculation.

  Further, the difference between the integrated value at t3 of the integration over t2 to t3 and the average speed change value is obtained, and this difference is added to the average speed IVt2-t3 calculated from the image, whereby the translation speed at t3 is obtained. The translation speed at t3 serves as the initial speed of the correction speed calculated when correction is performed during the still image capturing operation. The translational speed during the still image capturing operation can then be calculated by adding, to this initial speed, the speed change obtained by integrating the acceleration.

  t4 is the start timing of the still image capturing frame and t5 is the exposure start timing; the translational movement amount can be obtained by integrating the translation speed from the start of exposure. This translational movement amount is converted into an image movement amount on the image plane, and by moving the image sensor 5 in a direction that cancels it, translational blur correction during the still image capturing operation can be performed.

  As described above, according to the first embodiment, the acceleration is calculated from the movement amount obtained from the image signal and used to calculate the reference value of the acceleration sensor, so that mismatches between the actual image blur and the detection result, and the resulting incorrect correction, can be prevented. In addition, since the correction covers gravitational acceleration, temperature drift, and the like all at once, the processing load required to correct the detected value can be reduced. Furthermore, since the speed and the reference value are updated every cycle, erroneous correction due to error accumulation can be avoided.

[Second Embodiment]
Next, an imaging device according to a second embodiment of the present invention will be described below. The configuration of the imaging apparatus of this embodiment is basically the same as that of the first embodiment described above. The only difference is that, in the imaging apparatus of the present embodiment, a subject blur determination unit having a function of detecting subject blur is provided in the blur correction microcomputer 8 so as to prevent erroneous image blur correction caused by movement of the subject. Therefore, detailed description of the configuration shared with the first embodiment is omitted, and only the differing configuration is described below.

  FIG. 13 is a block configuration diagram illustrating functions of a signal processing main part in the imaging apparatus according to the second embodiment of the present invention. FIG. 13 corresponds to FIG. 8 in the first embodiment described above.

  As shown in FIG. 13, the imaging apparatus according to the present embodiment differs from the first embodiment only in that a subject blur determination unit 90, serving as subject blur detection means, is added.

  Here, the subject blur determination unit 90 as subject blur detection means detects whether or not the subject is moving, based on at least one of the first image moving speed, the first speed change amount, and the second speed change amount.

That is, the subject blur determination unit 90 determines the presence or absence of subject blur based on the inter-frame translation speed obtained from the image signal, and on the speed change obtained from the image signal and the translational speed change obtained from the acceleration. When it determines that subject blur is present, it invalidates the acceleration offset value calculated by the acceleration offset calculation unit 84 and then multiplies the image magnification by 0 so that no blur amount is calculated. In the normal case, where it determines that there is no subject blur, the image magnification based on the state of the optical system 2a is applied.
Specifically, the image magnification is determined by the relationship between the focal length related to the zooming operation of the optical system 2a and the subject distance specified by the focus adjustment operation.

  That is, the subject blur determination unit 90 (subject blur detection means) stops the reference value calculation processing of the acceleration sensor 10 and the calculation of the translational blur correction amount when it determines that the subject is moving.

  In the subject blur determination unit 90, for example, when the inter-frame translation speed obtained from the image signal is larger than the translational movement expected when the user holds the imaging apparatus in hand for a normal imaging operation, it is likely either that the subject is moving or that an operation changing the framing, such as panning the imaging apparatus, is being performed.

  When an imaging apparatus is held by hand during a normal imaging operation, translational movement of about 1 mm/sec occurs on average. Therefore, when a translation speed greatly exceeding this value occurs, for example 5 mm/sec or more, the acceleration offset value is invalidated.

  Note that the threshold value for this determination is not limited to the above example, and it goes without saying that it may be determined according to the accuracy to be detected.

  Likewise, if the difference between the speed change calculated from the image signal and the speed change obtained by integrating the acceleration is large, it is determined that subject blur is present; the acceleration offset calculation result is invalidated and the image magnification is multiplied by 0, so that no blur amount is calculated. The other configurations and operations are substantially the same as those of the first embodiment.
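The two determination criteria above can be combined into one sketch. The function name and the default thresholds (the 5 mm/sec figure from the text, and an assumed bound on the speed-change mismatch) are illustrative, not taken from the patent claims.

```python
def subject_blur_detected(v_translation_mm_s, dv_image, dv_accel,
                          speed_threshold_mm_s=5.0, dv_mismatch_threshold=1.0):
    """Illustrative sketch: flag subject blur (or camera work such as panning)
    when either the inter-frame translation speed from the image signal greatly
    exceeds hand-held shake (~1 mm/s on average), or the speed change from the
    image disagrees strongly with the speed change integrated from acceleration.
    On True, the caller would invalidate the acceleration offset value and
    multiply the image magnification by 0 so no blur amount is calculated."""
    if abs(v_translation_mm_s) > speed_threshold_mm_s:
        return True  # framing change or moving subject suspected
    if abs(dv_image - dv_accel) > dv_mismatch_threshold:
        return True  # image motion and inertial motion disagree
    return False
```

A 6 mm/s inter-frame translation trips the first criterion, while a 1 mm/s translation with closely matching speed changes is treated as normal hand-held shake.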

  As described above, according to the second embodiment, substantially the same effects as those of the first embodiment can be obtained. Furthermore, in the present embodiment, when subject blur is detected, neither the offset correction of the acceleration nor the calculation of the blur amount is performed. Therefore, the possibility that the reference value is erroneously calculated due to subject blur, resulting in erroneous correction, can be reduced.

  Note that the configuration of the subject blur determination unit in the present embodiment is not limited to the above; any other configuration may be used as long as it can detect subject blur and so-called camera work such as panning of the imaging apparatus. For example, camera work may be detected based on the detection result of the angular velocity sensor 9 to determine that panning has been performed. Alternatively, the correlation may be evaluated by pattern matching in the movement vector detection region, and the acceleration offset value invalidated when the correlation is low.

[Third Embodiment]
Next, an imaging apparatus according to a third embodiment of the present invention will be described. The configuration of the imaging apparatus of this embodiment is basically the same as that of the first embodiment described above. The present embodiment differs from the first embodiment in that a moving image blur correction unit having a function of detecting and correcting blur of a moving image is provided in the system controller 7. Therefore, a detailed description of the configuration shared with the first embodiment is omitted, and only the differing configuration is described below.

  FIG. 14 is a block configuration diagram illustrating functions of a main part of signal processing in the imaging apparatus according to the third embodiment of the present invention. FIG. 14 corresponds to FIG. 8 in the first embodiment described above.

  The moving image blur correction unit 74 is moving image blur correction means that cuts out a moving image from the image signal read out from the image sensor 5. At this time, the moving image blur correction unit 74 moves the moving image cutout position in the direction that cancels the image movement, based on the movement vector (image movement amount) extracted by the movement vector extraction unit 71 (first image movement amount calculation unit).

  The moving image cut out by the moving image blur correction unit 74 is subjected to various image processing by the image processing units in the system controller 7, and the resulting image signal, processed into a form suitable for reproduction, is output to the EVF 12. The EVF 12 displays the image as a live view image, that is, as a viewfinder image. The other configurations are substantially the same as those of the first embodiment.
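Moving the cutout position against the movement vector can be sketched as a clamped crop-window update. This is a simplified electronic-stabilization illustration under assumed names and a pixel coordinate convention; it is not the patent's implementation, which would also handle margin exhaustion and gradual re-centering.

```python
def stabilized_crop_origin(frame_w, frame_h, crop_w, crop_h, motion_vec_xy, prev_origin):
    """Illustrative sketch: shift the moving-image cutout position in the
    direction that cancels the inter-frame movement vector, clamping the
    crop window so it stays inside the full sensor frame."""
    dx, dy = motion_vec_xy
    # Move the crop origin opposite to the detected image movement
    x = min(max(prev_origin[0] - dx, 0), frame_w - crop_w)
    y = min(max(prev_origin[1] - dy, 0), frame_h - crop_h)
    return (x, y)
```

For instance, with a 1600x900 crop inside a 1920x1080 frame, a movement vector of (10, -5) pixels shifts a crop origin of (160, 90) to (150, 95), canceling the displacement as long as the stabilization margin is not exhausted.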

  Next, the operation of the system controller 7 in the imaging apparatus of this embodiment will be described. FIG. 15 is a flowchart illustrating a control processing sequence of the system controller in the imaging apparatus of the present embodiment. FIG. 15 corresponds to FIG. 9 in the first embodiment described above.

  The control processing sequence of the system controller in the imaging apparatus of the present embodiment differs from that of the first embodiment only in that the processing of step S2A, that is, moving image blur correction processing, is added after the processing of step S2.

  The moving image blur correction processing in step S2A switches the image cutout position based on the movement vector extracted by the inter-frame movement vector extraction processing in step S2. The other processing sequences are substantially the same as those of the first embodiment.

  As described above, according to the third embodiment, substantially the same effects as those of the first embodiment can be obtained. Furthermore, in this embodiment, by adding processing that corrects blur based on the movement vector extracted between frames to the viewfinder image (live view image) during the framing operation performed while waiting for the imaging operation, the influence of image blur due to translational movement can be removed, and pre-imaging operations such as composition determination can be performed comfortably using a blur-free finder image.

  In each processing sequence described in the above embodiments, the procedure may be changed as long as this does not contradict its nature. Therefore, in each of the above processing sequences, for example, the order of the processing steps may be changed, a plurality of processing steps may be executed simultaneously, or the order of the processing steps may differ each time a series of processing sequences is executed. That is, even if the operation flows in the claims, the specification, and the drawings are described using "first," "next," and so on for convenience, this does not mean that it is essential to carry them out in that order. It also goes without saying that steps constituting these operation flows that do not affect the essence of the invention can be omitted as appropriate.

  Of the techniques described here, many of the controls and functions described mainly with reference to the flowcharts can be set by a software program, and the controls and functions described above can be realized by a computer reading and executing that software program. The software program is electronic data stored, in whole or in part, as a computer program product in the product manufacturing process, in the above-described storage medium or storage unit, specifically a portable medium such as a flexible disk or CD-ROM, or a storage medium such as a non-volatile memory, a hard disk, or a volatile memory. Apart from this, it can be distributed or provided at the time of product shipment, or via a portable medium or a communication line. Even after the product has been shipped, users can download and install the software program on a computer via a communication network, the Internet, or the like, whereby the imaging apparatus of the present embodiment can be realized.

  The present invention is not limited to the above-described embodiments, and various modifications and applications can of course be implemented without departing from the spirit of the invention. Further, the above embodiments include inventions at various stages, and various inventions can be extracted by appropriately combining the disclosed constituent elements. For example, even if several constituent elements are deleted from all the constituent elements shown in the above embodiments, as long as the problem to be solved by the invention can be solved and the effects of the invention can be obtained, a configuration from which those constituent elements have been deleted can be extracted as an invention. Furthermore, constituent elements across different embodiments may be combined as appropriate.

  The present invention is not limited to an imaging apparatus that is an electronic device specialized for an imaging function, such as a digital camera; it can be widely applied to other forms of electronic devices having an imaging function, such as digital movie cameras, mobile phones, smartphones, electronic notebooks, electronic dictionaries, personal digital assistants, personal computers, tablet terminal devices, game machines, televisions, watches, and navigation devices using GPS (Global Positioning System).

  Furthermore, the present invention can be similarly applied to an electronic device having a function of acquiring an image using an image sensor and displaying the acquired image on a display device, for example, an observation device such as a telescope, binoculars, and a microscope.

1 …… Camera body,
2 …… Lens barrel,
2a …… Imaging optical system,
3 …… Encoder,
4 …… Shutter mechanism,
5 …… Image sensor,
6 …… Drive unit,
7 …… System controller,
8 …… Blur correction microcomputer,
9 …… Angular velocity sensor,
10 …… Acceleration sensor,
13 …… Memory card,
71 …… Movement vector extraction unit,
72 …… Inter-frame translation speed calculation unit,
73 …… Inter-frame speed change calculation unit,
74 …… Moving image blur correction unit,
81 …… Offset subtractor,
82 …… First integrator,
83 …… Second integrator,
84 …… Acceleration offset calculation unit,
85 …… Speed calculation unit,
86 …… Speed change adder,
88 …… Third integrator,
89 …… Image magnification multiplier,
90 …… Subject blur determination unit,

Claims (6)

  1. An imaging apparatus including an image blur correction unit that corrects image blur caused by a change in posture, comprising:
    an imaging unit that captures a subject image at a predetermined imaging frame rate and acquires image data for each frame;
    a first image movement amount calculation unit that calculates a first movement amount of the subject image based on a correlation between a plurality of consecutive image data captured for each frame;
    an acceleration sensor that detects an acceleration accompanying a change in posture of the imaging apparatus;
    a detection reference value calculation unit that calculates a detection reference value of the acceleration sensor based on the acceleration value detected by the acceleration sensor and the first movement amount calculated by the first image movement amount calculation unit;
    an acceleration correction unit that corrects the acceleration value detected by the acceleration sensor based on the detection reference value and calculates a corrected acceleration value; and
    a second image movement amount calculation unit that, based on the corrected acceleration value calculated by the acceleration correction unit, calculates a second movement amount for correcting the translational image blur amount generated on the imaging surface when the subject image moves in the translational direction with respect to the imaging apparatus;
    wherein the detection reference value calculation unit further comprises:
    an integration unit that sequentially integrates the corrected acceleration values;
    a first speed change amount calculation unit that calculates a first image moving speed from the detection result of the first movement amount by the first image movement amount calculation unit, and calculates a first speed change amount from the difference between the first image moving speed detected at the previous frame and the first image moving speed detected at the current frame; and
    a second speed change amount calculation unit that calculates a second speed change amount from the integrated value of the detection result of the acceleration sensor over the period corresponding to the period, from the exposure center of one frame to the exposure center of the next frame, for which the first image movement amount calculation unit calculates the first movement amount;
    and wherein the detection reference value calculation unit calculates the detection reference value of the acceleration sensor based on the first speed change amount and the second speed change amount.
  2. The imaging apparatus according to claim 1, wherein the first image movement amount calculation unit calculates the first movement amount of the subject image based on a correlation between the image data of two frames extracted by thinning out a predetermined number of frames, and
    the second speed change amount calculation unit calculates the second speed change amount from the integrated value of the detection result of the acceleration sensor over the period from the exposure center of one frame to the exposure center of the frame obtained after thinning out the predetermined number of frames.
  3. The imaging apparatus according to claim 1, further comprising an imaging operation input unit that gives a still image exposure start instruction to the imaging unit,
    wherein, when the imaging operation input unit gives an imaging start instruction for still image imaging, the second image movement amount calculation unit calculates the pre-exposure translation speed of the imaging apparatus based on the first image moving speed calculated immediately before the still image imaging, calculates the translation speed during still image exposure by adding the acceleration detected by the acceleration sensor to the pre-exposure translation speed, and calculates the translational blur correction amount generated on the imaging surface based on the translation speed during still image exposure.
  4. The imaging apparatus according to claim 1, further comprising subject blur detection means for detecting whether or not the subject is moving based on at least one of the first image moving speed, the first speed change amount, and the second speed change amount,
    wherein the subject blur detection means stops the calculation of the reference value of the acceleration sensor and of the translational blur correction amount when it is determined that the subject is moving.
  5. The imaging apparatus according to claim 1, further comprising moving image blur correction means,
    wherein the moving image blur correction means moves the image cutout position in the direction that cancels the movement amount, based on the movement amount detected by the first image movement amount calculation unit.
  6. An image blur correction method for correcting image blur caused by a change in posture of an imaging apparatus, comprising:
    an imaging step of capturing a subject image at a predetermined imaging frame rate and acquiring image data for each frame;
    detecting a first movement amount of the subject image based on a correlation between a plurality of consecutive image data captured for each frame;
    detecting an acceleration value accompanying a change in posture of the imaging apparatus with an acceleration sensor;
    calculating a detection reference value of the acceleration sensor based on the first movement amount and the detected acceleration;
    correcting the acceleration value detected by the acceleration sensor based on the detection reference value, and calculating the corrected acceleration value obtained thereby; and
    calculating a second movement amount for correcting the translational image blur amount generated on the imaging surface by movement in the translational direction, based on the corrected acceleration value;
    wherein the step of calculating the detection reference value further includes:
    sequentially integrating the corrected acceleration values;
    calculating a first image moving speed from the detection result of the first movement amount obtained in the step of calculating the first movement amount;
    calculating a first speed change amount from the difference between the first image moving speed detected at the previous frame and the first image moving speed detected at the current frame; and
    calculating a second speed change amount from the integrated value of the detection result of the acceleration sensor over the period corresponding to the period, from the exposure center of one frame to the exposure center of the next frame, for which the first movement amount is calculated;
    and wherein the step of calculating the detection reference value calculates the detection reference value of the acceleration sensor based on the first speed change amount and the second speed change amount.
JP2014189303A 2014-09-17 2014-09-17 Imaging apparatus and image blur correction method Active JP6411829B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014189303A JP6411829B2 (en) 2014-09-17 2014-09-17 Imaging apparatus and image blur correction method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014189303A JP6411829B2 (en) 2014-09-17 2014-09-17 Imaging apparatus and image blur correction method
CN201510583611.8A CN105430245B (en) 2014-09-17 2015-09-14 Photographic device and as shake correction method

Publications (2)

Publication Number Publication Date
JP2016061912A JP2016061912A (en) 2016-04-25
JP6411829B2 true JP6411829B2 (en) 2018-10-24

Family

ID=55508156

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014189303A Active JP6411829B2 (en) 2014-09-17 2014-09-17 Imaging apparatus and image blur correction method

Country Status (2)

Country Link
JP (1) JP6411829B2 (en)
CN (1) CN105430245B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019178872A1 (en) * 2018-03-23 2019-09-26 华为技术有限公司 Video image anti-shake method and terminal

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3513950B2 (en) * 1993-12-14 2004-03-31 株式会社ニコン Image stabilization camera
JPH09139881A (en) * 1995-11-16 1997-05-27 Sony Corp Camera-shake correction device
JP4290100B2 (en) * 2003-09-29 2009-07-01 キヤノン株式会社 Imaging apparatus and control method thereof
JP2006091279A (en) * 2004-09-22 2006-04-06 Canon Inc Optical instrument
JP5274130B2 (en) * 2008-07-15 2013-08-28 キヤノン株式会社 Image blur correction apparatus, optical apparatus, image pickup apparatus, and image blur correction apparatus control method
JP5328307B2 (en) * 2008-11-14 2013-10-30 キヤノン株式会社 Image capturing apparatus having shake correction function and control method thereof
KR101537949B1 (en) * 2008-12-24 2015-07-20 삼성전자주식회사 Photographing apparatus
JP2011118011A (en) * 2009-12-01 2011-06-16 Panasonic Corp Imaging apparatus
JP5121911B2 (en) * 2010-10-19 2013-01-16 キヤノン株式会社 Anti-shake control device, imaging device, and anti-shake control method
EP2819394A1 (en) * 2011-06-10 2014-12-31 Canon Kabushiki Kaisha Shake compensation apparatus, shake compensation control method, and image capturing apparatus and control method thereof

Also Published As

Publication number Publication date
CN105430245A (en) 2016-03-23
JP2016061912A (en) 2016-04-25
CN105430245B (en) 2018-11-30


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20170425

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20180124

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20180130

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20180306

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20180904

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20180927

R151 Written notification of patent or utility model registration

Ref document number: 6411829

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151