JP2018060126A - Image blur correction device, control method of the same and imaging apparatus - Google Patents


Info

Publication number
JP2018060126A
JP2018060126A (application JP2016198955A)
Authority
JP
Japan
Prior art keywords
subject
angular velocity
image blur
face detection
calculation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2016198955A
Other languages
Japanese (ja)
Inventor
Hitoshi Miyazawa (仁志 宮澤)
Original Assignee
Canon Inc. (キヤノン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc.
Priority to JP2016198955A
Publication of JP2018060126A
Application status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/14 Picture signal circuitry for video frequency region
    • H04N5/144 Movement detection
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225 Television cameras; Cameras comprising an electronic image sensor specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/232 Devices for controlling television cameras, e.g. remote control; Control of cameras comprising an electronic image sensor
    • H04N5/23203 Remote-control signaling for cameras or for parts thereof, e.g. between main body and another part of camera
    • H04N5/23209 Remote-control signaling for interchangeable parts of camera involving control signals based on electric image signals provided by an electronic image sensor
    • H04N5/23218 Control of camera operation based on recognized objects
    • H04N5/23219 Control of camera operation based on recognized objects, where the recognized objects include parts of the human body, e.g. human faces, facial parts or facial expressions
    • H04N5/23248 Control for stable pick-up of the scene in spite of camera body vibration
    • H04N5/23251 Motion detection
    • H04N5/23254 Motion detection based on the image signal
    • H04N5/23258 Motion detection based on additional sensors
    • H04N5/23261 Motion detection by distinguishing pan/tilt from motion
    • H04N5/23264 Vibration or motion blur correction
    • H04N5/2328 Vibration or motion blur correction performed by mechanical compensation
    • H04N5/23287 Vibration or motion blur correction by shifting the lens/sensor position
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/64 Imaging systems using optical elements for stabilisation of the lateral and angular position of the image
    • G02B27/646 Imaging systems compensating for small deviations, e.g. due to vibration or shake
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00221 Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
    • G06K9/00228 Detection; Localisation; Normalisation
    • G06K9/00261 Detection; Localisation; Normalisation using comparisons between temporally consecutive images

Abstract

PROBLEM TO BE SOLVED: To increase the reliability of panning photography of a human subject and the like.

SOLUTION: When a panning assist mode for assisting panning photography is set, an imaging apparatus acquires a shake detection signal and the movement amount of an image within the imaging screen, sets a search range for the face detection position of the subject (S510), and changes the calculation method of the subject vector (motion vector) on the basis of the face detection result. When the face detection position of the subject is acquired (YES in S511, S512), a first calculation process calculates the subject vector from the motion vector group within a detection range based on the face detection position (S513). When the face detection position of the subject is not acquired (NO in S511) and a subject vector can be detected (YES in S514), a second calculation process calculates the subject vector from the motion vector group within a detection range based on a focus detection frame set in the imaging screen (S515).

SELECTED DRAWING: Figure 8

Description

  The present invention relates to an image blur correction apparatus for correcting image blur due to camera shake, subject blur, and the like, and a control method thereof.

  Panning, one of the common shooting techniques, is a method in which the photographer moves the camera to follow a subject moving, for example, in the horizontal direction, with the shutter speed set to a relatively long exposure time. Since panning generally requires skill and experience, it is difficult for beginners. The first reason is that it is hard to keep the camera moving in step with the subject while taking the picture. The second reason is that a beginner does not know what shutter speed should be set to make the subject look lively. The slower the shutter speed, the more the background flows and the greater the feeling of movement, but on the other hand, camera shake and subject shake become more likely.

  In order to make panning easier, Patent Document 1 discloses a method of detecting the difference between the speed of the subject and the speed at which the camera is moved, and correcting the shift amount corresponding to that difference with the camera shake correction function. Immediately before photographing, the panning motion of the camera following the subject is detected by the angular velocity sensor in the camera, and the difference between the angular velocity of the subject and the panning speed of the camera is corrected.

  As a method for detecting the amount of movement of the subject image on the imaging surface, motion vector detection is used; the correlation method based on correlation calculation, the block matching method, and the like are known. In the block matching method, for example, the input image signal is divided into a plurality of block regions of appropriate size, and for each block the difference from the pixels within a predetermined range of the previous frame is calculated. The block of the previous frame that minimizes the sum of the absolute values of these differences is searched for, and the relative shift between the frames indicates the motion vector of the block.
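As an illustration of the block matching method described above, the following sketch finds the motion vector of one block by minimizing the sum of absolute differences (SAD) against the previous frame. The block size, search range, and NumPy-based implementation are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def block_matching_vector(prev_frame, curr_frame, top, left,
                          block=16, search=8):
    """Return (dy, dx): the shift into the previous frame that best
    matches the block of the current frame at (top, left)."""
    ref = curr_frame[top:top + block, left:left + block].astype(np.int64)
    best_sad, best_vec = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            # Skip candidate positions that fall outside the previous frame.
            if (y < 0 or x < 0 or y + block > prev_frame.shape[0]
                    or x + block > prev_frame.shape[1]):
                continue
            cand = prev_frame[y:y + block, x:x + block].astype(np.int64)
            sad = int(np.abs(ref - cand).sum())  # sum of absolute differences
            if best_sad is None or sad < best_sad:
                best_sad, best_vec = sad, (dy, dx)
    # The relative shift that minimizes the SAD is the block's motion vector.
    return best_vec
```

In a real detector this search is repeated for every detection block (e.g. the 8 x 8 grid described later), giving one vector per block.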

JP 2006-317848 A

In the conventional technology, a correct motion vector may not be detected, or a vector with low reliability may be detected, depending on imaging conditions such as low contrast. In a panning shot of a person, motion vectors with low reliability may be detected because the movement of the limbs is irregular. If the motion vector of the whole body including the limbs is used and the subject angular velocity is consequently calculated erroneously, the subject may be overcorrected and subject shake may become noticeable.
An object of the present invention is to increase the certainty of a panning shot of a subject person or the like.

  An apparatus according to an embodiment of the present invention is an image blur correction apparatus that corrects image blur by a correction unit. It comprises a calculation unit that, when a specific mode is set, acquires a shake detection signal of an imaging apparatus or a lens apparatus and the movement amount of an image within the shooting screen, acquires the face detection position of the subject, calculates the movement amount of the subject, and calculates angular velocity data indicating the movement of the subject relative to the imaging apparatus; and a control unit that controls the correction unit using the calculated angular velocity data. When the face detection position is acquired, the calculation unit performs a first calculation process that calculates the movement amount of the subject within a detection range based on the face detection position; when the face detection position is not acquired, it performs a second calculation process that calculates the movement amount of the subject within a detection range based on the focus detection frame set in the shooting screen.
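The switch between the first and second calculation processes can be sketched as follows; the function and parameter names are hypothetical, chosen only to mirror the claim wording:

```python
def subject_vector_detection_center(face_position, focus_frame_position):
    """Return the center of the motion-vector detection range.
    First calculation process: use the face detection position when one
    was acquired. Second calculation process: fall back to the focus
    detection frame set in the shooting screen."""
    if face_position is not None:
        return face_position          # first calculation process
    return focus_frame_position       # second calculation process
```

The subject vector is then taken from the motion vector group inside the detection range centered on the returned position.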

  According to the present invention, the certainty of a panning shot of a subject person or the like can be improved.

FIG. 1 is an overall configuration diagram of an imaging apparatus according to an embodiment of the present invention. FIG. 2 is a control block diagram of the shake correction system in the embodiment. FIG. 3 is a diagram explaining motion vector detection during panning. FIG. 4 is a diagram showing subject vector detection during panning. FIG. 5 is a diagram explaining processing during a panning shot of a person in the embodiment. FIG. 6 is a diagram explaining processing during a panning shot of a subject other than a person in the embodiment. FIG. 7 is a flowchart explaining processing during panning in the embodiment. FIG. 8 is a flowchart of the subject angular velocity calculation processing in the embodiment. FIG. 9 is a flowchart of the shutter speed calculation processing in the embodiment.

  Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The present invention can be applied to imaging apparatuses equipped with an image blur correction apparatus, such as compact digital cameras, video cameras, surveillance cameras, and Web cameras. In this specification, the technique of supporting the user's panning by suppressing, through movement of a movable optical member, the difference between the moving speed of the subject and the panning speed (or tilting speed) is called panning assist, and the mode in which panning assist is enabled is called the panning assist mode. Any comparable specific mode may be used.

  FIG. 1 is a block diagram illustrating a configuration example of an imaging apparatus according to an embodiment of the present invention. In the present embodiment, an imaging system in which a lens device (interchangeable lens 100) can be attached to the camera body 131 will be described as an example. However, the present invention can also be applied to an imaging apparatus in which the lens is integrated with the camera body.

  The interchangeable lens 100 includes a photographing lens unit 101. The photographing lens unit 101 includes a main imaging optical system 102, a zoom lens group 103 that can change the focal length, and a shift lens group 104. The shift lens group 104 is a movable optical member that corrects image shake relative to the optical axis caused by shake of the imaging apparatus; it optically corrects image blur by moving in a direction perpendicular to the optical axis of the photographing lens unit 101. The zoom encoder 106 detects the position of the zoom lens group (hereinafter, zoom lens) 103. The position detection unit 105 detects the position of the shift lens group (hereinafter, shift lens) 104.

  The interchangeable lens 100 includes an angular velocity sensor 109. The angular velocity sensor 109 is an example of a detection unit that detects a shake and outputs a shake detection signal. A shake detection signal (angular velocity detection signal) from the angular velocity sensor 109 is acquired and processed by the lens control unit 110. The lens control unit 110 includes a lens system control microcomputer and controls each unit in the interchangeable lens 100. A drive unit 107 that performs image blur correction drives the shift lens 104 in accordance with a control command from the lens control unit 110. The amplifier circuit 108 amplifies the output of the position detection unit 105 and outputs a position detection signal to the lens control unit 110. The interchangeable lens 100 includes a mount contact portion 113 with the camera body 131.

  The lens control unit 110 includes a camera shake correction control unit 111 that performs camera shake correction control and a panning control unit 112 that performs control for the panning assist mode. In FIG. 1, the internal processing of the lens control unit 110 is expressed as the functional blocks of the control units 111 and 112. The lens control unit 110 also performs focus adjustment control by driving the focus lens, aperture control, and the like, which are omitted for simplicity of illustration. In actual camera shake correction, shake detection and image blur correction are performed on two orthogonal axes, for example the vertical and horizontal directions relative to the camera posture. Since the two axes have the same configuration except for the difference in detection direction, only one axis will be described below. The imaging apparatus of the present embodiment employs optical image blur correction, and includes an image blur correction device that performs correction by driving the movable optical member in a direction different from the optical axis direction (for example, an orthogonal direction).

  The camera body 131 includes a shutter 114 that adjusts the exposure time and an image sensor 115 such as a CMOS (complementary metal oxide semiconductor) sensor. An analog signal processing circuit (AFE) 116 processes the output of the image sensor 115 and outputs it to the camera signal processing circuit 117. The camera signal processing circuit 117 includes a motion vector detection unit 118 that performs image motion detection. The motion vector detection unit 118 acquires image signals at different shooting times output from the image sensor 115, and detects the movement of the subject or background within the angle of view. The camera body 131 includes a timing generator (TG) 120 and sets operation timings of the image sensor 115 and the analog signal processing circuit 116. The operation unit 130 of the camera body 131 includes a power switch, a release switch, and the like.

  The camera main body 131 includes a camera control unit 122. The camera control unit 122 includes a microcomputer that controls the entire camera system, and performs various controls by reading a predetermined program from the memory and executing it. The camera control unit 122 includes a shutter control unit 125, a subject angular velocity calculation unit 126 that calculates the angular velocity of the main subject, and a panning shutter speed calculation unit 127. In FIG. 1, internal processing of the camera control unit 122 is expressed by functional blocks. The driver 121 drives the shutter driving motor 119 in accordance with a control command from the camera control unit 122. The memory card 128 is a recording medium for recording video data after shooting. The display unit 129 includes, for example, a liquid crystal panel (LCD), monitors an image at the time of shooting, and displays the shot image on a screen. The camera main body 131 includes a mount contact part 123 with the interchangeable lens 100. The lens control unit 110 and the camera control unit 122 perform serial communication at a predetermined timing via the mount contact units 113 and 123.

  An outline of the operation of the imaging apparatus in FIG. 1 will be described. When the user attaches the interchangeable lens 100 to the camera main body 131 and turns on the power of the imaging apparatus using the operation unit 130, the camera control unit 122 detects the change in state. The camera control unit 122 supplies power to each circuit of the camera main body 131 and performs initial settings. In addition, power is supplied to the interchangeable lens 100, and the lens control unit 110 performs initial setting of the interchangeable lens 100. Then, communication is started between the lens control unit 110 and the camera control unit 122 at a predetermined timing. For example, information such as the state of the imaging device and shooting settings is transmitted from the camera control unit 122 to the lens control unit 110 by communication. Further, focal length information, angular velocity information, and the like of the interchangeable lens 100 are transmitted from the lens control unit 110 to the camera control unit 122.

  When the panning assist mode is not selected by a user operation using the operation unit 130, a camera shake correction operation is performed. That is, in the interchangeable lens 100, the angular velocity sensor 109 detects a shake applied to the camera due to a hand shake or the like. The camera shake correction control unit 111 acquires a shake detection signal and performs drive control of the shift lens 104 to perform camera shake correction. Hereinafter, the shake correction function of the imaging apparatus will be described.

  FIG. 2 is a configuration diagram relating to the shake correction operation. Components that are the same as those in FIG. 1 are denoted by the same reference numerals, and their description is omitted. Units 201 to 208 in FIG. 2 show the detailed components of the camera shake correction control unit 111. The offset removal unit 201 is a filter calculation unit configured with, for example, a high-pass filter, and removes the direct current component included in the output of the angular velocity sensor 109. The gain phase calculation unit 202 includes an amplifier that amplifies the angular velocity data, from which the DC component has been removed, with a predetermined gain, and a phase compensation filter. The integrator 203, whose characteristics can be changed in an arbitrary frequency band, integrates the output of the gain phase calculation unit 202 and calculates the driving amount of the shift lens 104.
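The offset removal and integration stages can be sketched numerically. The following is a minimal illustration; the first-order filter, cutoff frequency, and sampling rate are assumptions, and the gain/phase compensation stage (202) is omitted:

```python
import numpy as np

def remove_offset(gyro, fc=0.1, fs=1000.0):
    """First-order IIR high-pass filter: removes the DC offset that a
    real angular velocity sensor superimposes on its output.
    fc: cutoff frequency [Hz], fs: sampling frequency [Hz]."""
    alpha = 1.0 / (1.0 + 2.0 * np.pi * fc / fs)
    out = np.zeros(len(gyro))
    prev_in = prev_out = 0.0
    for i, x in enumerate(gyro):
        out[i] = alpha * (prev_out + x - prev_in)  # y[n] = a*(y[n-1] + x[n] - x[n-1])
        prev_in, prev_out = x, out[i]
    return out

def integrate(angular_velocity, fs=1000.0):
    """Integrate angular velocity [deg/s] into angle [deg], the quantity
    from which the shift-lens driving amount is derived."""
    return np.cumsum(angular_velocity) / fs
```

Feeding the filter a constant (pure offset) input drives its output toward zero, while a genuine shake signal passes through and is accumulated by the integrator.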

  The image stabilization (image blur correction) control determination unit 204 switches a control signal for driving the shift lens 104 according to the output of the camera information acquisition unit 225. When the panning assist mode is set, the image stabilization control determination unit 204 employs the output of the integrator 224 calculated by the panning control unit 112. When a mode other than the panning assist mode is set, the image stabilization control determination unit 204 employs the output of the integrator 203 calculated by the camera shake correction control unit 111.

  The output of the position detection unit 105 of the shift lens 104 is amplified by the amplifier circuit 108 and then converted into digital data by the A/D converter 206. This digital data is sent to the subtraction unit 205 as a negative input. The subtraction unit 205 takes the output of the image stabilization control determination unit 204 as a positive input, subtracts the digital data from the A/D converter 206, and outputs the subtraction result (deviation data) to the controller 207. The controller 207 includes an amplifier that amplifies input data with a predetermined gain, and a phase compensation filter. The controller 207 performs signal processing on the deviation data using the amplifier and the phase compensation filter, and outputs the result to the pulse width modulation (PWM) unit 208. The pulse width modulation unit 208 modulates the output data of the controller 207 into a PWM waveform that changes the duty ratio of a pulse wave, and supplies it to the drive unit 107 of the shake correction system. The drive unit 107 is a voice coil type motor that drives the shift lens 104, and moves the shift lens 104 in a direction perpendicular to the optical axis according to the output of the pulse width modulation unit 208.
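One control cycle of this position feedback path (subtraction unit 205, controller 207, PWM unit 208) can be sketched as follows. The gain and full-scale values are illustrative, and the phase compensation filter of the real controller is omitted:

```python
def shake_correction_step(target_pos, measured_pos, gain=0.8,
                          pwm_full_scale=1.0):
    """One cycle: subtract the measured lens position (A/D value) from
    the target position, amplify the deviation, and convert the result
    into a signed PWM duty ratio for the voice-coil driver."""
    deviation = target_pos - measured_pos                  # subtraction unit 205
    control = gain * deviation                             # controller 207 (amplifier only)
    duty = max(-1.0, min(1.0, control / pwm_full_scale))   # PWM unit 208, clipped
    return duty
```

Iterating this step while the measured position tracks the commanded duty closes the loop that holds the shift lens at the target position.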

  Next, the panning control unit 112 will be described in detail. When the user performs an operation to set the panning assist mode using the operation unit 130, the camera control unit 122 switches to panning assist control. The setting information of the panning assist mode is transmitted from the camera control unit 122 to the lens control unit 110, and the lens control unit 110 shifts to control for the panning assist mode. The camera information acquisition unit 225 acquires the setting information and release information for the panning assist mode. The angular velocity output unit 211 acquires the output of the offset removal unit 201 and outputs the angular velocity data of the angular velocity sensor 109 in the interchangeable lens 100 to the camera control unit 122. The subject angular velocity acquisition unit 222 acquires the angular velocity data of the main subject calculated by the subject angular velocity calculation unit 126 in the camera body 131, described later. This angular velocity data is acquired via the mount contact portion 113 and the communication control unit 210. The subtraction unit 223 performs subtraction with the output of the offset removal unit 201 as a positive input and the output of the subject angular velocity acquisition unit 222 as a negative input. That is, the subtraction unit 223 calculates the difference between the angular velocity data detected in the interchangeable lens 100 and the angular velocity data of the main subject detected in the camera body 131, and outputs the difference (deviation data) to the integrator 224. The integrator 224 integrates the deviation data and outputs the integrated data to the image stabilization control determination unit 204.
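The subtraction-and-integration path of the panning control unit can be sketched as follows; the sampling rate and units are illustrative assumptions:

```python
def panning_assist_target(gyro_samples, subject_angular_velocity, fs=1000.0):
    """Sketch of the panning-assist path: the main subject's angular
    velocity (computed on the camera body) is subtracted from each
    offset-removed gyro sample (subtraction unit 223), and the deviation
    is integrated (integrator 224) into the shift-lens target angle."""
    target = 0.0
    for w in gyro_samples:          # w: camera angular velocity [deg/s]
        deviation = w - subject_angular_velocity  # camera pan minus subject motion
        target += deviation / fs                  # discrete-time integration
    return target
```

When the photographer pans at exactly the subject's angular velocity the deviation is zero and the lens stays centered; any mismatch accumulates into a correction target, which is the quantity handed to the image stabilization control determination unit.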

  Here, a method of calculating the angular velocity of the main subject will be described. In the camera body 131 in which the panning assist mode is set, the motion vector detection unit 118 in the camera signal processing circuit 117 detects and outputs the motion vector of the main subject from the captured video information. At the same time, the camera control unit 122 receives angular velocity data detected by the angular velocity sensor 109 in the interchangeable lens 100 from the lens control unit 110.

  In panning, two types of vectors are obtained as the motion vector output from the motion vector detection unit 118: the vector of the main subject that the photographer intends to shoot and the vector of the flowing background. In this case, since the purpose is panning, the main subject vector is adopted from the two types of detected motion vectors. A method of adopting the main subject vector will be described with reference to FIGS. 3 and 4. FIG. 3 shows an example of an image of a panning shooting scene. In this example, the motion vector detection blocks 302 arranged on the screen are arranged in 8 rows and 8 columns. In each detection block, the amount of motion between the previous frame image and the current frame image is detected, and the vector of the subject 301 and the background vector are detected.

  FIG. 4 is a histogram of the motion vectors detected by the motion vector detection unit 118. The horizontal axis represents the amount of movement (unit: pixels), and the vertical axis represents the frequency. In the present embodiment, the angular velocity data output from the angular velocity sensor 109 is used to accurately distinguish between the subject vector and the background vector. If the photographer catches the subject well with the imaging apparatus and follows it closely, the subject vector exists in the vicinity of 0 pix. However, for a photographer unaccustomed to such shooting, the amount of movement of the subject increases and the subject vector moves away from 0 pix, making it difficult to distinguish between the subject vector and the background vector. Therefore, in this embodiment, the angular velocity data output from the angular velocity sensor 109 is converted into the image plane movement amount 403. Focal length data and frame rate data of the imaging optical system are used for the conversion process. A group of vectors existing within a certain range (background range) 404 centered on the value of the image plane movement amount 403 is determined to be the background vector 402. A vector group existing outside the certain range 404 is determined to be the subject vector 401. If there are a plurality of subjects in the imaging screen, there are a plurality of subject vectors. In this case, the subject vector closest to the camera focus frame (see the focus detection frame 602 in FIG. 5) is adopted. The reason is that the photographer always sets the focus frame on the main subject to be photographed, regardless of panning. The value of the subject vector thus determined is taken as the amount of movement of the main subject on the image plane.
In this embodiment, the angular velocity data used for the histogram may be angular velocity data output from the angular velocity sensor 124 in the camera body 131.
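As an illustrative sketch of this separation (hypothetical function and parameter names; the pixel pitch and background range are assumed values, not taken from the embodiment), the camera angular velocity can be converted into an expected image-plane movement per frame, and vectors near that value treated as background:

```python
import math

def classify_vectors(vectors_px, omega_dps, focal_mm, fps, pixel_pitch_mm,
                     bg_range_px=2.0):
    """Split detected motion vectors into subject and background groups.

    The camera angular velocity (deg/s) is converted into an expected
    image-plane movement per frame (pixels); vectors within the background
    range 404 around that value are treated as background vectors 402,
    the rest as subject vector candidates 401.
    """
    omega_rad = math.radians(omega_dps)
    # Image-plane movement per frame: f * tan(omega / fps), in pixels.
    plane_move_px = focal_mm * math.tan(omega_rad / fps) / pixel_pitch_mm
    subject, background = [], []
    for v in vectors_px:
        if abs(v - plane_move_px) <= bg_range_px:
            background.append(v)
        else:
            subject.append(v)
    return subject, background
```

For example, with the camera held still (0 deg/s) a vector near 0 pix is classified as background and a large vector as a subject candidate.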

  With reference to FIG. 5, motion vector processing during a panning shot of a person will be described. FIG. 5 shows an example of an image relating to a panning shot of a person whose entire body is within the angle of view. The horizontal direction in FIG. 5 is defined as the X direction, and the vertical direction as the Y direction. For the motion vector detection blocks 302, the upper left corner is the reference position; columns increase in the positive (right) X direction, and rows increase in the negative (down) Y direction. A subject 601a and a subject 601b exist within the angle of view, and both are persons. The subject 601a is in front of the subject 601b (on the camera side). Assume that the photographer sets the focus detection frame 602 (for example, a one-point AF frame) on the subject 601a, with the subject 601a as the main subject.

  In the present embodiment, a face detection search range (hereinafter simply referred to as a search range) 604 is set within a certain range from the center position of the focus detection frame 602. The camera control unit 122 performs face detection processing of the subject (a person, an animal, or the like) within the search range 604, and changes the subject vector integration method based on the face detection result. In a panning shot of a person, the movement of the limbs of the subject 601a is irregular, and the detected motion vector may be a low-reliability vector. On the other hand, the parts where a highly reliable motion vector can be detected are the face and torso, where the motion is stable. Therefore, in the present embodiment, a face is detected as a part where a highly reliable motion vector can be detected in panning of a person, and a subject vector near the face image is used.

  A specific example of the search range 604 will be described with reference to FIG. 5. The search range 604 is set as a rectangular range based on the center position of the focus detection frame 602, with the Y direction as the long side direction and the X direction as the short side direction. For example, a range corresponding to 80% of the angle of view in the Y direction, in which focus detection is possible, and 40% of the angle of view in the X direction is set as the search range 604. When the photographer holds the camera vertically (vertical holding state), the search range is set with the Y direction as the short side direction and the X direction as the long side direction. For the determination of the camera posture, detection information from an acceleration sensor (not shown) in the camera body 131 is used. From the outputs of the three axes (X axis, Y axis, and Z axis) of the acceleration sensor, the axis affected by the direction of gravity can be determined. For example, when the user shoots with the camera held sideways, the Y-axis output is affected by gravity.
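The search-range geometry described above can be sketched as follows (a hypothetical helper; the 80%/40% ratios are the example values from the text, and the coordinate convention is assumed):

```python
def search_range(center_x, center_y, width, height, landscape=True,
                 long_ratio=0.8, short_ratio=0.4):
    """Rectangular face-detection search range around the focus frame center.

    In the landscape (horizontal) holding state, the Y direction is the
    long side (80% of the angle of view) and X the short side (40%);
    in the vertical holding state the long and short sides swap.
    Returns (x0, y0, x1, y1) in the same units as width/height.
    """
    if landscape:
        half_w, half_h = width * short_ratio / 2, height * long_ratio / 2
    else:
        half_w, half_h = width * long_ratio / 2, height * short_ratio / 2
    return (center_x - half_w, center_y - half_h,
            center_x + half_w, center_y + half_h)
```

For a 100x100 angle of view centered at the origin, the landscape range spans 40 units in X and 80 units in Y.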

  The camera control unit 122 uses the focus detection frame 602 when setting the search range 604. That is, even when a plurality of subjects 601a and 601b exist within the angle of view, the photographer aligns the focus detection frame 602 with the subject 601a, the main subject whose movement is to be stopped. For this reason, the motion vector of the subject 601a, not that of the subject 601b, can be detected using the focus detection frame 602.

  Next, the subject vector integration method will be described in detail. Assume that a face detection position 603 exists within the search range 604 of FIG. 5. In this case, the detection block closest to the face detection position 603 is at the third row, fourth column. Using this detection block as a starting block, the detection blocks located above, below, to the left, and to the right of it, namely the second row, fourth column (upper), the fourth row, fourth column (lower), the third row, third column (left), and the third row, fifth column (right), are the integration targets. That is, the detection blocks adjacent to the starting block in the cross direction are selected. However, only a detection block whose motion vector value is within a predetermined threshold of the motion vector value of the starting block (third row, fourth column) is an integration target. That is, a detection block can be integrated when the value of its motion vector is within the motion vector of the starting block (third row, fourth column) plus or minus a threshold (for example, 3 pixels). The reason is to prevent erroneous calculation of the subject vector. For example, in the detection blocks at the third row, third column (left of the starting block) and the third row, fifth column (right of the starting block), when the face of the subject 601a is small, the face area may not overlap the detection block, and the block may integrate the movement amount of the background. In such a case, a detection block whose motion vector value is not within the predetermined threshold of the motion vector value of the starting block is excluded from the integration targets.
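A minimal sketch of this cross-direction integration with a threshold (hypothetical names; the grid layout and 3-pixel threshold follow the example in the text):

```python
def integrate_cross(grid, start_row, start_col, thresh=3.0):
    """Integrate the starting block and its up/down/left/right neighbours.

    A neighbour contributes only if its motion vector is within +/-thresh
    pixels of the starting block's vector, which excludes blocks that
    straddle the background. `grid` is a dict {(row, col): vector_px}.
    Returns (averaged subject vector, number of blocks integrated).
    """
    base = grid[(start_row, start_col)]
    total, count = base, 1
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # cross direction
        v = grid.get((start_row + dr, start_col + dc))
        if v is not None and abs(v - base) <= thresh:
            total += v
            count += 1
    return total / count, count
```

In the FIG. 5 example, a left neighbour carrying a background movement amount (far from the starting block's vector) is simply dropped from the integration.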

  Under shooting conditions such as low contrast, a motion vector may be erroneously detected, and the detection block then yields a low-reliability vector result. Therefore, in this embodiment, such a block is not included in the subject vector integration targets. Further, since the number of detection blocks covering the face area of the subject is very small relative to the total number of detection blocks, the first threshold (for example, 1) for determining the subject vector of a person is set smaller than the second threshold (for example, 4) used for cases other than a person.

  With reference to FIG. 6, panning of a moving body (an automobile) other than a person will be described. The left-right direction in FIG. 6 is defined as the X direction, and the up-down direction as the Y direction. In a panning shot of a moving body (an automobile), the movement of the subject 701 is a stable movement in a certain direction. Since the subject 701 overlaps many detection blocks 302, a large number of subject vectors can be detected. Therefore, the detection block (fifth row, seventh column) closest to the position of the in-focus focus detection frame 602 is selected as the starting block. The detection blocks are searched and integrated concentrically around the starting block. However, when the contrast is low, there may be a detection block indicating a low-reliability motion vector result. In this case, in order to reduce the possibility of erroneous detection of the motion vector, the detection result of that block is not included in the subject vector integration targets.
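The concentric search can be sketched as below (a hypothetical helper; square Chebyshev-distance rings are used here as one way to realize a "concentric" search, and None models a low-reliability block that is skipped):

```python
def integrate_concentric(grid, start_row, start_col, max_radius=2):
    """Integrate motion vectors in rings of increasing distance from the
    starting block (the block nearest the in-focus frame).

    Blocks whose value is None model low-reliability results (e.g. low
    contrast) and are skipped to avoid erroneous detection. `grid` maps
    (row, col) to a vector value in pixels, or None.
    """
    total, count = 0.0, 0
    for radius in range(max_radius + 1):
        for (r, c), v in grid.items():
            # Chebyshev distance gives square "concentric" rings.
            if max(abs(r - start_row), abs(c - start_col)) == radius \
                    and v is not None:
                total += v
                count += 1
    return (total / count if count else 0.0), count
```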

  Thus, in this embodiment, the main subject vector (motion vector) intended by the user can be calculated by combining the face detection result and the search range 604. For example, even when a plurality of subjects 601a and 601b (FIG. 5) exist in the shooting screen, the main subject vector can be calculated from vectors near the face image of the subject 601a whose movement the photographer wants to stop. From the calculated main subject vector, the subject angular velocity can be calculated by the reverse of the procedure used to convert the angular velocity data into the image plane movement amount.
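The reverse conversion mentioned above might look like this (hypothetical names; a small-angle pinhole model with an assumed pixel pitch, not the embodiment's exact arithmetic):

```python
import math

def subject_angular_velocity(vector_px, focal_mm, fps, pixel_pitch_mm):
    """Convert a subject vector (pixels per frame) back into an angular
    velocity (deg/s): the inverse of converting angular velocity data
    into the image plane movement amount via focal length and frame rate.
    """
    angle_rad = math.atan(vector_px * pixel_pitch_mm / focal_mm)
    return math.degrees(angle_rad * fps)
```

A zero subject vector maps back to 0 deg/s, and larger vectors map monotonically to larger subject angular velocities.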

  FIG. 7 to FIG. 9 are flowcharts for explaining processing when the panning assist mode is set in this embodiment. The following processing is realized by the CPU of the camera control unit 122 and the lens control unit 110 reading out and executing a program from the memory.

First, the process of FIG. 7 will be described.
(S501) Mode Determination Processing The camera control unit 122 determines whether the photographer has performed an operation for setting the panning assist mode using the operation unit 130. If the panning assist mode is set, the process proceeds to S502. If a setting other than the panning assist mode is set, the process ends without controlling the panning assist mode.

(S502) Lens Determination Processing The camera control unit 122 determines whether or not the interchangeable lens 100 attached to the camera body 131 is an interchangeable lens compatible with the panning assist mode. If the interchangeable lens is compatible with the panning assist mode, the process proceeds to S503. If it is not, the process proceeds to S506. Note that whether or not the interchangeable lens 100 is compatible with the panning assist mode is determined using a signal transmitted from the lens control unit 110 to the camera control unit 122.

(S503) Motion Vector Detection Processing The motion vector detection unit 118 detects the amount of motion of the image in the shooting screen as a motion vector, and outputs the detection result to the camera control unit 122.

(S504) Image Plane Movement Amount Calculation Processing The camera control unit 122 calculates the amount of movement on the image plane using the angular velocity data obtained by the angular velocity sensor 109 in the interchangeable lens 100 and the focal length and frame rate data of the imaging optical system (see FIG. 4).

(S505) Subject Angular Velocity Calculation Process The subject angular velocity calculation unit 126 calculates the subject angular velocity. Details of the calculation process will be described in S508 to S521 of FIG. 8.

(S506) Shutter Speed Calculation Processing The panning shutter speed calculation unit 127 calculates the shutter speed for panning assist. Details of the calculation process will be described in S522 to S529 of FIG. 9.

(S507) Shift Lens Drive Control The camera control unit 122 determines the driving amount for driving the shift lens 104 during the exposure period of the image sensor 115 from the subject angular velocity calculated in S505 and the panning assist shutter speed calculated in S506. The lens control unit 110 acquires the determined driving amount and controls the driving of the shift lens 104.

Next, the subject angular velocity calculation process (S505) in FIG. 7 will be described with reference to FIG.
(S508) Histogram Generation Processing The subject angular velocity calculation unit 126 performs calculations related to histogram generation of all motion vectors detected in S503.

(S509) Camera posture position acquisition The subject angular velocity calculation unit 126 acquires detection information of the camera posture position. In the camera posture determination process, an axis affected by the direction of gravity is detected from the outputs of the three axes (X axis, Y axis, and Z axis) of the acceleration sensor provided in the camera body 131. It can be determined how the user holds the camera based on the output of the acceleration sensor.
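The gravity-axis determination in S509 can be sketched as follows (a hypothetical helper; it simply picks the accelerometer axis with the largest magnitude as the one affected by gravity):

```python
def camera_posture(ax, ay, az):
    """Determine how the camera is held from 3-axis accelerometer output.

    The axis with the largest magnitude is the one affected by gravity;
    for example, the Y axis dominates when shooting with the camera held
    sideways. Inputs are accelerations on the X, Y, and Z axes.
    """
    mags = {'x': abs(ax), 'y': abs(ay), 'z': abs(az)}
    return max(mags, key=mags.get)
```

The returned axis then selects which side of the search range is treated as the long side.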

(S510) Setting of Search Range The subject angular velocity calculation unit 126 sets the search range 604 according to the camera posture position acquired in S509 with reference to the center position of the focus detection frame 602.

(S511) Face Detection Determination The subject angular velocity calculation unit 126 determines whether or not a face detection result exists within the search range 604 set in S510. If a face detection result exists in the search range 604, the process proceeds to S512, and if a face detection result does not exist in the search range 604, the process proceeds to S514.

(S512) Acquisition of Face Detection Position Since a face detection result exists within the search range 604, the subject angular velocity calculation unit 126 acquires the face detection position of the subject person in order to calculate the subject angular velocity using motion vectors near the detected face.

(S513) Subject vector calculation 1 (first calculation process)
The subject angular velocity calculation unit 126 takes the detection block at the face detection position acquired in S512, or the detection block closest to that position, as the starting point, and targets the detection blocks in the vertical and horizontal directions (cross direction) for integration. However, a motion vector within the detection range is integrated into the subject vector only when it is within a predetermined threshold (for example, plus or minus 3 pixels) of the motion vector of the detection block closest to the face detection position. The number of vectors to be integrated (first integration number) is A1.

(S514) Subject Vector Detectability Determination The subject angular velocity calculation unit 126 determines whether or not the subject vector can be detected. If it is determined that the subject vector can be detected, the process proceeds to S515. If the subject vector cannot be detected, the process proceeds to S518. Here, as the detection criterion, the subject vector is determined to be detectable when the frequency of the subject vector shown in FIG. 4 is equal to or greater than a predetermined threshold (for example, a frequency of 4). If the frequency of the subject vector is less than the predetermined threshold, it is determined that the subject vector cannot be detected.

(S515) Subject vector calculation 2 (second calculation process)
The subject angular velocity calculation unit 126 calculates the subject vector by integrating the motion vectors in the detection blocks (detection range) concentrically, starting from the detection block closest to the focus detection frame 602. The number of motion vectors to be integrated (second integration number) is A2, and A2 is larger than A1 in S513.

(S516) Subject Angular Velocity Calculation The subject angular velocity calculation unit 126 calculates the subject angular velocity from the subject vector calculated in S513 or S515. The subject vector is converted into the subject angular velocity by the reverse of the procedure used in S504, in which the angular velocity data output from the angular velocity sensor 109 is converted into the image plane movement amount using the focal length and frame rate data of the imaging optical system.

(S517) Subject shake correction amount calculation The subtraction unit 223 calculates the difference between the subject angular velocity data transmitted from the camera control unit 122 to the lens control unit 110 and the angular velocity data of the angular velocity sensor 109 in the interchangeable lens 100 (the output of the offset removal unit 201). The calculated difference is integrated by the integrator 224. Thereby, a target control value (subject shake correction amount) for the shake correction control is calculated.
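A minimal sketch of S517 (hypothetical class name; dt is the control period, an assumed parameter):

```python
class SubjectShakeCorrection:
    """Models the subtraction unit 223 and integrator 224: integrates the
    difference between the lens gyro output and the subject angular
    velocity to produce a target control value (subject shake correction
    amount)."""

    def __init__(self):
        self.integrated = 0.0

    def update(self, gyro_dps, subject_dps, dt):
        deviation = gyro_dps - subject_dps  # subtraction unit 223
        self.integrated += deviation * dt   # integrator 224
        return self.integrated
```

Calling update() once per control period accumulates the deviation into the correction target, so a constant deviation produces a linearly growing correction amount.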

(S518) Angular Velocity Acquisition In this case, since the subject vector has not been detected in S514, the lens control unit 110 switches the control of the shift lens 104 to the camera shake correction control. A process of acquiring angular velocity data that is an output of the angular velocity sensor 109 in the interchangeable lens 100 is performed.

(S519) Offset removal The offset removal unit 201 includes a high-pass filter whose characteristics can be changed in an arbitrary frequency band. The DC component superimposed on the angular velocity data is removed by blocking the low-frequency components contained in the angular velocity data and outputting the high-frequency band signal.

(S520) Gain/Phase Calculation In the gain/phase calculation unit 202, an amplifier amplifies the angular velocity data from which the offset was removed in S519 with a predetermined gain, and a phase compensation filter performs signal processing.

(S521) Camera shake correction amount calculation The angular velocity data subjected to the signal processing in S520 is integrated by the integrator 203, and a target control value (camera shake correction amount) of the camera shake correction control is calculated.
After (S517) or (S521), the process proceeds to return processing.

Next, the shutter speed calculation process (S506) in FIG. 7 will be described with reference to FIG.
(S522) Acquisition of Background Flow Amount The shutter speed calculation unit 127 acquires a set value of the background flow amount set by the photographer using the operation unit 130.

(S523) Acquisition of Focal Length The camera control unit 122 acquires the focal length data transmitted from the lens control unit 110 in the interchangeable lens 100 via the mount contact units 113 and 123.

(S524) Determination of Presence/Absence of Angular Velocity Sensor in Camera Body The camera control unit 122 determines whether or not the angular velocity sensor 124 is mounted in the camera body 131. If the angular velocity sensor 124 is mounted in the camera body 131, the process proceeds to S525. If not, the process proceeds to S526.

(S525) Acquisition of Angular Velocity in Camera Body Angular velocity data is acquired by the angular velocity sensor 124 in the camera body 131. Next, the process proceeds to S528.

(S526) Correction Lens Mounting Determination The camera control unit 122 determines whether the shift lens 104 is mounted in the interchangeable lens 100 based on information acquired from the lens control unit 110. If the shift lens 104, which is an image blur correction lens, is mounted in the interchangeable lens 100, the process proceeds to S527. If the shift lens 104 is not mounted in the interchangeable lens 100, it is determined that an angular velocity sensor is mounted in neither the interchangeable lens 100 nor the camera body 131, and the process proceeds to return processing.

(S527) Obtaining Angular Velocity in Interchangeable Lens The fact that the shift lens 104 is mounted in the interchangeable lens 100 means that the angular velocity sensor 109 is mounted in the interchangeable lens 100. In this case, angular velocity data is acquired by the angular velocity sensor 109. Next, the process proceeds to S528.

(S528) Subject angular velocity acquisition The subject angular velocity data calculated in S516 is acquired. Note that the subject angular velocity acquired when the subject vector has not been detected in S514 is 0 dps (degree per second).

(S529) Shutter speed calculation The panning shutter speed calculation unit 127 calculates the shutter speed for panning assist based on the equation (1) using each data acquired in S522 to S528.
TV = α / f / (ωg−ωs) (1)
In equation (1), TV is the shutter speed, α is the background flow effect coefficient, f is the focal length, ωg is the camera angular velocity, and ωs is the main subject angular velocity.
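Equation (1) translates directly into code (hypothetical function name; units are kept consistent with the text, deg/s for angular velocities and mm for focal length, and α comes from the photographer's background flow setting):

```python
def panning_shutter_speed(alpha, focal_mm, omega_g_dps, omega_s_dps):
    """Equation (1): TV = alpha / f / (omega_g - omega_s).

    alpha is the background flow effect coefficient, focal_mm the focal
    length, omega_g_dps the camera angular velocity, and omega_s_dps the
    main subject angular velocity (0 when no subject vector is detected).
    """
    return alpha / focal_mm / (omega_g_dps - omega_s_dps)
```

A larger difference between the camera and subject angular velocities yields a shorter shutter speed for the same background flow amount.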

  After S529, the process proceeds to return processing. If the shift lens 104 is not mounted in the interchangeable lens 100 in S526, the background angular velocity may be calculated from the histogram generated in S508 using the background vector, and the shutter speed for panning may be calculated using that value. Alternatively, a value programmed in advance in the panning shutter speed calculation unit 127 (for example, 1/60 seconds) may be set.

  The angular velocity data received from the interchangeable lens 100 corresponds to the panning speed of the camera. Therefore, when the difference between the received angular velocity data and the angular velocity data calculated from the movement amount of the main subject on the image plane and the current focal length of the imaging optical system is calculated, the result indicates the movement of the main subject with respect to the camera. Angular velocity data. The camera control unit 122 transmits the calculated angular velocity data of the main subject to the lens control unit 110. The lens control unit 110 calculates the drive amount of the shift lens 104 according to the setting information of the camera and performs shake correction control.

According to the present embodiment, in panning assist, the success rate of a panning shot of a subject person can be increased by combining the face detection position of the subject with the calculation result of the subject vector.
As described above, although a preferred embodiment of the present invention has been described, the present invention is not limited to this embodiment, and various modifications and changes are possible within the scope of the gist of the invention.

109, 124 Angular velocity sensor 110 Lens control unit 118 Motion vector detection unit 122 Camera control unit

Claims (12)

  1. An image blur correction apparatus that corrects image blur by correction means,
    calculation means for, when a specific mode is set, acquiring a shake detection signal of the imaging apparatus or the lens device, the amount of movement of an image in a shooting screen, and the face detection position of a subject, and calculating the amount of movement of the subject and angular velocity data indicating movement of the subject relative to the imaging apparatus; and
    control means for controlling the correction means based on the calculated angular velocity data,
    wherein the calculation means performs a first calculation process of calculating the amount of movement of the subject within a detection range based on the face detection position when the face detection position is acquired, and a second calculation process of calculating the amount of movement of the subject within a detection range based on the position of a focus detection frame set in the shooting screen when the face detection position is not acquired.
  2.   The image blur correction apparatus according to claim 1, wherein the calculation unit acquires the detection information of the posture of the imaging apparatus and sets a search range related to the face detection position.
  3. The amount of motion of the image is a motion vector acquired from image signals at different shooting times,
    The image blur correction apparatus according to claim 1, wherein the calculation unit calculates a motion amount of the subject by integrating motion vectors in the detection range in the first calculation process.
  4.   The calculating means calculates the motion vector of the subject with the first integration number in the first calculation process, and with a second integration number larger than the first integration number in the second calculation process. The image blur correction apparatus according to claim 3, wherein a motion vector of the subject is calculated.
  5.   The image blur correction apparatus according to claim 3, wherein, in the first calculation process, the calculation means calculates the motion vector of the subject in a case where a motion vector value in the detection range is within a threshold of the value of the motion vector at the face detection position or at the position closest to the face detection position.
  6.   The direction in which the calculation unit integrates the motion vectors in the first calculation process is different from the direction in which the motion vectors are integrated in the second calculation process. Image blur correction device.
  7.   The calculation means integrates motion vectors at positions adjacent in the cross direction with the face detection position as a reference in the first calculation process, and uses the position of the focus detection frame as a reference in the second calculation process. The image blur correction apparatus according to claim 6, wherein motion vectors are integrated within a circular detection range.
  8.   The image blur correction apparatus according to claim 1, wherein the specific mode is a mode that supports panning.
  9.   An imaging device comprising the image blur correction device according to claim 1.
  10. Detecting means for detecting an angular velocity of shake of the imaging device;
    Shutter speed calculating means for calculating the shutter speed from the angular velocity data detected by the detecting means and the angular velocity data of the subject when the specific mode is set;
    The imaging apparatus according to claim 9, further comprising a shutter control unit that performs shutter control based on the calculated shutter speed.
  11. A lens device can be attached to the main body of the imaging device,
    A shutter speed calculating means for calculating a shutter speed from the angular velocity data detected by the detecting means provided in the lens device and the angular velocity data of the subject when the specific mode is set;
    The imaging apparatus according to claim 9, further comprising a shutter control unit that performs shutter control based on the calculated shutter speed.
  12. A control method executed by an image blur correction apparatus that corrects an image blur by a correction unit,
    a calculation step of, when a specific mode is set, acquiring a shake detection signal of the imaging apparatus or the lens device, the amount of movement of an image in a shooting screen, and the face detection position of a subject, and calculating the amount of movement of the subject and angular velocity data indicating movement of the subject relative to the imaging apparatus; and
    a control step of controlling the correction means based on the calculated angular velocity data,
    The calculation step includes
    A first calculation step of calculating a movement amount of a subject in a detection range based on the face detection position when the face detection position is acquired;
    a second calculation step of calculating the amount of movement of the subject within a detection range based on a focus detection frame set in the shooting screen when the face detection position is not acquired.

JP2016198955A 2016-10-07 2016-10-07 Image blur correction device, control method of the same and imaging apparatus Pending JP2018060126A (en)


Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2016198955A JP2018060126A (en) 2016-10-07 2016-10-07 Image blur correction device, control method of the same and imaging apparatus
CN201710911852.XA CN107920200A (en) 2016-10-07 2017-09-29 Image blur compensation device and its control method and camera device
US15/722,158 US10419674B2 (en) 2016-10-07 2017-10-02 Image blur correction device, control method thereof, and imaging device

JP2009115981A (en) Photographing device, its control method, and program
JP2006317848A (en) Still picture imaging apparatus

Legal Events

Date Code Title Description
2019-10-04 A621 Written request for application examination Free format text: JAPANESE INTERMEDIATE CODE: A621