US9692962B2 - Control device, optical apparatus, imaging apparatus, and control method - Google Patents

Control device, optical apparatus, imaging apparatus, and control method

Info

Publication number
US9692962B2
US9692962B2 (application US15/099,810 / US201615099810A)
Authority
US
United States
Prior art keywords
subject
gravity center
photographed image
tracking
control device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/099,810
Other languages
English (en)
Other versions
US20160316137A1 (en)
Inventor
Nobushige Wakamatsu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Assigned to CANON KABUSHIKI KAISHA reassignment CANON KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WAKAMATSU, NOBUSHIGE
Publication of US20160316137A1 publication Critical patent/US20160316137A1/en
Application granted granted Critical
Publication of US9692962B2 publication Critical patent/US9692962B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681 Motion detection
    • H04N23/6811 Motion detection based on the image signal
    • H04N23/6812 Motion detection based on additional sensors, e.g. acceleration sensors
    • H04N23/682 Vibration or motion blur correction
    • H04N23/685 Vibration or motion blur correction performed by mechanical compensation
    • H04N23/687 Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
    • H04N5/23219
    • H04N5/23258
    • H04N5/23264
    • H04N5/23287
    • H04N5/23293

Definitions

  • the present invention relates to a control device, an optical apparatus, an imaging apparatus, and a control method.
  • In imaging apparatuses such as digital cameras, important imaging tasks such as exposure decisions and focus manipulations are fully automated.
  • In imaging apparatuses on which anti-vibration control devices preventing image blur caused by camera shake or the like are mounted, most factors causing photographing mistakes by photographers have been resolved.
  • Imaging apparatuses that have a face detection function or a human body detection function of detecting the faces or bodies of people included in subjects have been proposed.
  • patterns by which the faces of people are determined are decided in advance, and thus portions matching the patterns included in images can be detected as the faces of people.
  • the detected faces of people are referred to, for example, for focus control or exposure control.
  • However, photographing while subjects are moving, or photographing in telephoto states in which the focal distance becomes large, causes the following problems.
  • When a subject is moving and deviates from the photographed image, the photographer must use special techniques, performing manipulations with high precision, in order to track the continuously moving subject.
  • When photographing is performed with a camera including a telephoto lens whose focal distance is large, the influence of image blur caused by camera shake increases. It is therefore difficult to keep the main subject at the center of the photographed image.
  • Even when the photographer manipulates the camera to get the subject back inside the photographed image, the camera shake amounts produced intentionally by the photographer are also subjected to blurring correction. It is therefore difficult, due to the influence of anti-vibration control, to finely position the subject inside the photographed image or at its center.
  • Japanese Patent Laid-Open No. 2010-93362 discloses an imaging apparatus that automatically tracks a subject by moving a part of an optical system in a direction intersecting an optical axis.
  • Japanese Patent Laid-Open No. H7-226873 discloses an imaging apparatus that extracts a target subject from a photographing signal to output the central position of the subject and tracks a subject using a rotary camera platform or the like so that the central position of the subject is output in the vicinity of the center of a photographed image.
  • The present invention provides a device capable of realizing subject tracking such that a plurality of subjects present in a photographed image remain within the photographed image as much as possible.
  • FIG. 1 is a diagram schematically illustrating an imaging apparatus according to an embodiment.
  • FIG. 2 is a diagram illustrating the configuration of the imaging apparatus.
  • FIGS. 3A and 3B are diagrams for describing tracking control on a detected subject.
  • FIG. 4 is a block diagram illustrating an example of a function of a tracking amount calculating unit.
  • FIGS. 5A and 5B are diagrams for describing an example in which the imaging apparatus performs tracking control on one main subject.
  • FIGS. 6A and 6B are diagrams for describing tracking control performed by an imaging apparatus according to a first embodiment.
  • FIG. 7 is a flowchart for describing an example of tracking control on a subject.
  • FIGS. 8A to 8C are diagrams for describing tracking control performed by an imaging apparatus according to a second embodiment.
  • FIG. 1 is a diagram schematically illustrating an imaging apparatus according to an embodiment.
  • FIG. 2 is a diagram illustrating the configuration of the imaging apparatus.
  • a release button 104 is provided on the body of a camera 101 .
  • An opening or closing signal of a switch generated by a manipulation of the release button 104 is transmitted to a CPU 105 .
  • the CPU 105 functions as a control device according to the embodiment.
  • the present invention can be applied to any optical device including the CPU 105 .
  • a correcting lens 114 and an image sensor 106 are located on an optical axis 102 of an imaging optical system.
  • An angular velocity meter 103 is an angular velocity unit that detects an angular shake in a rotation indicated by an arrow 103 p (pitch) and an arrow 103 y (yaw). An output of the angular velocity meter 103 is input to the CPU 105 .
  • A shake correction angle calculating unit 108 calculates a shake correction angle based on the output of the angular velocity meter 103. Specifically, the shake correction angle calculating unit 108 cuts the DC component, which is added to the output of the angular velocity meter 103 as detection noise, then performs an integration process and outputs an angle signal. For cutting the DC component, for example, a high-pass filter (HPF) is used.
  • the output of the shake correction angle calculating unit 108 is input to a sensitivity adjusting unit 109 .
  • The sensitivity adjusting unit 109 amplifies the output of the shake correction angle calculating unit 108 based on zoom and focus position information 107, and on the focal distance or photographing magnification obtained from it, and sets the amplified output as a shake correction target value.
  • The shake correction target value is obtained based on the zoom and focus position information 107 because the shake correction sensitivity on the camera image surface, with respect to the shake correction stroke of the correcting lens 114, changes with optical information such as the focus or zoom of the lens.
  • the sensitivity adjusting unit 109 outputs the shake correction target value as a shake correction amount to a drive controlling unit 113 .
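  • The pipeline above (DC cut by a high-pass filter, integration into an angle, sensitivity scaling from zoom/focus information) can be sketched as follows. This is a minimal illustration, not Canon's implementation; the class name, cutoff frequency, and sample rate are assumptions.

```python
import math

class ShakeCorrectionCalculator:
    """Hypothetical sketch: HPF -> integrate -> sensitivity scaling."""

    def __init__(self, cutoff_hz=0.1, sample_rate_hz=1000.0):
        self.dt = 1.0 / sample_rate_hz
        # Coefficient of a first-order high-pass filter with the given cutoff.
        rc = 1.0 / (2.0 * math.pi * cutoff_hz)
        self.alpha = rc / (rc + self.dt)
        self.prev_in = 0.0
        self.prev_out = 0.0
        self.angle = 0.0

    def update(self, angular_velocity, sensitivity):
        # High-pass filter: cuts the DC component (sensor bias / detection noise).
        hp = self.alpha * (self.prev_out + angular_velocity - self.prev_in)
        self.prev_in, self.prev_out = angular_velocity, hp
        # Integration: angular velocity -> shake angle.
        self.angle += hp * self.dt
        # Sensitivity adjustment derived from zoom/focus (focal length) info.
        return self.angle * sensitivity

calc = ShakeCorrectionCalculator()
# With a pure DC input (constant bias of 0.5), the HPF makes the integrated
# angle converge to a finite value instead of growing without bound.
for _ in range(5000):
    target = calc.update(0.5, sensitivity=1.2)
```

  • Without the DC cut, integrating a constant bias of 0.5 for 5 s would yield an ever-growing angle; with the HPF the shake correction target value settles.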
  • the correcting lens 114 functions as a movable unit that shifts and moves a subject in a photographed image.
  • the drive controlling unit 113 performs drive controlling on the correcting lens 114 and performs subject tracking control.
  • the drive controlling unit 113 performs image blur correcting (optical anti-vibration) by driving the correcting lens 114 in a different direction from the optical axis.
  • In FIG. 2, optical anti-vibration using the correcting lens 114 is adopted.
  • As another method of correcting image blur, the image sensor may be moved in a plane perpendicular to the optical axis.
  • Electronic anti-vibration in which an influence of shake is reduced by changing a starting position of each photographing frame output by the image sensor may be applied.
  • a plurality of image blur correcting methods may be combined.
  • a subject position detecting unit 110 illustrated in FIG. 2 detects the position of a subject (subject position) in a photographed image.
  • a tracking amount calculating unit 111 calculates a tracking correction amount which is a control amount used for the correcting lens 114 to track a subject based on information regarding the detected subject position.
  • An adder 112 adds a shake correction amount output by the sensitivity adjusting unit 109 and the tracking correction amount output by the tracking amount calculating unit 111 and outputs an added amount to the drive controlling unit 113 .
  • the drive controlling unit 113 calculates a drive amount of the correcting lens 114 based on the output from the adder 112 and drives the correcting lens 114 based on the drive amount to track a subject and correct image blur.
  • the image sensor 106 obtains image information by converting light reflected from a subject into an electric signal.
  • the image information is converted into a digital signal.
  • the image information converted into the digital signal is transmitted to the subject position detecting unit 110 .
  • a first method is a method of detecting a person.
  • the subject position detecting unit 110 detects a face or a human body as a subject in the photographed image. In a face detecting process, a pattern determined as the face of a person is decided in advance, and thus a portion matching the pattern included in the image can be detected as the face of the person. Even when a human body is detected, the human body is also detected based on the degree of matching with the pre-decided pattern.
  • the subject position detecting unit 110 calculates reliability indicating a probability that each of the detected subjects is a subject (face).
  • the reliability is calculated from the degree of matching with the size of a face region or a face pattern in the image. That is, the subject position detecting unit 110 functions as a reliability calculating unit that calculates the reliability of a subject based on the size of the subject in the photographed image or the degree of matching between the subject and a pattern of a subject stored in advance.
  • Alternatively, a subject may be recognized by sectioning a distribution derived from a histogram of hue, saturation, or the like of the captured image and classifying the image region by region.
  • For example, a subject is recognized by sectioning a histogram of a plurality of color components generated for a captured image into mountain-shaped distribution ranges and classifying the captured image into regions belonging to combinations of the same sections.
  • The region of the main subject can then be tracked by detecting, in sequentially captured subsequent images, a region similar to a feature amount such as a hue distribution or a size.
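  • The feature-amount tracking described above can be illustrated with a toy hue-histogram comparison. The bin count, the similarity metric (histogram intersection), and all function names here are assumptions for illustration, not the patent's exact algorithm.

```python
def hue_histogram(pixels, bins=8):
    """Normalized histogram of hue values in [0, 360)."""
    hist = [0] * bins
    for h in pixels:
        hist[int(h / 360.0 * bins) % bins] += 1
    total = float(len(pixels))
    return [c / total for c in hist]

def intersection(h1, h2):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# Reference region (reddish hues) and two candidate regions in a later frame.
reference = hue_histogram([5, 10, 350, 355, 8, 12])
candidate_red = hue_histogram([7, 9, 352, 358, 11, 6])
candidate_blue = hue_histogram([200, 210, 220, 230, 215, 225])

# The tracker would follow the candidate whose hue distribution is most
# similar to the stored feature amount of the main subject.
best = max([candidate_red, candidate_blue],
           key=lambda h: intersection(reference, h))
```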
  • the position information of the detected subject is input to the tracking amount calculating unit 111 .
  • the tracking amount calculating unit 111 calculates the tracking correction amount.
  • the tracking amount calculating unit 111 calculates the tracking correction amount so that the gravity center position of the subject is located in the vicinity of the center of the image.
  • the tracking amount calculating unit 111 obtains one gravity center position corresponding to the plurality of subjects and calculates the tracking correction amount so that the obtained gravity center position is located in the vicinity of the center of the image.
  • FIGS. 3A and 3B are diagrams for describing tracking control on a detected subject.
  • FIG. 3A illustrates a photographed image 301 a before start of the subject tracking control.
  • FIG. 3B illustrates a photographed image 301 b after the start of the subject tracking control.
  • a subject 302 a is located at a position distant from an image center 304 .
  • Reference numeral 303 a denotes the gravity center position of a subject 302 a (subject gravity center position).
  • The CPU 105 performs tracking control so that the distance from the subject gravity center position 303a to the image center 304 gradually decreases, and the image center 304 finally substantially matches the subject gravity center position.
  • In FIG. 3B, the subject gravity center position 303a of the subject 302b, for which the tracking is successful, matches the image center 304.
  • FIG. 4 is a block diagram illustrating an example of a function of a tracking amount calculating unit.
  • the tracking amount calculating unit 111 calculates an individual tracking correction amount at each axis in the vertical direction and the horizontal direction of the image. Here, only a single axis will be described.
  • A subtracter 403 subtracts the coordinates of an image middle position 402 from the coordinates of a subject position 401, based on the subject position information output by the subject position detecting unit 110. The distance (center deviation amount) between the image center position and the gravity center position of the subject in the image is thus obtained as signed data in which the image center is 0.
  • The output of the subtracter 403 is input to a count value table 404, and a count value for tracking is calculated based on the magnitude of the difference between the subject gravity center position and the image center. The count value is calculated at each control sampling.
  • Within a predetermined range from the center, the count value is set to 0, providing a dead-band region in which no tracking is performed.
  • Outside this range, the larger the center deviation amount is, the larger the count value becomes.
  • the sign of the count value is calculated in accordance with the sign of the center deviation amount.
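  • An illustrative version of the count value table 404: inside a dead band around the image center the count is 0; outside it, the count grows with the center deviation and carries the deviation's sign. The dead-band width, scale, and cap below are hypothetical constants, not values from the patent.

```python
def count_value(center_deviation, dead_band=20, scale=0.05, max_count=10):
    """Per-sample tracking count from the signed center deviation (pixels)."""
    if abs(center_deviation) <= dead_band:
        return 0  # dead-band region: no tracking near the image center
    # Magnitude grows with the deviation, capped at max_count.
    magnitude = min((abs(center_deviation) - dead_band) * scale, max_count)
    # The sign of the count follows the sign of the center deviation.
    return magnitude if center_deviation > 0 else -magnitude

print(count_value(10))    # inside the dead band
print(count_value(120))   # positive deviation
print(count_value(-120))  # negative deviation
```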
  • the output of the count value table 404 is input to a signal selecting unit 406 .
  • An output of a down-count value outputting unit 405 and a state of a tracking switch 407 are also input to the signal selecting unit 406 .
  • When the tracking switch 407 is turned on, the signal selecting unit 406 selects the output of the count value table 404 and outputs it to an adder 408.
  • When the tracking switch 407 is turned off, the signal selecting unit 406 selects the output of the down-count value outputting unit 405 and outputs it to the adder 408.
  • the down-count value outputting unit 405 outputs a down-count value.
  • a previous sampled value 410 of the tracking amount calculated in post-processing is also input to the down-count value outputting unit 405 .
  • the previous sampled value 410 of the tracking amount is a tracking correction amount up to the previous sampling.
  • When the previous sampled value 410 of the tracking amount is positive, the down-count value outputting unit 405 sets the down-count value to a negative value; when the previous sampled value is negative, it sets the down-count value to a positive value, so that the absolute value of the tracking correction amount decreases.
  • When the previous sampled value 410 of the tracking amount is within a predetermined range of 0, the down-count value outputting unit 405 sets the down-count value to 0.
  • the adder 408 adds an output of the signal selecting unit 406 and the previous sampled value 410 of the tracking amount.
  • An output of the adder 408 is input to an upper and lower limit setting unit 409 .
  • the upper and lower limit setting unit 409 sets the tracking correction amount so that the tracking correction amount is not equal to or greater than a predetermined upper limit and is not equal to or less than a predetermined lower limit.
  • An output of the upper and lower limit setting unit 409 is input to a lowpass filter (LPF) 411. The LPF 411 cuts high-frequency subject-detection noise from the output of the upper and lower limit setting unit 409 and outputs the result to a correction lens amount conversion unit 412.
  • the correction lens amount conversion unit 412 converts an output of the LPF 411 into a signal used for the correcting lens 114 to track a subject. Accordingly, a final tracking correction amount is calculated.
  • the tracking correction is performed in such a manner that the gravity center position of the subject is gradually located in the vicinity of the image center.
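  • The per-sample signal path of FIG. 4 (count value or down-count selection, accumulation with the previous sampled value, upper/lower limit, low-pass filtering) can be sketched as follows. All constants and names are illustrative assumptions.

```python
class TrackingAmountCalculator:
    """Hypothetical per-sample sketch of the FIG. 4 tracking amount pipeline."""

    def __init__(self, limit=100.0, down_step=0.5, lpf_alpha=0.1):
        self.limit = limit
        self.down_step = down_step
        self.lpf_alpha = lpf_alpha
        self.prev_amount = 0.0   # previous sampled value of the tracking amount
        self.filtered = 0.0

    def _down_count(self):
        # Drive the previous amount back toward 0 when tracking is off.
        if self.prev_amount > self.down_step:
            return -self.down_step
        if self.prev_amount < -self.down_step:
            return self.down_step
        return -self.prev_amount  # within the small range around 0: snap to 0

    def update(self, count, tracking_on):
        # Signal selection: count value (tracking on) or down-count value.
        step = count if tracking_on else self._down_count()
        amount = self.prev_amount + step
        # Upper and lower limit setting.
        amount = max(-self.limit, min(self.limit, amount))
        self.prev_amount = amount
        # Low-pass filter cuts high-frequency subject-detection noise.
        self.filtered += self.lpf_alpha * (amount - self.filtered)
        return self.filtered

calc = TrackingAmountCalculator()
for _ in range(200):
    out = calc.update(2.0, tracking_on=True)   # accumulates until clamped
for _ in range(400):
    out = calc.update(0.0, tracking_on=False)  # decays back toward 0
```

  • With tracking on, the amount ramps up and clamps at the limit; with tracking off, the down-count gradually returns it to 0 so the correction releases smoothly.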
  • FIGS. 5A and 5B are diagrams for describing an example in which the imaging apparatus performs tracking control on one main subject.
  • In FIGS. 5A and 5B, the imaging apparatus performs tracking control such that the gravity center position of a main subject (face) detected from the photographed image transitions to the center position of the photographed image.
  • FIG. 5A illustrates an image before start of the tracking control.
  • FIG. 5B illustrates an image when the subject is captured in the vicinity of the image center after the start of the tracking control.
  • a main subject 502 a is located at a point distant from the image center.
  • The CPU 105 performs the tracking control such that the distance from the subject gravity center position 503a to the image center gradually decreases and the image center finally substantially matches the subject gravity center position.
  • In FIG. 5B, the subject gravity center position 503a matches the image center.
  • In this control, the positions of the other subjects 504a are neglected. Accordingly, in the photographed image in which the main subject is captured at the center, the other subjects 504b may leave the photographed image.
  • It is not necessarily the case that the photographer wants the face of the tracked subject 502a at the center. Since the photographer may want to photograph another face as the main subject, the tracking control described with reference to FIGS. 5A and 5B can produce an image the photographer did not intend. Moreover, even when the photographer pans the camera to frame a desired subject and bring it back into the image, the imaging apparatus keeps the subject 502a recognized by the tracking control at the image center, so the framing operation may be hindered.
  • the imaging apparatus calculates one gravity center position corresponding to the plurality of subjects as a tracking target position based on the gravity center position and the reliability of each subject.
  • The drive controlling unit 113 included in the imaging apparatus performs tracking control such that the calculated tracking target position is located at a specific position (in this example, the vicinity of the image center). Accordingly, when a plurality of subjects (faces or people) is present in a photographed image, a specific subject is less likely to be mistaken for the target, and the subject the photographer originally intended to photograph is less likely to leave the screen.
  • FIGS. 6A and 6B are diagrams for describing tracking control performed by an imaging apparatus according to a first embodiment.
  • FIG. 6A illustrates an image before start of the tracking control.
  • FIG. 6B illustrates an image after the start of the tracking control.
  • The tracking amount calculating unit 111 included in the imaging apparatus functions as a gravity center calculating unit that calculates one gravity center position corresponding to the plurality of subjects by Expression (1), based on the number n of detected faces, the gravity center coordinate position b of each face, and the reliability a of each face:
  • y = (a_1·b_1 + a_2·b_2 + … + a_n·b_n) / (a_1 + a_2 + … + a_n)  (1)
  • where n is the number of detected faces, b_k is the gravity center coordinate position of the k-th face, a_k is the reliability of the k-th face, and y is the one gravity center position corresponding to the plurality of subjects.
  • the tracking amount calculating unit 111 increases the weight of a face with high reliability so that a calculated gravity center position is located in the vicinity of the face with the high reliability.
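  • Expression (1), a reliability-weighted average of the per-face gravity center positions, can be sketched as follows; coordinates here are assumed to be (x, y) pixel positions, and the function name is illustrative.

```python
def combined_gravity_center(centers, reliabilities):
    """Reliability-weighted average of subject gravity center positions."""
    total = sum(reliabilities)
    x = sum(a * bx for a, (bx, _) in zip(reliabilities, centers)) / total
    y = sum(a * by for a, (_, by) in zip(reliabilities, centers)) / total
    return (x, y)

# A high-reliability (large) face at x=100 and two low-reliability faces at x=300:
# the combined gravity center is pulled toward the reliable face.
centers = [(100, 200), (300, 180), (300, 220)]
reliabilities = [0.8, 0.1, 0.1]
gc = combined_gravity_center(centers, reliabilities)
```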
  • reference numeral 603 a denotes a gravity center position corresponding to the plurality of subjects.
  • the imaging apparatus performs the tracking control such that the gravity center position y transitions to an image center position.
  • the reliability of a subject 602 a is higher than that of the subjects 604 a from the viewpoint of the sizes of the faces.
  • Therefore, the gravity center position y is not simply the center of the face positions present in the image; by Expression (1), the calculated gravity center position y is pulled toward the subject 602b.
  • As a result, the subjects 604b are prevented from leaving the photographed image as much as possible.
  • FIG. 7 is a flowchart for describing an example of tracking control on a subject.
  • The tracking control illustrated in FIG. 7 starts when the main power supply of the camera 101 is turned on, and is performed at a constant sampling period.
  • In step S701, the CPU 105 determines whether an anti-vibration SW is turned on.
  • When the anti-vibration SW is turned off, the process proceeds to step S705, where the CPU 105 sets the shake correction amount to 0. The process then proceeds to step S706.
  • When the anti-vibration SW is turned on, the process proceeds to step S702.
  • In step S702, the CPU 105 takes in the output of the angular velocity meter 103.
  • In step S703, the CPU 105 determines whether the camera is in a state in which shake correction is possible. Specifically, from the supply of power until the output of the angular velocity meter 103 stabilizes, the CPU 105 determines that shake correction is not possible; after the output has stabilized, it determines that shake correction is possible. Shake correction is thus not performed while the output value immediately after power-on is unstable. When the camera is not in a state in which shake correction is possible, the process proceeds to step S705; when it is, the process proceeds to step S704.
  • In step S704, the CPU 105 causes the shake correction angle calculating unit 108 and the sensitivity adjusting unit 109 to calculate the shake correction amount based on the output of the angular velocity meter taken in at step S702.
  • In step S706, the CPU 105 determines whether the tracking SW is turned on. When the tracking SW is turned off, the process proceeds to step S713, where the CPU 105 sets the tracking correction amount to 0, and then to step S714. When the tracking SW is turned on, the process proceeds to step S707.
  • In step S707, the CPU 105 determines whether there is a tracking target subject in the image signal captured by the image sensor 106. When there is no tracking target subject, the process proceeds to step S713. When there is a tracking target subject, the process proceeds to step S708.
  • In step S708, the CPU 105 detects the number of subjects. Subsequently, in step S709, the CPU 105 detects the gravity center position of each subject, and in step S710, the reliability of each subject.
  • In step S711, the CPU 105 calculates the subject gravity center position using Expression (1), based on the number of subjects, the gravity center position of each subject, and the reliability of each subject detected in steps S708 to S710. Subsequently, in step S712, the CPU 105 causes the tracking amount calculating unit 111 to calculate the tracking correction amount based on the subject gravity center position.
  • In step S714, the CPU 105 adds the shake correction amount calculated in step S704 and the tracking correction amount calculated in step S712 to calculate a lens drive amount.
  • In step S715, the CPU 105 causes the drive controlling unit 113 to drive the correcting lens 114 based on the lens drive amount. The image blur is thereby corrected and the subject is tracked. The process then proceeds to step S716 to end the shake correction routine, and the apparatus stands by until the next sampling period.
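  • The per-sample routine of FIG. 7 can be condensed into the following sketch: the shake correction amount is 0 if the anti-vibration SW is off or correction is not yet possible, the tracking correction amount is 0 if the tracking SW is off or no subject is found, and the two are summed into a lens drive amount. The helper callables stand in for the units in FIG. 2 and are hypothetical.

```python
def sampling_step(anti_vibration_on, correction_possible, tracking_on,
                  subjects, shake_amount_fn, tracking_amount_fn):
    # Steps S701-S705: shake correction amount.
    if anti_vibration_on and correction_possible:
        shake_correction = shake_amount_fn()
    else:
        shake_correction = 0.0
    # Steps S706-S713: tracking correction amount.
    if tracking_on and subjects:
        tracking_correction = tracking_amount_fn(subjects)
    else:
        tracking_correction = 0.0
    # Step S714: the lens drive amount is the sum of both corrections.
    return shake_correction + tracking_correction

# One sample with both corrections active; the lambdas are placeholder
# stand-ins for the shake correction and tracking amount calculators.
drive = sampling_step(
    anti_vibration_on=True, correction_possible=True, tracking_on=True,
    subjects=[("face", (120, 80), 0.9)],
    shake_amount_fn=lambda: 1.5,
    tracking_amount_fn=lambda s: -0.4,
)
```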
  • As described above, the control device calculates one gravity center position corresponding to the plurality of subjects from the positions of the plurality of subjects in the photographed image and the reliability of each subject, and performs the automatic tracking control based on that gravity center position. It is thus possible to prevent the other subjects from leaving the screen, as would occur if only a specific subject were tracked, and to prevent an automatic tracking operation not intended by the photographer.
  • In the embodiment above, optical anti-vibration that moves the correcting lens in a plane perpendicular to the optical axis is applied as the shake correcting unit.
  • However, the present invention is not limited to optical anti-vibration; the other correcting configurations described above, such as movement of the image sensor, electronic anti-vibration, or a combination of these methods, can also be applied.
  • FIGS. 8A to 8C are diagrams for describing tracking control performed by an imaging apparatus according to a second embodiment.
  • the imaging apparatus calculates one gravity center position corresponding to a plurality of subjects by Expression (1) based on the number n of detected faces, the gravity center coordinate position b of each face, and the reliability a of each face. Then, the imaging apparatus performs tracking control so that the gravity center position is maintained at an image center.
  • The CPU 105 detects a camera shaking operation (subject tracking manipulation) by the photographer based on the output of the angular velocity meter 103, that is, a shake detection signal indicating shake applied to the imaging apparatus.
  • the CPU 105 estimates a subject aimed at by the photographer based on the detected camera shake operation and sets reliability of the subject to be high to re-calculate the gravity center position. Then, the tracking control is performed such that the re-calculated gravity center position is maintained at the image center.
  • the tracking control is performed such that a gravity center position 804 a corresponding to two subjects ( 802 a and 803 a ) is maintained at the image center.
  • camera shake in a direction indicated by an arrow 805 a occurs.
  • the imaging apparatus detects the camera shake and estimates that the subject aimed at by the photographer is the subject 803 a based on the detection result. Then, the imaging apparatus sets the reliability of the subject 803 a to be high to re-calculate the gravity center position.
  • Specifically, the imaging apparatus sets the reliability of the subject 803a, located in the direction of the camera shake, to be higher, sets the reliability of the subject 802a, located in the opposite direction or in the vicinity of the center, to be lower, and then re-calculates the gravity center position.
  • the imaging apparatus performs the tracking control so that the re-calculated gravity center position is maintained at the image center.
  • the tracking control is performed such that a re-calculated gravity center position 804 b is maintained at the image center.
  • Similarly, the imaging apparatus sets the reliability of the subject 803b, located in the camera shake direction, to be higher and the reliability of the subject 802b to be lower.
  • The gravity center position is re-calculated based on the newly set reliabilities, and the tracking control is performed such that this gravity center position is maintained at the image center.
  • In FIG. 8C, the tracking control is performed such that a re-calculated gravity center position 804c is maintained at the image center. According to the imaging apparatus of the second embodiment, automatic tracking control not intended by the photographer can thus be prevented.
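  • The second embodiment's re-weighting can be sketched as follows: when the angular velocity output indicates a deliberate pan, the reliability of subjects lying in the pan direction (relative to the image center) is raised and that of subjects on the opposite side is lowered before the gravity center is re-calculated. The threshold and gain values here are hypothetical.

```python
def reweight_by_pan(subjects, pan_velocity, threshold=0.2, boost=2.0, cut=0.5):
    """subjects: list of (x_offset_from_center, reliability) pairs.

    Returns adjusted reliabilities: boosted for subjects in the pan
    direction, reduced for subjects on the opposite side of the center.
    """
    if abs(pan_velocity) < threshold:
        return [r for _, r in subjects]  # no deliberate pan detected
    direction = 1.0 if pan_velocity > 0 else -1.0
    adjusted = []
    for x_offset, reliability in subjects:
        if x_offset * direction > 0:
            adjusted.append(reliability * boost)  # subject in the pan direction
        else:
            adjusted.append(reliability * cut)    # opposite side / near center
    return adjusted

# Two equally reliable subjects; the photographer pans toward the right one,
# so the re-calculated gravity center will shift toward it.
subjects = [(-150, 0.5), (+150, 0.5)]
weights = reweight_by_pan(subjects, pan_velocity=1.0)
```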
  • Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s), and/or that includes one or more circuits (e.g., an application specific integrated circuit (ASIC)) for performing those functions. Embodiment(s) can likewise be realized by a method performed by the computer of the system or apparatus, for example, by reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or by controlling the one or more circuits to perform those functions.
  • the computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
  • the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • the storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Adjustment Of Camera Lenses (AREA)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-086886 2015-04-21
JP2015086886A JP6504899B2 (ja) Control device, optical apparatus, imaging apparatus, and control method

Publications (2)

Publication Number Publication Date
US20160316137A1 (en) 2016-10-27
US9692962B2 (en) 2017-06-27

Family

ID=57148222

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/099,810 Active US9692962B2 (en) 2015-04-21 2016-04-15 Control device, optical apparatus, imaging apparatus, and control method

Country Status (3)

Country Link
US (1) US9692962B2 (en)
JP (1) JP6504899B2 (ja)
CN (1) CN106067943B (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11265478B2 (en) * 2019-12-20 2022-03-01 Canon Kabushiki Kaisha Tracking apparatus and control method thereof, image capturing apparatus, and storage medium

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7113447B2 (ja) * 2018-03-12 2022-08-05 Toshiba Energy Systems & Solutions Corporation Medical image processing apparatus, treatment system, and medical image processing program
US11057558B2 (en) * 2018-12-27 2021-07-06 Microsoft Technology Licensing, Llc Using change of scene to trigger automatic image capture
CN111935453A (zh) * 2020-07-27 2020-11-13 Zhejiang Dahua Technology Co., Ltd. Learning supervision method and apparatus, electronic device, and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07226873A (ja) 1994-02-16 1995-08-22 Hitachi Ltd Automatic tracking imaging device
JP2010093362A (ja) 2008-10-03 2010-04-22 Nikon Corp Imaging apparatus and optical apparatus
US20150222809A1 (en) * 2014-02-05 2015-08-06 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus
US20160173765A1 (en) * 2014-12-12 2016-06-16 Casio Computer Co., Ltd. Image-capturing apparatus which controls image-capturing direction

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005110160A (ja) * 2003-10-02 2005-04-21 Konica Minolta Holdings Inc Imaging apparatus
JP2008061184A (ja) * 2006-09-04 2008-03-13 Sony Corp Image processing apparatus and method, program, and imaging apparatus
NO327899B1 (no) * 2007-07-13 2009-10-19 Tandberg Telecom As Method and system for automatic camera control
JP5115139B2 (ja) * 2007-10-17 2013-01-09 Sony Corp Composition determination device, composition determination method, and program



Also Published As

Publication number Publication date
US20160316137A1 (en) 2016-10-27
CN106067943B (zh) 2020-01-21
JP6504899B2 (ja) 2019-04-24
JP2016208230A (ja) 2016-12-08
CN106067943A (zh) 2016-11-02

Similar Documents

Publication Publication Date Title
US10277809B2 (en) Imaging device and imaging method
US10321058B2 (en) Image pickup apparatus and motion vector detection method
US9883104B2 (en) Control apparatus, image pickup apparatus, control method, and non-transitory computer-readable storage medium which are capable of performing tilt correction
US10659691B2 (en) Control device and imaging apparatus
US7801432B2 (en) Imaging apparatus and method for controlling the same
US10659676B2 (en) Method and apparatus for tracking a moving subject image based on reliability of the tracking state
US7929042B2 (en) Imaging apparatus, control method of imaging apparatus, and computer program
US9489747B2 (en) Image processing apparatus for performing object recognition focusing on object motion, and image processing method therefor
US10594939B2 (en) Control device, apparatus, and control method for tracking correction based on multiple calculated control gains
US10212347B2 (en) Image stabilizing apparatus and its control method, image pickup apparatus, and storage medium
US10554893B2 (en) Image stabilization apparatus and control method therefor, image capturing apparatus, and storage medium
US10148889B2 (en) Image processing apparatus and control method thereof, image capturing apparatus and storage medium
US9692962B2 (en) Control device, optical apparatus, imaging apparatus, and control method
US10212348B2 (en) Image processing apparatus, its control method, image capturing apparatus, and storage medium
US10551634B2 (en) Blur correction device, imaging apparatus, and blur correction method that correct an image blur of an object in a target image region
US10200612B2 (en) Image stabilization apparatus and control method thereof, and storage medium
US10992867B2 (en) Image capturing apparatus and control method thereof and storage medium
JP6613149B2 (ja) Image blur correction device and control method thereof, imaging apparatus, program, and storage medium
US10389942B2 (en) Image blur correction apparatus, control method thereof, and imaging apparatus
JP6702736B2 (ja) Imaging control device, control method of imaging apparatus, and program
US20240073524A1 (en) Image stabilization apparatus and method and image capturing apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAKAMATSU, NOBUSHIGE;REEL/FRAME:039210/0131

Effective date: 20160328

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4