CN111246117B - Control device, image pickup apparatus, and control method


Info

Publication number
CN111246117B
Authority
CN
China
Prior art keywords
tracking
control
state
image
unit
Prior art date
Legal status
Active
Application number
CN202010223939.XA
Other languages
Chinese (zh)
Other versions
CN111246117A (en)
Inventor
若松伸茂
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Application filed by Canon Inc
Publication of CN111246117A
Application granted
Publication of CN111246117B



Classifications

    • H04N23/62 Control of parameters via user interfaces
    • G06T5/73
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • H04N23/50 Constructional details
    • H04N23/53 Constructional details of electronic viewfinders, e.g. rotatable or detachable
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/6811 Motion detection based on the image signal
    • H04N23/6812 Motion detection based on additional sensors, e.g. acceleration sensors
    • H04N23/6815 Motion detection by distinguishing pan or tilt from motion
    • H04N23/687 Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N23/80 Camera processing pipelines; Components thereof
    • G06T2207/20201 Motion blur correction

Abstract

A control device, an image pickup apparatus, and a control method. The control device includes: a correction control unit that acquires a blur detection signal detected by a blur detection unit, calculates an image blur correction amount, and controls an image blur correction unit that corrects image blur; an object detection unit that detects the object position in a captured image and acquires object position information in the captured image; a tracking control unit that performs subject tracking control based on the subject position information acquired by the object detection unit; and a setting unit that sets the control state of the tracking control unit. The setting unit sets the tracking control unit by selecting a control state from a plurality of control states including a first state in which a tracking-subject selectable mode is not set and a second state in which the tracking-subject selectable mode is set. The correction control unit acquires information on the control state selected by the setting unit and changes the characteristic used to calculate the correction amount so that the image blur correction effect in the second state is higher than that in the first state.

Description

Control device, image pickup apparatus, and control method
(The present application is a divisional of the application filed on December 8, 2016, entitled "Control Device and Image Pickup Apparatus", application No. 201611140487.9.)
Technical Field
The invention relates to a control device and an image pickup apparatus.
Background
There is known an image pickup apparatus having an image blur correction function for suppressing image blur caused by hand shake or the like, together with face detection and human body detection functions for the case where the subject is a person. For example, a pattern for determining the face of a person is prepared in advance, and a portion of the captured image that matches the pattern is detected as a face image. The detected face image is then referred to in focus control, exposure control, and the like.
When photographing a moving subject, or when photographing in a telephoto mode with a long focal length, the following problems occur. When the subject moves beyond the imaging angle of view, special skill is required for the photographer to keep accurately tracking a continuously moving subject by manual operation alone. Further, when photographing with a camera having a telephoto lens, it is difficult to keep the main subject at the center position of the captured image because the influence of hand shake on image blur increases. When the photographer operates the camera to bring the subject back within the angle of view, image blur correction is applied to the amount of hand shake that arises from the photographer's intentional camera operation. Consequently, under the influence of the image blur correction control, it may be difficult for the photographer to make the fine adjustments needed to capture the subject within the angle of view or to position the subject image at the center of the captured image.
Japanese patent laid-open No. 2010-93362 discloses an image pickup apparatus that automatically tracks an object by moving a part of an optical system in a direction intersecting an optical axis. The position of the object is detected from the image signal obtained by the image pickup element, and the object tracking calculation amount is calculated. The object tracking calculation amount and the blur correction calculation amount are combined to perform object tracking control with image blur corrected.
Disclosure of Invention
A control device, comprising: a correction control unit configured to acquire the blur detection signal detected by the blur detection unit to calculate a correction amount of image blur, and to control an image blur correction unit configured to correct the image blur; an object detection unit configured to detect a position of an object in a captured image, and acquire position information of the object in the captured image; a tracking control unit configured to perform tracking control of the object based on the position information of the object acquired by the object detection unit; and a setting unit configured to set a control state of the tracking control unit, wherein the setting unit sets the control state of the tracking control unit by selecting a control state to be set from a plurality of control states including a first state in which a tracking subject selectable mode is not set and a second state in which the tracking subject selectable mode is set but a tracking subject is not selected, and the correction control unit acquires information on the control state selected by the setting unit and controls to change a characteristic for calculating the correction amount such that an image blur correction effect in the second state is higher than an image blur correction effect in the first state.
An image pickup apparatus includes an image pickup element that obtains an object image and the control device described above.
A control device, comprising: a correction control unit configured to acquire the blur detection signal detected by the blur detection unit to calculate a correction amount of image blur, and to control an image blur correction unit configured to correct the image blur; an object detection unit configured to detect an object in a captured image, and acquire position information of the object in the captured image; and a tracking control unit configured to perform tracking control of the object based on the position information of the object acquired by the object detection unit, wherein the correction control unit changes a degree of an image blur correction effect based on an object detection result acquired by the object detection unit.
An image pickup apparatus includes an image pickup element for obtaining an object image and the control device described above.
A control device, comprising: a correction control unit configured to acquire the blur detection signal detected by the blur detection unit to calculate a correction amount of image blur, and to control an image blur correction unit configured to correct the image blur; an object detection unit configured to detect a position of an object in a captured image, and acquire position information of the object in the captured image; a tracking control unit configured to perform tracking control of the object based on the position information of the object acquired by the object detection unit; a determination unit configured to determine whether or not tracking of the object is possible; and a warning instruction unit configured to issue a warning if the determination unit determines that the tracking of the object is not possible.
An image pickup apparatus includes an image pickup element that obtains an object image and the control device described above.
Other features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Drawings
Fig. 1 is a diagram illustrating an image pickup apparatus according to a first embodiment and a shake direction of the image pickup apparatus.
Fig. 2 is a diagram illustrating a configuration example of an image pickup apparatus according to the first embodiment.
Fig. 3A and 3B are diagrams for describing tracking control of a detected object.
Fig. 4 is a functional block diagram of a tracking amount calculation unit according to the first embodiment.
Fig. 5A to 5C are diagrams for describing the object specifying method of the tracking control of the first embodiment.
Fig. 6 is a diagram for describing setting of the blur prevention cutoff frequency according to the object position according to the first embodiment.
Fig. 7 is a flowchart for describing control according to the first embodiment.
Fig. 8A and 8B are diagrams schematically illustrating an image capturing apparatus according to a second embodiment.
Fig. 9 is a diagram illustrating a configuration example of an image pickup apparatus according to the second embodiment.
Fig. 10 is a diagram illustrating a configuration example of an image pickup apparatus according to the third embodiment.
Fig. 11 is a diagram illustrating a configuration example of an image pickup apparatus according to the fourth embodiment.
Fig. 12A to 12H are diagrams showing an example of a state and a timing chart of the image pickup apparatus during tracking control according to the fourth embodiment.
Fig. 13A to 13H are diagrams showing an example of the state of the image pickup apparatus during tracking control and an example of a timing chart according to the fourth embodiment.
Fig. 14A to 14H are diagrams showing an example of a state and a timing chart of the image pickup apparatus during tracking control according to the fourth embodiment.
Fig. 15A to 15H are diagrams showing an example of a state and a timing chart of the image pickup apparatus during tracking control according to the fourth embodiment.
Fig. 16A to 16H are diagrams showing an example of a state and a timing chart of the image pickup apparatus during tracking control according to the fourth embodiment.
Fig. 17 is a diagram illustrating a configuration example of an image pickup apparatus according to the fourth embodiment.
Fig. 18 is a diagram illustrating a configuration example of an image pickup apparatus according to the fourth embodiment.
Detailed Description
Hereinafter, various embodiments of the present invention will be described in detail with reference to the accompanying drawings. Each embodiment shows an image pickup apparatus including a control device that has a function of detecting shake to correct image blur and a function of automatically tracking an object. The present invention is not limited to digital cameras and digital video cameras, and can also be applied to image pickup apparatuses such as monitoring cameras, web cameras, and mobile phones. It can further be applied to both interchangeable-lens and lens-integrated cameras.
(first embodiment)
In a system that tracks an object to maintain the position of an object image at a specific position (for example, a center position) within a captured image, object tracking control is control in which a camera automatically performs correction so that the position of the object image is at the specific position. Therefore, the photographer does not need to perform framing to track the object during the tracking control. In this case, when the blur correction effect (hereinafter referred to as an image blur correction effect) is enhanced by widening the frequency range of the control characteristic at the time of correcting the image blur, it is possible to perform shooting with smooth subject tracking while eliminating the influence of hand shake or the like.
However, in the case of performing such object tracking, although the photographer can capture a desired object by specifying it on the screen, specifying the object may itself be difficult. For example, in the case where the focal length of the photographing lens is very large (1000 mm or more), the subject image may move under the influence of hand shake, and it may be difficult to keep the subject within the angle of view for framing. In this case, the difficulty of framing due to hand shake can be reduced by enhancing the image blur correction effect, that is, by expanding the frequency range of the control characteristic of the image blur correction. However, when the image blur correction effect is enhanced and the object moves, image blur correction is applied to the amount of hand shake that occurs when the photographer intentionally operates the camera to bring the object back within the image capturing angle of view. Depending on the degree of the image blur correction effect, it may then be difficult to capture the subject again within the angle of view or to finely adjust the position of the subject image so that it is located at the center position of the captured image. Therefore, in the present embodiment, an image pickup apparatus including a control device that performs image blur correction and object tracking control and can improve the ease of framing for the photographer will be described. This control device is sometimes referred to as an "image position control device" because it controls the image position by performing image blur correction and subject tracking control.
Fig. 1 is a diagram for describing a shake direction of an image pickup apparatus according to various embodiments of the present invention. The control means mounted on the image pickup apparatus 101 includes image blur correction means and object tracking control means.
Fig. 2 shows the structure of an image capturing unit of the image capturing apparatus 101 and functional blocks of image blur correction processing and automatic tracking processing executed by a Central Processing Unit (CPU)105 included in the image capturing apparatus 101.
The image blur correction means performs image blur correction on angular shake in the pitch and yaw directions indicated by arrows 103p and 103y with respect to, for example, the optical axis 102. The Z-axis direction of a three-dimensional orthogonal coordinate system is defined as the optical axis direction, a first axis orthogonal to the Z-axis is defined as the X-axis, and a second axis orthogonal to the Z-axis and the X-axis is defined as the Y-axis. The direction around the X-axis indicated by arrow 103p is the pitch direction, and the direction around the Y-axis indicated by arrow 103y is the yaw direction.
A release switch 104 for instructing shutter release is provided on the main body of the image pickup apparatus 101, and a switch on/off signal is transmitted to the CPU 105 in accordance with the operation of the release switch 104. The release switch 104 is a two-stage switch in which a first switch (SW1) and a second switch (SW2) are sequentially turned ON according to the amount of pressing. SW1 is turned ON when the release switch 104 is half-pressed, and SW2 is turned ON when the release switch 104 is fully pressed. When SW1 is turned ON, focusing is performed by driving the focus lens, and an appropriate exposure amount is set by driving the diaphragm. When SW2 is turned ON, image data obtained from the optical image exposed on the image pickup element 106 is recorded on a recording medium.
The CPU 105 functions as the control means of the present embodiment. The present embodiment can be applied to any optical apparatus including the CPU 105. An image pickup element 106 and a blur correction lens (hereinafter also referred to as a correction lens) 108 that corrects an image blur of an object image on a screen by moving in a direction different from the optical axis 102 are located on the optical axis 102 of the image pickup optical system.
The image pickup apparatus 101 includes an angular velocity detection unit (hereinafter referred to as an angular velocity meter) that detects an angular velocity of angular shake. The angular velocity meter 103 detects angular shake in a rotational direction (pitch direction) around the X-axis shown by an arrow 103p and a rotational direction (yaw direction) around the Y-axis shown by an arrow 103Y in fig. 1. The detection signal output from the angular velocity meter 103 is input to the CPU 105, and a blur correction angle is calculated by a blur correction angle calculation unit 109. The output of the blur correction angle calculation unit 109 is input to the sensitivity adjustment unit 114.
When automatic object tracking control is not performed, the blur correction angle calculation unit 109 calculates a blur correction angle based on the output of the angular velocity meter 103. Specifically, in the blur correction angle calculation unit 109, the offset subtractor 110 removes the direct-current (DC) component superimposed as detection noise on the detection signal output by the angular velocity meter 103. Further, the integration filter unit 111 performs integration processing and outputs an angle signal. The offset subtractor 110 removes the DC component using, for example, a high-pass filter (HPF). The integration filter unit 111 uses a filter represented by the following expression (1), which combines an integrator (the first factor on the left side) with an HPF (the second factor on the left side). The filter is equivalent to a low-pass filter (LPF) with time constant T, multiplied by the time constant T.
(1/s) × (Ts/(1 + Ts)) = T × (1/(1 + Ts)) ... (1)
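A minimal discrete-time sketch of this filter is shown below, assuming a first-order discretization at a fixed sampling rate; the class name, sampling scheme, and parameter handling are illustrative and not taken from the patent. Lowering the cutoff frequency enlarges the time constant T and widens the correction frequency range, which corresponds to a stronger image blur correction effect.

```python
import math

class IntegrationFilter:
    """Sketch of expression (1): an integrator cascaded with an HPF,
    equivalent to T times an LPF with time constant T (hypothetical names)."""

    def __init__(self, cutoff_hz, sample_hz):
        self.sample_dt = 1.0 / sample_hz
        self.state = 0.0  # current blur correction angle estimate
        self.set_cutoff(cutoff_hz)

    def set_cutoff(self, cutoff_hz):
        # T = 1 / (2*pi*fc): a lower cutoff means a larger time constant,
        # i.e. a wider frequency range of the correction (stronger effect).
        self.cutoff_hz = cutoff_hz
        self.T = 1.0 / (2.0 * math.pi * cutoff_hz)

    def update(self, angular_velocity):
        # First-order LPF applied to T * angular velocity, per expression (1).
        alpha = self.sample_dt / (self.T + self.sample_dt)
        self.state += alpha * (self.T * angular_velocity - self.state)
        return self.state  # blur correction angle
```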
The output of the blur prevention characteristic changing unit 112 is input to the offset subtractor 110 and the integrating filter unit 111. The output of the panning determination unit 113 is input to the blur prevention characteristic change unit 112. The output of the tracking Switch (SW)121 is also input to the blur prevention characteristic changing unit 112, and the processing of the blur prevention characteristic changing unit 112 corresponding to the tracking SW121 will be described later.
The panning determination unit 113 acquires the output of the angular velocity meter 103 and a blur correction angle as the output of the integration filter unit 111, and determines whether the image pickup apparatus 101 is panning. Specifically, the panning determination unit 113 compares the angular velocity of the image pickup apparatus 101 detected by the angular velocity meter 103 with a predetermined threshold value. When a predetermined period of time (period of time for determination) has elapsed from the point in time when the detected angular velocity exceeds the threshold value, it is determined that the image pickup apparatus 101 is panning, and a determination signal is output to the blur prevention characteristic change unit 112.
In a case where it is determined that the image pickup apparatus 101 is panning, the panning determination unit 113 issues an instruction to increase the thrust toward the center of the control range to the blur prevention characteristic change unit 112 so that the blur correction angle output by the integration filter unit 111 does not become too large. The thrust force is a control operation for controlling the blur correction angle to approach the center of the control range. Further, in a case where it is determined that the image pickup apparatus 101 is not panning, the panning determination unit 113 issues an instruction to the blur prevention characteristic change unit 112 to reduce the thrust toward the center of the control range. Here, the thrust force toward the center of the control range may be gradually changed according to the magnitude of the pan. The magnitude of the pan is the magnitude of the detected angular velocity or the length of the time period that the angular velocity exceeds the threshold. In the present embodiment, the panning operation will be described by way of example. Since the same processing is performed for the tilting operation except for the direction difference, a detailed description thereof will not be provided.
The blur prevention characteristic change unit 112 changes the blur prevention characteristic (image blur correction characteristic) in accordance with the panning determination result and the instruction regarding the magnitude of the thrust toward the center of the control range. For example, in the case where the blur correction angle output by the integration filter unit 111 is larger than a predetermined threshold value, the blur prevention characteristic change unit 112 changes the blur prevention characteristic so that the blur correction angle is controlled to approach the center of the control range. The control is applied in accordance with the magnitude of the blur correction angle so that the angle gradually approaches the vicinity of the center. Specifically, control is performed such that: the larger the blur correction angle, the larger the thrust toward the center, and the smaller the blur correction angle, the smaller the thrust toward the center.
The blur prevention characteristic changing unit 112 changes the blur prevention characteristic by changing the frequency characteristic of the offset subtractor 110 or the integration filter unit 111. Upon receiving the instruction to reduce the thrust, the blur prevention characteristic change unit 112 reduces the cutoff frequency of the HPF of the offset subtractor 110, and reduces the cutoff frequency of the integration filter unit 111. Upon receiving the instruction to increase the thrust force, the blur prevention characteristic change unit 112 increases the cutoff frequency of the HPF of the offset subtractor 110, and increases the cutoff frequency of the integration filter unit 111.
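The following sketch illustrates this panning determination and cutoff adjustment, assuming the determination counts consecutive samples above the angular velocity threshold; the threshold values, judgment period, helper names, and the `set_cutoff` interface of the offset subtractor are placeholders introduced here for illustration.

```python
class PanningDetector:
    """Flags panning when |angular velocity| stays above a threshold for a
    judgment period, as described for the panning determination unit 113."""

    def __init__(self, threshold, judge_samples):
        self.threshold = threshold        # e.g. Ah1/Ah2/Ah3 depending on state
        self.judge_samples = judge_samples
        self.count = 0

    def update(self, angular_velocity):
        if abs(angular_velocity) > self.threshold:
            self.count = min(self.count + 1, self.judge_samples)
        else:
            self.count = 0
        return self.count >= self.judge_samples  # True while panning

def apply_thrust(panning, offset_hpf, integ_filter, low_fc, high_fc):
    # Raising both cutoffs pulls the blur correction angle back toward the
    # center of the control range (larger thrust); lowering them relaxes it.
    fc = high_fc if panning else low_fc
    offset_hpf.set_cutoff(fc)
    integ_filter.set_cutoff(fc)
```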
As described above, the blur correction angle calculation unit 109 performs processing of changing the thrust force toward the center of the control range in accordance with the panning determination result and the magnitude of the blur correction angle. This makes it possible to calculate the blur correction angle while performing the panning determination process. The blur correction angle signal output from the blur correction angle calculation unit 109 is input to the sensitivity adjustment unit 114.
The sensitivity adjustment unit 114 amplifies the output of the blur correction angle calculation unit 109 based on the zoom and focus position information 107 and the focal length and shooting magnification derived from these information items, and outputs a blur correction target value. Zoom position information is acquired from the zoom unit, and focus position information is acquired from the focus unit. The reason why the blur correction target value is calculated based on the zoom and focus position information 107 is because the blur correction sensitivity of the image pickup surface with respect to the blur correction stroke of the correction lens 108 varies according to a change in optical information such as zooming or focusing. The sensitivity adjustment unit 114 outputs the blur correction target value to the adder 115.
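As a rough illustration of this gain adjustment, the sketch below scales the correction angle by a sensitivity factor derived from focal length and shooting magnification; the gain formula is invented for illustration and does not reflect the patent's actual optical data.

```python
def blur_correction_target(correction_angle, focal_length_mm, magnification):
    """Hypothetical sensitivity adjustment: the image-plane shift produced by a
    given correction-lens stroke changes with zoom (focal length) and focus
    (shooting magnification), so the correction angle is rescaled."""
    gain = (focal_length_mm / 100.0) * (1.0 + magnification)  # invented model
    return gain * correction_angle
```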
When automatic object tracking is not performed (that is, the tracking correction amount output by the tracking amount calculation unit 118 is 0), the adder 115 outputs the blur correction target value obtained by the sensitivity adjustment unit 114 as-is, so that only the output of the sensitivity adjustment unit 114 is input to the drive control unit 116. The correction lens 108 functions as a movable unit that shifts the object position in the captured image. The drive control unit 116 controls the driving of the correction lens 108 to perform object tracking control and blur correction.
In the present embodiment, the drive control unit 116 performs image blur correction (optical blur prevention) by driving the correction lens 108 in a direction different from the optical axis direction. Although the image blur correction method using the correction lens 108 is adopted in the present embodiment, the present invention is not limited thereto, and a method of correcting an image blur by moving an image pickup element in a plane perpendicular to an optical axis may be applied. Alternatively, electronic blur prevention may be applied which reduces the influence of shake by changing the cut-out position of the image in each captured frame output by the image pickup element. Alternatively, a plurality of image blur correction methods may be combined. Further, in the present embodiment, the object tracking control is performed by driving the correction lens or the image pickup element and changing the cut-out position of the image using the optical blur prevention technique and the electronic blur prevention technique.
Next, the object position detection processing performed by the object detection unit 117 will be described in detail. The image pickup element 106 acquires image information by photoelectrically converting light reflected from an object into an electric signal. The image information is converted into a digital image signal by an a/D conversion unit, and the digital image signal is sent to the subject detection unit 117. In a case where a plurality of subject images are captured within an imaging angle of view, the following method is used as a method of automatically recognizing a main subject from among the plurality of subjects.
A first method of detecting a main object is to detect a person. In this case, the subject detection unit 117 detects a face or a human body of the subject. In the face detection process, a pattern for determining the face of a person is determined in advance, and a portion matching the pattern included in a captured image may be detected as the face image of the person. The object detection unit 117 calculates a reliability indicating a probability of a face being an object for each detected object. For example, the reliability is calculated from the size of the face region in the image, the degree of matching with the face pattern, and the like. That is, the object detection unit 117 functions as a reliability calculation unit that calculates the reliability of an object based on the size of the object in a captured image or the degree of matching between the object and an object pattern stored in advance.
A second method of detecting a main object uses histograms of hue, saturation, and the like in a captured image. The following processing is performed: a distribution derived from a histogram of hue, saturation, and the like of an image of an object captured within a shooting angle of view is divided into a plurality of sections and images captured in the sections are classified. For example, a histogram of a plurality of color components is created for a captured image, and the histogram is divided with a mountain distribution range, images captured in regions belonging to a combination of the same section are classified, and a subject image region is identified. By calculating the evaluation value for each recognized object image region, the object image region having the highest evaluation value can be determined as the main object region.
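A sketch of this histogram-based classification is given below, using a hue histogram over the frame and a simple size-and-centrality score. The bin count, the number of candidate hue ranges, the 8-bit hue assumption, and the evaluation formula are illustrative assumptions; the text specifies only that the histogram is divided into mountain-shaped ranges and each candidate region is scored.

```python
import numpy as np

def main_subject_region(hsv_image, num_bins=16):
    """Estimate the main subject region from a hue histogram (hedged sketch).
    Returns a boolean mask over the frame, or None if nothing qualifies."""
    hue = hsv_image[..., 0]
    hist, edges = np.histogram(hue, bins=num_bins, range=(0, 256))
    best_score, best_mask = -1.0, None
    for b in np.argsort(hist)[::-1][:4]:            # a few dominant hue ranges
        mask = (hue >= edges[b]) & (hue < edges[b + 1])
        ys, xs = np.nonzero(mask)
        if len(xs) == 0:
            continue
        h, w = hue.shape
        center_dist = np.hypot(xs.mean() - w / 2, ys.mean() - h / 2)
        score = len(xs) / (1.0 + center_dist)       # bigger and more central wins
        if score > best_score:
            best_score, best_mask = score, mask
    return best_mask
```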
After the main object region is determined, it can be tracked by using its feature amounts (calculated, for example, from the hue distribution or the size of the hue region) to detect regions with similar feature amounts in the images sequentially captured by subsequent live-view operation, continuous shooting, or video capture. The detected position information of the main object is input to the tracking amount calculation unit 118. The tracking amount calculation unit 118 calculates a tracking correction amount that brings the center position of the main subject image close to the center (target position) of the captured image. Further, while viewing the video displayed on the screen of the display unit of the image pickup apparatus according to the video signal, the photographer can specify an object position on the display screen at any time by operating an operation member of the image pickup apparatus. In this case, when the photographer performs an operation of specifying the main object from among the plurality of object images displayed on the display screen, feature amounts such as the hue distribution or hue-region size at the specified position are calculated. Using the calculated feature amounts, a region having similar feature amounts is detected in subsequently obtained images, and the detected region can be tracked as the main object region.
Next, an object tracking control method using the correction lens 108 will be described.
The object detection unit 117 shown in fig. 2 acquires an image signal obtained by the image pickup element 106, and detects an image position (object position) of an object in a captured image. The object position information is output to the tracking amount calculation unit 118. The tracking amount calculation unit 118 calculates a control amount for tracking the object by driving of the correction lens 108 based on the zoom and focus position information 107 and the detected object position information. Hereinafter, a control amount (tracking control amount) for tracking the object so that the image of the object is located at a predetermined position (target position (center position in the present embodiment)) of the screen (range in which the image is captured) is referred to as a tracking correction amount. The tracking amount calculation unit 118 calculates a tracking correction amount from the state of the tracking SW 121. The tracking correction amount is input to the adder 115, and is added to a blur correction target value that is an output of the sensitivity adjustment unit 114, so that blur correction by driving of the correction lens 108 and tracking control are performed simultaneously.
Fig. 3A and 3B are diagrams for describing tracking control for tracking a detected main object according to the present embodiment. Fig. 3A illustrates a captured image 301a before the start of the object tracking control. Fig. 3B illustrates a captured image 301B after the start of the object tracking control. In the captured image 301a of fig. 3A, a black dot shown at the center shows the screen center position 304. The image position of the object 302a before the tracking control is started is distant from the screen center position 304. The subject center position 303a shows the center position of the subject 302a in the image. In the case where the CPU 105 starts the object tracking control, the distance between the object center position 303a and the screen center position 304 gradually decreases with the passage of time. As shown in fig. 3B, with the object tracking control, the final object center position 303a becomes substantially the same as the screen center position 304.
Next, the tracking amount calculation process of the tracking amount calculation unit 118 will be described with reference to fig. 4. Fig. 4 is a functional block diagram showing an example of the tracking amount calculation unit 118. Although the tracking amount calculation unit 118 calculates the amount of tracking correction on each axis in the vertical and horizontal directions of the screen, for the sake of simplicity, only a single axis will be described.
The subtractor 403 subtracts the coordinates of the screen center position 402 from the coordinates of the object position (center position) 401 based on the object position information output by the object detection unit 117. In this way, a difference amount (hereinafter referred to as the center shift amount) representing the distance on the image between the center position of the subject image and the screen center position 402 is calculated. The center shift amount is signed data, calculated so that it is 0 at the screen center position 402. The output of the subtractor 403 is input to a count value table reference unit (hereinafter simply referred to as the reference unit) 404.
The reference unit 404 calculates a count value for tracking based on the amount of center shift (i.e., the magnitude of the difference). Specifically, the count value is calculated as described below.
In the case where the center shift amount is equal to or smaller than a predetermined threshold value a and equal to or larger than a predetermined threshold value "-a", the count value is set to 0 or a minimum value. That is, when the magnitude (absolute value) of the center shift amount is equal to or smaller than the predetermined threshold, a non-sensitive region (dead zone) in which tracking is not performed is set within a predetermined range from the screen center position.
In the case where the amount of center shift is greater than predetermined threshold a or less than predetermined threshold "-a", the count value is set to increase as the absolute value of the amount of center shift increases. The sign of the count value is calculated from the sign of the center shift amount.
The output of the reference unit 404 is obtained as a first input of a signal selection unit 406. The signal selection unit 406 acquires the down count value output by the down count value output unit 405 as a second input. Further, a signal indicating the state of the tracking SW (switch) 121 is input to the signal selection unit 406 as a control signal. In the case where the tracking SW121 is set to the ON state, the signal selection unit 406 selects the output of the reference unit 404 and outputs the output to the adder unit 407. Further, in the case where the tracking SW121 is set to the OFF state, the signal selection unit 406 selects the down-count value and outputs the down-count value to the adder unit 407. The down count value will be described later.
The adder unit 407 takes the output of the signal selection unit 406 and the previous sample value of the tracking amount, and adds these two values. The output of the adder unit 407 is input to the upper and lower limit setting unit 408, which limits the tracking correction amount to a predetermined range. That is, the tracking correction amount is clamped so that it does not exceed a predetermined upper limit or fall below a predetermined lower limit. The output of the upper and lower limit setting unit 408 is input to a delay unit 409 and a low-pass filter (LPF) 410.
The delay unit 409 outputs, as a calculation result, the past tracking correction amount obtained one predetermined sampling period before the current time point (i.e., the previous sample value) to the adder unit 407 and the down-count value output unit 405. The down-count value output unit 405 outputs a down-count value based on this previous sample value. In the case where the tracking correction amount (previous sample value) obtained at the previous sampling time point has a positive sign, the down-count value is given a negative sign; in the case where the previous sample value has a negative sign, the down-count value is given a positive sign. In this way, processing is performed so that the absolute value of the tracking correction amount decreases. Further, the down-count value output unit 405 sets the down-count value to 0 in the case where the previous sample value output by the delay unit 409 is within a range close to zero (0 ± a predetermined range). The down-count value is the second input of the signal selection unit 406.
The LPF 410 removes high-frequency noise included during detection of the object from the output of the upper and lower limit setting unit 408, and outputs the processed signal to the correction lens driving amount conversion unit 411. The correction lens driving amount conversion unit 411 converts the output of the LPF 410 into a signal for allowing the correction lens 108 to perform an object tracking operation. In this way, a final tracking correction amount is calculated and the correction lens 108 is driven based on the tracking correction amount, thereby performing tracking correction processing so that the center position of the subject image is gradually located near the center of the captured image.
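The single-axis pipeline of Fig. 4 can be sketched as follows; the dead zone width, count gain, clamp limits, LPF coefficient, and decay step are placeholder parameters, and a proportional count rule stands in for the count value table.

```python
class TrackingAmountCalculator:
    """Hedged sketch of the Fig. 4 pipeline for one axis: dead zone around the
    screen center, a count value that grows with the center shift, a down-count
    toward zero when tracking is OFF, clamping, and a simple LPF."""

    def __init__(self, dead_zone, gain, limit, lpf_alpha, decay):
        self.dead_zone, self.gain, self.limit = dead_zone, gain, limit
        self.lpf_alpha, self.decay = lpf_alpha, decay
        self.amount = 0.0       # delay unit: previous sample value
        self.filtered = 0.0

    def step(self, subject_pos, screen_center, tracking_on):
        shift = subject_pos - screen_center           # center shift amount
        if tracking_on:
            # Zero inside the non-sensitive region, signed and growing outside.
            count = 0.0 if abs(shift) <= self.dead_zone else self.gain * shift
        else:
            # Down-count: step the previous sample value back toward zero.
            if abs(self.amount) < self.decay:
                count = -self.amount
            else:
                count = -self.decay if self.amount > 0 else self.decay
        self.amount = max(-self.limit, min(self.limit, self.amount + count))
        self.filtered += self.lpf_alpha * (self.amount - self.filtered)
        return self.filtered    # goes to the lens driving amount conversion
```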
By driving the correction lens 108 based on the addition result of adding the blur correction amount and the tracking correction amount in the above-described manner by the adder 115, the image blur correction control and the object tracking control can be performed simultaneously.
In the case of performing object tracking, a photographer can designate an object within a screen to capture a desired object. A subject specifying method for tracking will be described by a specific example of fig. 5A to 5C. Fig. 5A illustrates an example in which a touch panel 502 is provided on a liquid crystal display device (LCD)501 provided on the rear surface of the image pickup apparatus 101. In a case where the photographer touches the display screen, processing of acquiring the coordinates of the object 503 at the touched position and setting a tracking target object is performed. Further, fig. 5B shows an example in which an operation SW (switch) 515 for setting a tracking object selectable mode (hereinafter referred to as a tracking selectable mode) is provided. In the case where the photographer presses the operation SW 515, the operation mode enters the tracking selectable mode. Since the icon 504 for specifying the subject is displayed in the center of the screen, the photographer can set the subject 503 at the display position of the icon 504 as the tracking target subject by, for example, operating SW1 of the release switch 104.
In either of the methods shown in fig. 5A and 5B, once a subject is specified, processing is performed to detect regions having similar feature amounts in subsequently obtained images, using feature amounts such as the hue distribution or the size of the hue region. Fig. 5C shows an example in which the main subject region extracted as a region with similar feature amounts is presented to the photographer as the tracking region indicated by the frame 506. In the case where the photographer designates a subject image within the display screen in order to track a subject, the following problem may occur before the subject is designated.
For example, in the case where the focal length of the lens is as large as 1000 mm or more, the subject image may move under the influence of hand shake, and it may be difficult to keep the subject within the angle of view for framing. In order to alleviate this difficulty, the characteristic may be changed to reduce the thrust toward the center of the control range in the offset subtractor 110, or the cutoff frequency of the integration filter unit 111 may be reduced to enlarge the blur correction control range. However, when the shake suppression characteristic for hand-shake correction is enhanced, image blur correction is applied to the amount of hand shake that occurs when the photographer intentionally operates the camera to bring the subject within the angle of view. Thus, under the influence of the image blur correction control, it may be difficult to capture the subject within the angle of view again or to finely adjust the subject image so that it is located near the center of the captured image. For example, before the photographer specifies an object in the state in which the operation mode has been set to the tracking selectable mode by operating the operation SW 515 in fig. 5B, the photographer needs to be able to frame easily, for example to move the icon 504 to the position of the object 503 in order to set the tracking target object. When the operation mode is set to the tracking selectable mode, the shake suppression characteristic for blur correction is made stronger than when the tracking selectable mode is not set. Since the photographer is unlikely to pan widely in the state in which the tracking selectable mode is set, the panning determination unit 113 also raises the determination threshold. In this way, the photographer can easily frame so that the icon 504 is located at the position of the object 503.
After the subject is specified, correction that keeps the subject image at a specific position (for example, the center position) within the captured image by tracking control is performed simultaneously with image blur correction. In this case, the shake suppression characteristic for image blur correction is enhanced to suppress the influence of hand shake as much as possible, and is made even stronger than before the object was specified. When tracking of the specified object is started, since it is still unlikely that the photographer will pan widely, the panning determination unit 113 raises the determination threshold above the threshold used before the object was specified.
In the following description, a default state in which the tracking selectable mode is not set will be referred to as a first state, and a state in which the tracking selectable mode is set, the subject is not selected, and the tracking SW121 is in an OFF state will be referred to as a second state. A state in which the tracking selectable mode is set, the subject is selected, and the tracking SW121 is in the ON state will be referred to as a third state (tracking mode setting state). The image blur correction effect in the second state is stronger than that in the first state. Further, the image blur correction effect in the third state is stronger than that in the second state. The cutoff frequency of the integration filter unit 111 in the first state will be represented by fc1, and the panning determination threshold will be represented by Ah 1. The cutoff frequency of the integration filter unit 111 in the second state will be represented by fc2, and the panning determination threshold will be represented by Ah 2. The cutoff frequency of the integration filter unit 111 in the third state will be represented by fc3, and the panning determination threshold will be represented by Ah 3. The cutoff frequency and the panning determination threshold satisfy the following relationships expressed by expressions (2) and (3).
fc1>fc2>fc3...(2)
Ah1<Ah2<Ah3...(3)
The degree of the image blur correction effect is changed from the first state in which the tracking selectable mode is not set, to the second state in which the tracking operation is not yet performed, and then to the third state in which the tracking operation is performed. Thus, optimal image blur correction control can be performed while an object is specified and tracked, and shooting can be performed with the object smoothly tracked.
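A sketch of this state-dependent parameter selection is shown below; the numeric values are invented for illustration, and only the orderings fc1 > fc2 > fc3 and Ah1 < Ah2 < Ah3 from expressions (2) and (3) come from the text.

```python
# Hypothetical per-state anti-shake parameters (values invented; only the
# ordering of cutoffs and panning thresholds follows expressions (2) and (3)).
STATE_PARAMS = {
    "first":  {"cutoff_hz": 1.0, "pan_threshold": 10.0},  # selectable mode off
    "second": {"cutoff_hz": 0.5, "pan_threshold": 20.0},  # mode on, no subject
    "third":  {"cutoff_hz": 0.1, "pan_threshold": 40.0},  # subject being tracked
}

def configure_antishake(state, integ_filter, pan_detector):
    p = STATE_PARAMS[state]
    integ_filter.set_cutoff(p["cutoff_hz"])       # lower fc -> stronger effect
    pan_detector.threshold = p["pan_threshold"]   # higher -> panning flagged less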
Next, a process of changing the degree of the image blur correction effect according to the subject detection position will be described. In the case where the trackable range of the subject tracking control is sufficient, the tracking control is performed so that the subject image moves near the center of the screen. However, since the trackable range is limited, in the case where the control position reaches the end (limit position) of the trackable range, further tracking control cannot be performed. In this case, the photographer performs framing of the camera so that the subject image moves to the vicinity of the center of the screen. In this case, in the case of performing image blur correction on the amount of hand shake that occurs when the photographer intentionally operates the camera, adverse effects may occur in the correction. That is, it may be difficult for the photographer to capture the subject within the angle of view again or to finely adjust the position of the subject image so that the subject image is located at the center of the captured image. Therefore, in the present embodiment, a process of changing the cutoff frequency (fc4) of the integration filter unit 111 in accordance with the subject detection position detected by the subject detection unit 117 is performed. A processing example will be described with reference to fig. 6.
Fig. 6 shows a diagram illustrating setting of the cutoff frequency fc4 (vertical axis) according to the object detection position (horizontal axis). Coordinates of the object detection position in both axial directions, i.e., a vertical direction and a horizontal direction of the screen, are detected. As for the lens control, the cutoff frequency fc4 of the integration filter unit 111 corresponding to the pitch control in the vertical direction and the yaw control in the horizontal direction is set. Further, it is assumed that the subject detection position represents the position of the subject in the coordinates in which the center position (target position) of the angle of view is set to zero.
Fig. 6 shows a table having such a characteristic that the cutoff frequency fc4 is larger as the object position is farther from the center position of the angle of view. P1 and P2 are thresholds for a positive range of values, and-P1 and-P2 are thresholds for a negative range of values. When the object detection position is in the range of-P1 to + P1, the cutoff frequency fc4 is set to the default value D1. In the case where the object detection position is equal to or greater than P2 or equal to or less than-P2, the cutoff frequency fc4 is set to the frequency D2. Here, D2> D1. In the case where the object detection position is within the interval between P1 and P2 or the interval between-P1 and-P2, the cutoff frequency fc4 is set to a value calculated by linear interpolation between D1 and D2 using a linear equation.
The blur prevention characteristic change unit 112 compares the calculated cutoff frequency fc4 with the cutoff frequency of the integration filter unit 111 determined according to the tracking option mode and the state of the tracking SW 121. As a result of the comparison, the higher cutoff frequency is set as the final cutoff frequency of the integration filter unit 111.
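The fc4 characteristic of Fig. 6 and the comparison just described can be sketched as below, applied per axis (the vertical subject position for pitch control, the horizontal position for yaw control); the parameter names follow the text, while the function names are introduced here for illustration.

```python
def position_cutoff(pos, p1, p2, d1, d2):
    """fc4 per Fig. 6: default D1 inside +/-P1, D2 (> D1) beyond +/-P2,
    linear interpolation in between. `pos` is the subject detection position
    in coordinates where the center of the angle of view is zero."""
    a = abs(pos)
    if a <= p1:
        return d1
    if a >= p2:
        return d2
    return d1 + (d2 - d1) * (a - p1) / (p2 - p1)

def final_cutoff(state_fc, subject_pos, p1, p2, d1, d2):
    # The blur prevention characteristic change unit adopts the higher of the
    # state-derived cutoff and the position-derived fc4.
    return max(state_fc, position_cutoff(subject_pos, p1, p2, d1, d2))
```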
By changing the cutoff frequency of the integration filter unit 111 in accordance with the subject position in this way, the image blur correction effect when the subject image is near the center of the screen is enhanced, and the subject image can be prevented from drifting away from the center of the screen under the influence of hand shake. Conversely, when the subject image has deviated from the center of the screen, a state can arise in which the tracking control cannot move the subject image back to the center (for example, when the control position is near the end of the trackable range, i.e., the position at which further correction is restricted). In this case the image blur correction effect is slightly weakened, so that the photographer can move the subject image to the vicinity of the screen center by framing. That is, image blur correction working against the photographer's framing is prevented, and framing is made easier.
A processing example of the blur correction control and the object tracking control will be described with reference to a flowchart shown in fig. 7. The following processing is started with the main power supply of the image pickup apparatus 101 turned on, and is executed at a fixed sampling period.
First, in S701, the CPU 105 determines whether the blur prevention SW (switch) is in the ON state. The blur prevention SW is an operation switch indicating whether the user performs image blur correction control. In the case where the blur prevention SW is in the ON state, the flow proceeds to S702. In the case where the blur prevention SW is in the OFF state, the flow proceeds to S703. In S702, the CPU 105 acquires the output of the angular velocity meter 103, and the flow proceeds to S704. In S703, the output of the angular velocity meter 103 is not acquired, and the flow proceeds to S704 after the angular velocity is set to zero. In the case where the angular velocity is set to zero, the subsequent blur correction calculation result (i.e., blur correction amount) becomes zero.
In S704, the CPU 105 determines whether the tracking selectable mode is set to ON, for example by checking the operation SW 515 in fig. 5B. In the case where the tracking selectable mode is set to ON, the flow proceeds to S705; in the case where it is set to OFF, the flow proceeds to S708. In S705, the CPU 105 determines whether the tracking SW121 is in the ON state. Whether the tracking SW121 is ON or OFF may be determined based on, for example, whether a subject has been selected by one of the methods described with reference to fig. 5A to 5C. In the case where the tracking SW121 is in the ON state, the flow proceeds to S706. In the case where the tracking SW121 is in the OFF state, the flow proceeds to S707.
In S706, the cutoff frequency of the integration filter unit 111 (denoted by fa) is set to fc3. Further, the panning determination unit 113 sets the panning determination threshold (denoted by Tha) to Ah3, and the flow proceeds to S709. In S709, the subject detection unit 117 detects the subject position in the captured image. In this example, at the time point at which the tracking SW121 is determined to be ON in S705, a subject image on the LCD screen displaying the live-view captured image has been specified by the user's operation. That is, the image pickup apparatus 101 recognizes the object, and acquires the coordinates of the detection position of the object by tracking and detecting it.
Subsequently, in S710, the tracking amount calculation unit 118 calculates a tracking correction amount from the detected object position, and the flow advances to S711. In S711, the CPU 105 calculates the cutoff frequency fc4 of the integration filter unit 111 based on the detected object position. In S712, the CPU 105 determines whether the current cutoff frequency fa is smaller than fc4. In the case of fa < fc4, the flow advances to S713. In the case of fa ≥ fc4, the flow proceeds to S715. In S713, the CPU 105 substitutes fc4 into fa, and the flow advances to S715.
In S708, the CPU 105 sets the cutoff frequency fa of the integration filter unit 111 to fc1. The panning determination unit 113 sets the panning determination threshold Tha to Ah1, and the flow advances to S714. In S707, the CPU 105 sets the cutoff frequency fa of the integration filter unit 111 to fc2. The panning determination unit 113 sets the panning determination threshold Tha to Ah2, and the flow advances to S714. After the tracking correction amount is set to zero in S714, the flow proceeds to S715.
In S715, the blur correction angle calculation unit 109 calculates the blur correction amount. In this case, the angular velocity acquired in S702 or S703, the cutoff frequency fa of the integration filter unit 111, and the panning determination threshold Tha of the panning determination unit 113 set in S705 to S713 are used. After the blur correction amount is calculated, the flow proceeds to S716. In S716, the adder 115 adds the blur correction amount and the tracking correction amount to calculate the lens driving amount. Subsequently, in S717, the CPU 105 controls the driving of the correction lens 108 based on the lens driving amount by means of the drive control unit 116. In this way, image blur correction and object tracking control are performed. When the image blur correction process and the object tracking control process end, the process stands by until the next sampling time point.
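The decision flow of S701 to S717 can be summarized in the following C++ sketch. All type names, stub functions, and numeric values are illustrative assumptions; the real calculations are performed by the units described above (the blur correction angle calculation unit 109, the tracking amount calculation unit 118, and so on).

struct Inputs {
    bool blurPreventionSw;        // S701: state of the blur prevention SW
    bool trackingSelectableMode;  // S704: tracking selectable mode ON/OFF
    bool trackingSw;              // S705: state of the tracking SW121
    double angularVelocity;       // output of the angular velocity meter 103
    double subjectCenterDist;     // detected subject distance from screen center
};

// Assumed illustrative values for the cutoff frequencies and panning thresholds.
const double fc1 = 0.5, fc2 = 0.3, fc3 = 0.1;
const double Ah1 = 1.0, Ah2 = 2.0, Ah3 = 3.0;

double trackingAmount(double dist) { return 0.5 * dist; }   // stub for S710
double cutoffFc4(double dist)      { return fc3 + dist; }   // stub for S711
double blurCorrection(double w, double fa, double tha) {
    (void)tha;                 // the panning threshold would be used in S715
    return w / (1.0 + fa);     // placeholder for the integration filter of S715
}

double lensDriveAmount(const Inputs& in) {
    double omega = in.blurPreventionSw ? in.angularVelocity : 0.0;  // S701-S703
    double fa = fc1, tha = Ah1;                                     // S708 (mode OFF)
    double tracking = 0.0;                                          // S714
    if (in.trackingSelectableMode) {                                // S704
        if (in.trackingSw) {                                        // S705
            fa = fc3; tha = Ah3;                                    // S706
            tracking = trackingAmount(in.subjectCenterDist);        // S709-S710
            double fc4 = cutoffFc4(in.subjectCenterDist);           // S711
            if (fa < fc4) fa = fc4;                                 // S712-S713
        } else {
            fa = fc2; tha = Ah2;                                    // S707
        }
    }
    // S717: the drive control unit 116 drives the correction lens by this amount.
    return blurCorrection(omega, fa, tha) + tracking;               // S715-S716
}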
The image position control apparatus of the present embodiment calculates an object tracking amount based on the object position in the captured image. The object tracking amount is a control amount with which the subject image is moved to a specific position (for example, the center position or a position designated by the user) within the screen. Further, the image position control apparatus performs calculation based on the angular velocity output serving as a blur detection signal to calculate a blur correction amount (image blur correction amount). The blur correction amount and the object tracking amount are added, and image blur correction and object tracking control are performed by controlling the driving of the correction lens based on the added value. In this case, the characteristic for calculating the blur correction amount is changed according to the control state of the object tracking. Namely, the following processing is performed: the degree of the image blur correction effect is changed based on the determination result information indicating whether the tracking selectable mode is set and on the ON/OFF setting information of the tracking switch, and is also changed according to the subject position. Therefore, image blur correction control that enables the photographer to easily perform framing can be realized, and the performance of the image blur correction control and the automatic object tracking control can be improved.
In the present embodiment, an application example of so-called optical blur prevention control has been described, in which a correction lens serving as the image blur correction and automatic object tracking control unit is moved in a plane perpendicular to the optical axis. However, the present invention may also be applied to an image position control apparatus having any of the following structures.
(1) A structure for moving the image pickup element in a plane perpendicular to the optical axis.
(2) A structure for performing electronic blur correction by image processing that changes the cut-out position of each captured frame output from the image pickup element.
(3) A structure including a mechanism for rotating a lens barrel including an image pickup element and a photographing lens group.
(4) A structure combined with a driving mechanism provided separately from the image pickup apparatus (e.g., a rotating pan/tilt head for performing panning or tilting of the image pickup apparatus).
(5) A combination of the structures described in (1) to (4).
The same is true for a third embodiment described later.
(second embodiment)
Next, a second embodiment of the present invention will be described. In the present embodiment, image blur correction and automatic object tracking will be described for a structure including the correction lens 108 and a mechanism for rotating a lens barrel having the photographing lens group and the image pickup element 106. In the present embodiment, the same constituent elements as those of the first embodiment are denoted by the reference numerals used previously, a detailed description thereof is omitted, and differences are mainly described. The same applies to the embodiments described later.
Fig. 8A and 8B are diagrams schematically illustrating the image pickup apparatus 101 according to the present embodiment. In fig. 8A and 8B, among the X, Y, and Z axes of the three-dimensional orthogonal coordinate system, the direction around the X axis is defined as the pitch direction, and the direction around the Y axis is defined as the yaw direction. The Z-axis direction is the optical axis direction of the image pickup apparatus 101. The image pickup apparatus 101 illustrated in fig. 8A includes an operation member such as a release switch 104 and a display unit that displays a captured image. The display unit, using an LCD or the like, has a monitor function of displaying the captured image in real time in live view. The lens barrel 801 includes a photographing lens group and the image pickup element 106, and is mounted so as to be drivable with respect to the image pickup apparatus 101. That is, a mechanism for rotating (tilting) the lens barrel 801 with respect to the image pickup apparatus 101 is provided.
Fig. 8B shows a structure associated with rotation of the lens barrel 801. The drive mechanism 802p includes a motor that rotates the lens barrel 801 in the pitch direction and a drive control unit thereof. The drive mechanism 802y includes a motor that rotates the lens barrel 801 in the yaw direction and a drive control unit thereof. The posture of the lens barrel 801 can be independently controlled in the pitch direction and the yaw direction by the drive mechanisms 802p and 802y.
Fig. 9 is a diagram illustrating the structure of a main portion of the image pickup apparatus 101. Differences between this structure and the structure of the first embodiment described in fig. 2 will be described below.
(1) A rotation driving mechanism 802 for rotating the lens barrel 801 using a motor is provided.
(2) The adder 115 that adds the blur correction amount and the tracking correction amount is removed, and the blur correction amount output by the sensitivity adjustment unit 114 is input to the drive control unit 116.
(3) A drive control unit 901 for driving the rotation drive mechanism 802 is added, and the tracking correction amount output by the tracking amount calculation unit 118 is input to the drive control unit 901.
(4) The drive control unit 901 performs automatic object tracking control by driving the rotation drive mechanism 802 based on the tracking correction amount calculated by the tracking amount calculation unit 118.
The image position control apparatus of the present embodiment performs image blur correction using the correction lens 108, and performs automatic object tracking control using the rotation drive mechanism 802. In this case, the same advantages as those of the first embodiment are obtained, and image blur correction control that enables the photographer to easily perform framing can be performed.
In the present embodiment, the angular velocity meter 103 is mounted to the lens barrel 801 or to a portion of the image pickup apparatus 101 other than the lens barrel 801. In the case where the angular velocity meter 103 is attached to the lens barrel 801, the relative angular velocity of the lens barrel 801 with respect to the fixed portion of the image pickup apparatus 101 is subtracted from the output of the angular velocity meter 103. The relative angular velocity between the image pickup apparatus 101 and the lens barrel 801 is the rotational velocity of the lens barrel 801 rotated by the rotation driving mechanism 802, and is detected based on a driving instruction signal of a motor or by a rotation detection sensor or the like. By subtracting the relative angular velocity from the output of the angular velocity meter 103, the shake amount of the image pickup apparatus 101 can be calculated.
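The subtraction described above amounts to the following relation; the function below is a trivial C++ sketch with illustrative names.

// Sketch: recover the shake of the camera body when the angular velocity
// meter 103 is mounted on the rotatable lens barrel 801. The barrel's own
// rotation rate comes from the motor drive instruction signal or from a
// rotation detection sensor.
double cameraShakeRate(double meterOutput, double barrelRotationRate) {
    return meterOutput - barrelRotationRate;
}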
In this embodiment, the correction lens and its driving mechanism function as the image blur correction unit, and the lens barrel 801 including the image pickup element and the photographing lens group, together with its rotation driving mechanism, functions as the automatic object tracking unit. The present invention is not limited thereto, and the following structures may also be applied.
(1) A structure including a mechanism for moving the image pickup element in a plane perpendicular to the optical axis and a mechanism for driving the correction lens.
(2) A structure including a processing unit that changes the image cut-out position of each captured frame output from the image pickup element, and a mechanism for driving the correction lens.
(3) A structure including a processing unit that changes the image cut-out position of each captured frame output from the image pickup element, and a mechanism for moving the image pickup element in a plane perpendicular to the optical axis.
(4) A structure including a mechanism for moving the image pickup element in a plane perpendicular to the optical axis, and a mechanism for rotating a lens barrel including a photographing lens group.
(5) A structure including a processing unit that changes the image cut-out position of each captured frame output from the image pickup element, and a mechanism for rotating a lens barrel including a photographing lens group.
(6) A combination of a plurality of the structures described in (1) to (5).
(third embodiment)
Next, a third embodiment of the present invention will be described. The image blur correction and automatic subject tracking apparatus of the present embodiment changes the characteristic of image blur correction based on subject detection information used for automatic subject tracking. The subject detection information is information indicating the subject detection state, such as whether the subject is being detected or has been lost, and information relating to the reliability of subject detection. Hereinafter, image blur correction and automatic object tracking control that take the object detection information into consideration will be described.
Fig. 10 is a diagram showing the structure of a main part of the image pickup apparatus according to the present embodiment. Differences of the structure from that of the first embodiment described in fig. 2 will be described below.
(1) The object detection unit 117 includes an object detection state determination unit 1001 and an object reliability determination unit 1002 in addition to the object position detection unit 1000, and inputs detection information and determination information to the tracking gain unit 1003, the blur prevention characteristic change unit 112, and the tracking amount calculation unit 118.
(2) A tracking gain unit 1003 is added, and the gain is multiplied by a tracking correction amount that is an output of the tracking amount calculation unit 118, and the multiplied tracking correction amount is input to the adder 115.
The determination information items obtained by the object detection state determination unit (hereinafter referred to as the detection state determination unit) 1001 and the object reliability determination unit (hereinafter referred to as the reliability determination unit) 1002 are input to the tracking gain unit 1003, and the tracking gain unit 1003 sets a gain according to each determination result. The gain is set to a value not less than 0 and not more than 1. The tracking gain unit 1003 and the blur prevention characteristic change unit 112 will be described in detail below.
The object position detection unit 1000 corresponds to the object detection unit 117 of the first embodiment, and detects the object position using, for example, face detection or pattern matching. The detection state determination unit 1001 determines whether the object is in the first detection state or the second detection state. The first detection state is a state in which the object has been detected. The second detection state is a state in which the object is not detected because the object has been lost (object lost state). For example, the state may be determined based on whether the object position detection unit 1000 has successfully detected the object.
Since the tracking control can be performed in the first detection state, the tracking gain unit 1003 sets the gain to 1. That is, the tracking correction amount calculated by the tracking amount calculation unit 118 is input to the adder 115 as it is. Further, since the subject tracking control is performed, the image blur correction effect is enhanced. As described in the first embodiment, the characteristics of the offset subtractor 110 and the integration filter unit 111 are set according to the state of the tracking SW121 and the determination result of the panning determination unit 113, and the blur correction amount is calculated.
On the other hand, in a case where the detection state determination unit 1001 has determined that the object is in the second detection state, it is preferable to perform control to stop object tracking. That is, when it is determined that the object is in the second detection state, the gain is preferably set to 0. Further, the detection state determination unit 1001 may determine the object detection state from among three or more states instead of the two states of the first detection state and the second detection state. For example, the first detection state is defined as a state in which the possibility that the object has been detected is high, and the second detection state is defined as a state in which the possibility that the object has not been detected is high. Further, the third detection state is a state in which the possibility of detecting the object is lower than in the first detection state and higher than in the second detection state. For example, for a predetermined period after the object is lost, the detection state determination unit 1001 regards the state as one in which the object is lost but the search for the object continues (the third detection state); if the object is detected again during this object lost state, the state returns to the first detection state as a re-detection state. In the case where the object lost state continues for the predetermined period, the detection state determination unit 1001 stops the search for the object and transitions to the second detection state as an object non-detection state. For example, the third detection state may be defined so that the possibility of detecting the object decreases as time elapses after the object is lost. In this case, the tracking gain unit 1003 sets the gain to 1 in the case where the object is in the first detection state, sets the gain to 0 in the case where the object is in the second detection state, and sets the gain to a value larger than 0 and smaller than 1 in the case where the object is in the third detection state.
The detection state determination unit 1001 may determine the detection state from a plurality of states set according to the possibility of detecting the object between the first detection state and the second detection state. In this case, the tracking gain unit 1003 sets the gain to gradually decrease as the possibility of detecting the object decreases, eventually reaching zero. Since the tracking control is not performed in the second detection state, the setting is made so that the subject image can be moved to the vicinity of the center of the screen by the framing operation of the photographer. That is, the blur prevention characteristic change unit 112 slightly weakens the image blur correction effect so that the image blur correction does not counteract the framing operation, and changes the blur prevention characteristic so that the photographer can easily perform framing. In this case, the blur prevention characteristic facilitates framing while maintaining the image blur correction effect at a certain intensity.
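A gain selection along these lines can be sketched in C++ as follows. The linear decay with elapsed time in the third detection state is an assumption introduced for illustration; the text only requires a value larger than 0 and smaller than 1.

#include <algorithm>

// Detected = first state, Searching = third state, Lost = second state.
enum class DetectionState { Detected, Searching, Lost };

double trackingGain(DetectionState s, double secondsSinceLost, double searchPeriod) {
    switch (s) {
    case DetectionState::Detected:  return 1.0;  // object detected: track normally
    case DetectionState::Lost:      return 0.0;  // object lost: stop tracking
    case DetectionState::Searching:              // searching after loss
        // Assumed: the gain shrinks as re-detection becomes less likely.
        return std::clamp(1.0 - secondsSinceLost / searchPeriod, 0.0, 1.0);
    }
    return 0.0;
}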
The cutoff frequency of the integration filter unit 111 is set to fc2, and the panning determination threshold is set to Ah2. This setting is the same as that in the case where the subject is not specified and the tracking SW is in the OFF state, shown in S707 of fig. 7. The blur prevention characteristic change unit 112 changes the blur prevention characteristic in accordance with the determination result regarding the object detection state obtained by the detection state determination unit 1001, the state of the tracking SW121, and the determination result of the panning determination unit 113. That is, the blur prevention characteristic that most reduces the image blur correction effect is selected between the blur prevention characteristic corresponding to the determination result regarding the object detection state and the blur prevention characteristic corresponding to the state of the tracking SW121 and the determination result of the panning determination unit 113. The offset subtractor 110 and the integration filter unit 111 are set according to the selected blur prevention characteristic.
The reliability determination unit 1002 determines whether the detected object is reliable. For example, the reliability of the object may be determined based on the size of the object image in the captured image. Alternatively, the reliability of the main subject as the determination target may be calculated based on the degree of matching between the pattern of the subject image stored when the subject was specified and the detected subject image. Further, if there are a plurality of subjects having the same pattern as the detected subject, the calculated reliability is low because the possibility of detecting a wrong main subject is high. The reliability determination unit 1002 compares the calculated reliability with a threshold to determine whether the detected object is reliable. In the case where the reliability of the object is high (equal to or higher than the threshold), the object tracking control can be performed. In this case, the tracking gain unit 1003 sets the gain to 1. The tracking correction amount calculated by the tracking amount calculation unit 118 is input to the adder 115 as it is. In order to improve the image blur correction effect during tracking control, the characteristics of the offset subtractor 110 and the integration filter unit 111 are changed, and the blur correction amount is calculated.
On the other hand, in the case where the reliability determination unit 1002 determines that the reliability of the object is less than the threshold (i.e., the reliability is low), control is performed to stop tracking, because the photographer may not desire tracking control in such a situation. The reliability determination unit 1002 preferably evaluates the reliability in a plurality of steps using a plurality of thresholds, instead of evaluating it in two steps using a single threshold. In this case, in the case where the calculated reliability is equal to or higher than the maximum threshold (the reliability is evaluated as highest), the tracking gain unit 1003 sets the gain to 1, and gradually decreases the gain according to the reliability. Further, in the case where the reliability is less than the minimum threshold, the gain is finally set to zero. In this case, the blur prevention characteristic is changed to slightly weaken the image blur correction effect. That is, the cutoff frequency of the integration filter unit 111 is set to fc2, and the panning determination threshold is set to Ah2. In this case, the blur prevention characteristic facilitates framing while maintaining the image blur correction effect at a certain intensity. The blur prevention characteristic change unit 112 changes the blur prevention characteristic in accordance with the determination result of the reliability determination unit 1002, the determination result regarding the object detection state obtained by the detection state determination unit 1001, the state of the tracking SW121, and the determination result of the panning determination unit 113. That is, the blur prevention characteristic corresponding to the object reliability as the determination result of the reliability determination unit 1002, the blur prevention characteristic corresponding to the object detection state, and the blur prevention characteristic corresponding to the state of the tracking SW121 and the determination result of the panning determination unit 113 are compared. Among these blur prevention characteristics, the one that most reduces the image blur correction effect is selected. The offset subtractor 110 and the integration filter unit 111 are set according to the selected blur prevention characteristic.
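The reliability-based gain and the selection of the weakest blur prevention characteristic can be sketched as follows. The linear interpolation between the thresholds, and the assumption that a lower panning determination threshold weakens the correction effect, are illustrative choices made here rather than statements of the embodiment.

#include <algorithm>

// (a) Gain from reliability: 1 at or above the maximum threshold, 0 below
//     the minimum threshold, decreasing in between (assumed linear here).
double reliabilityGain(double reliability, double thMin, double thMax) {
    if (reliability >= thMax) return 1.0;
    if (reliability < thMin)  return 0.0;
    return (reliability - thMin) / (thMax - thMin);
}

// (b) Among candidate characteristics, pick the one that most reduces the
//     correction effect: the highest cutoff frequency and (assumed) the
//     lowest panning determination threshold.
struct Characteristic { double cutoffHz; double panThreshold; };

Characteristic weakestEffect(Characteristic a, Characteristic b, Characteristic c) {
    Characteristic r = a;
    for (const Characteristic& x : {b, c}) {
        r.cutoffHz     = std::max(r.cutoffHz, x.cutoffHz);
        r.panThreshold = std::min(r.panThreshold, x.panThreshold);
    }
    return r;
}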
The tracking correction amount output from the tracking gain unit 1003 is input to the adder 115, and is added to a target value of the blur correction amount that is the output of the sensitivity adjustment unit 114. In this way, tracking correction and image blur correction can be performed simultaneously by driving the correction lens 108.
In the present embodiment, the characteristics of the image blur correction are changed according to the object detection state, the object reliability, or both. Therefore, image blur correction control that enables the photographer to easily perform framing can be realized, and the performance of image blur correction and automatic object tracking control can be improved.
(fourth embodiment)
Next, a fourth embodiment of the present invention will be described. The image blur correction and automatic subject tracking apparatus according to the present embodiment determines whether tracking control is possible based on subject detection information used for automatic subject tracking, and displays the determination result on the screen of the display unit. In this way, it is possible to notify the photographer whether the image pickup apparatus can track the object, and it is possible to realize image blur correction and object tracking control that enable the photographer to easily perform framing.
In a system that tracks an object such that the position of the object is maintained at a specific position on the shooting screen, the tracking control automatically keeps the object near the center of the angle of view during a tracking operation, so the photographer does not need to perform a framing operation to follow the object. However, such object tracking is performed by moving a part of the optical system, and the trackable range is therefore limited. Since further tracking cannot be performed once the optical system reaches the end of the trackable range, the photographer then needs to move the camera to follow the object. However, since it is difficult for the photographer to immediately recognize that the optical system has reached the end of the trackable range, the photographer may lose the object from the shooting screen because the framing operation comes too late. This may occur even when tracking is performed by changing the cut-out range of the frame.
In the present embodiment, it is determined whether or not the camera can perform tracking control, and the determination result is displayed on the screen of the display unit. In this way, the operability of the framing operation by the photographer can be improved.
Fig. 11 is a diagram showing the structure of a main part of the image pickup apparatus according to the present embodiment. This structure is different from the structure described in fig. 10 of the third embodiment in that it includes a tracking state determination unit 120 and a tracking icon control unit 123. The determination result obtained by the tracking state determination unit 120 is input to the tracking icon control unit 123, and the icon to be displayed on the display unit 122 is controlled by the tracking icon control unit 123.
In the case where the object tracking control is performed and the trackable range is sufficient, the tracking control is implemented such that the position of the object is maintained near the center of the screen. However, since the trackable range is limited, further tracking control cannot be performed once the optical system reaches the end of the trackable range. In this case, the photographer needs to frame the camera again to move the object toward the center. However, it is difficult for the photographer to immediately judge whether further tracking is possible only by viewing the live view image. Therefore, the photographer may fail to perform framing immediately after the correction lens 108 reaches the end of the trackable range, and the subject may be lost from the screen. Therefore, the image pickup apparatus 101 of the present embodiment determines whether tracking control is possible. Further, the tracking icon displayed on the screen is changed according to the determination result, and the photographer is notified whether tracking is possible. In this way, in a case where tracking is not possible, a warning notification is issued to prompt the photographer to follow the object through a framing operation.
Fig. 12A to 12H illustrate a camera rear view showing an object tracking operation and a tracking state and a timing chart showing a control state. In the case of performing object tracking, it is preferable that the photographer specifies an object on the screen and performs tracking in order to allow the photographer to photograph a desired object. As a method of specifying an object to be tracked, for example, as shown in fig. 12A, the following method may be used: the coordinate position of the object 503 touched is acquired by means of the touch panel 502 provided under the LCD 501 on the rear surface of the camera 101, and a tracking target object is set.
In the case where a subject is specified by the method shown in fig. 12A, the main subject region is tracked by detecting, from the images sequentially obtained through the live view operation, a region having a similar feature amount, using a feature amount such as the hue distribution or the hue size. Further, the main subject region is indicated as a tracking region by a frame 506 to notify the photographer of the tracking region. After the subject is specified, tracking control is performed to move the detected subject position toward the center of the screen. Further, a tracking icon 505 is displayed on the screen so that the photographer understands that tracking control is being performed. When subject tracking is not performed (i.e., no subject is specified), the tracking icon 505 is not displayed on the screen.
Fig. 12B is a diagram showing a state after the subject is specified. Fig. 12F shows the output of the adder 115 obtained by adding the blur correction amount and the tracking correction amount in the case where the state of the screen displayed by the image pickup apparatus 101 changes from fig. 12A to fig. 12B, 12C, 12D, and 12E with the elapse of time. Fig. 12G shows the amount of center shift as the output of the subtractor 403, and fig. 12H shows the display state of the tracking icon 505. In a case where the object position detection unit outputs the coordinates of the detected object using the target position as the center coordinates, the output of the object position detection unit may be regarded as the amount of center shift. Until time T1 at which the subject is specified, the tracking icon 505 is not displayed on the screen; in the case where the subject is specified as shown in fig. 12B at timing T1, the tracking icon 505 is displayed on the screen. Subsequently, even if the specified object 503 moves and deviates from the center of the image (fig. 12C), tracking control is performed based on the tracking correction amount calculated by the tracking amount calculation unit 118 so that the object 503 is returned to the center of the image (fig. 12D). In these figures, T2 denotes the timing at which the subject moves away from the center of the image, and T3 denotes the timing at which the subject returns to the center of the image by the tracking control.
Subsequently, in the case where the object moves further, the tracking control is performed with a further increased tracking correction amount; however, in a state where the correction amount of the correction lens 108 reaches the trackable threshold Th1 as shown in fig. 12F, the tracking correction amount is limited so as not to exceed the correctable amount Th2. In this case, the correction amount lies between Th1 and Th2, and the tracking icon 505 is displayed in gray at T5 to notify the photographer that further tracking control cannot be performed, where T5 is the time at which the state in which the correction amount is equal to or larger than the threshold Th1 has continued for the predetermined period TL or longer. That is, in the case where the state in which the correction amount is equal to or larger than the threshold Th1 has continued for the predetermined period TL or longer, the CPU 105 serving as the tracking state determination unit and the tracking icon control unit determines that tracking control cannot be performed, displays the tracking icon 505 in gray, and issues a warning instruction to send a warning to the photographer. The correctable amount Th2 is the amount at which the correction lens reaches the end of the movable range, and the trackable threshold Th1 may be appropriately set to a value smaller than Th2. For example, the trackable threshold Th1 may be set to a value obtained by multiplying the correctable amount Th2 by a coefficient larger than 0 and smaller than 1, or to a value obtained by subtracting a fixed amount from the correctable amount Th2.
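The two derivations of Th1 from Th2 mentioned above can be written directly; the function names below are illustrative.

// Sketch of the two ways of setting the trackable threshold Th1 below the
// correctable amount Th2 (the end of the movable range of the lens).
double th1ByRatio(double th2, double ratio)   { return th2 * ratio; }   // 0 < ratio < 1
double th1ByMargin(double th2, double margin) { return th2 - margin; }  // margin > 0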
Since further tracking control cannot be performed in the case where the correction lens 108 reaches the end of the movable range, the display of the tracking icon 505 is changed in the above-described manner, thereby prompting the photographer to frame the camera to move the subject toward the center of the image.
Next, another example of a method of sending a notification to the photographer by changing the tracking icon 505 in a case where the correction lens 108 exceeds the trackable range and tracking control is not possible will be described with reference to fig. 13A to 13H. The method for determining whether the tracking operation can be performed in the example of fig. 13A to 13H is different from that in the example of fig. 12A to 12H. In the case where the position of the object cannot be moved to the vicinity of the center of the image after the tracking control is started, and the object remains at a position away from the center for a long time, it can be determined that the tracking correction has reached the end of the movable range and that further tracking control cannot be performed. In this case, a warning icon is displayed to notify the photographer that tracking control cannot be performed.
Whether the tracking operation can be performed is determined in the following manner. That is, in the case where the period in which the amount of center shift, which is the distance between the object position on the image and the center of the image, exceeds the predetermined threshold Th3 is equal to or longer than the predetermined period T7, it is determined that tracking cannot be performed. In other words, in a case where the state in which the position of the subject image is separated from the target position by a predetermined distance or more has continued for a predetermined period or longer, it is determined that the subject cannot be tracked. Further, in the case where the amount of center shift is equal to or smaller than the threshold Th3, it is determined that tracking is possible. This determination is made by the tracking state determination unit 120. The tracking icon displayed on the screen is changed according to the determination result so that the photographer is notified of whether tracking is possible. In the case where tracking is not possible, the photographer is prompted to perform a framing operation to follow the object.
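The timing test can be sketched as a small C++ helper; the bookkeeping of the time at which the shift first exceeded Th3 is an implementation assumption.

// Sketch: tracking is judged impossible when the center shift has exceeded
// Th3 continuously for the period T7 or longer.
struct CenterShiftJudge {
    double th3;             // center-shift threshold
    double t7;              // required continuous duration
    double overSince = -1;  // time at which the shift first exceeded Th3 (-1: not over)

    bool trackingImpossible(double centerShift, double now) {
        if (centerShift <= th3) { overSince = -1; return false; }
        if (overSince < 0) overSince = now;   // the shift just exceeded Th3
        return (now - overSince) >= t7;       // continued for T7 or longer
    }
};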
As in the method described in fig. 12A, in the case where a subject is specified by the touch operation in fig. 13A, tracking control is started, and a tracking icon 505 is displayed on the screen.
Fig. 13B to 13E are diagrams showing states after the subject is specified. Fig. 13F shows the output of the adder 115 obtained by adding the blur correction amount and the tracking correction amount in the case where the state of the screen displayed by the image pickup apparatus 101 changes from fig. 13A to fig. 13B, 13C, 13D, and 13E with the elapse of time. Fig. 13G shows the amount of center shift, and fig. 13H shows the display state of the tracking icon 505. Until time T8 at which the subject is specified, the tracking icon 505 is not displayed on the screen; in the case where the subject is specified at timing T8, the tracking icon 505 is displayed on the screen. Subsequently, even if the specified object 503 moves and deviates from the center of the image (fig. 13C), tracking control is performed based on the tracking correction amount calculated by the tracking amount calculation unit 118 so that the object 503 is returned to the center of the image (fig. 13D). In these figures, T9 denotes the timing at which the subject moves away from the center of the image, and T10 denotes the timing at which the subject returns to the center of the image by the tracking control. Although the object position exceeds the threshold Th3 in the period between T9 and T10, the icon 505 is displayed in the normal color because the period in which the position exceeds Th3 does not exceed the predetermined period T7.
Subsequently, in the case where the object moves further, the tracking control is performed with a further increased tracking correction amount; however, in a situation where the state in which the amount of center shift is equal to or larger than the threshold Th3 as shown in fig. 13G continues for the predetermined period T7 or longer, the tracking icon 505 is displayed in gray to notify the photographer that further tracking control cannot be performed.
In this way, in the case where the object position deviates from the center of the image for a long time, the CPU 105 serving as the tracking state determination unit 120 determines that the tracking has reached the end of the movable range and that further tracking control cannot be performed. Further, the CPU 105 also functions as the tracking icon control unit 123 and changes the display of the tracking icon 505, thereby prompting the photographer to frame the camera so as to move the subject toward the center of the image.
In fig. 12A to 12H and fig. 13A to 13H, the following examples have been described: the icon is displayed in the case where it is difficult to perform the object tracking control because the object in the image moves a large distance and the correction amount for moving the object position to the target position in the image is large. However, tracking control may be difficult even if the correction amount is small. An example of a method of notifying the photographer by changing the display of the tracking icon 505 in a case where tracking control cannot be performed due to the object detection state will be described with reference to fig. 14A to 14H. When the reliability of the detected object is low, it may be determined that tracking control cannot be performed. In this case, an icon is displayed to indicate that tracking control cannot be performed.
The determination of the reliability is performed by the object reliability determination unit 1002, and it is determined whether or not the detected object is reliable. Since the object reliability determination by the object reliability determination unit 1002 is the same as that of the third embodiment, a detailed description thereof will not be provided.
As in the method described in fig. 12A, in a case where an object is specified by the touch operation in fig. 14A, tracking control is started, and a tracking icon 505 is displayed on the screen.
Fig. 14B is a diagram showing a state after the subject is specified. Fig. 14E shows the output of the adder 115 obtained by adding the blur correction amount and the tracking correction amount in the case where the state of the screen displayed by the image pickup apparatus 101 changes from fig. 14A to fig. 14B, 14C, and 14D with the elapse of time. Further, fig. 14F shows the amount of center shift, fig. 14G shows the reliability of the object, and fig. 14H shows the display state of the tracking icon 505. Until time T11 at which the subject is specified, the tracking icon 505 is not displayed on the screen; in the case where the subject is specified at time T11, the tracking icon 505 is displayed on the screen. Subsequently, even in a case where the specified object 503 deviates from the center of the image, tracking control is performed based on the tracking correction amount calculated by the tracking amount calculation unit 118 so that the object 503 is returned to the center of the image as illustrated in fig. 14B. In these figures, T11 denotes the timing at which the subject is specified and the subject position is detected, and T12 denotes the timing at which the subject is returned to the center of the image by the tracking control.
Subsequently, as shown in fig. 14C, in the case where the subject moves further, the tracking control is performed with a further increased tracking correction amount; however, in a state where a plurality of similar subjects appear in the screen as shown in fig. 14D, the reliability of the subject is set low. When the object reliability decreases, the gain of the tracking gain unit 1003 is set small, and tracking control is not performed. Therefore, as shown in fig. 14G, in the case where the object reliability is equal to or less than the predetermined threshold Th4, the tracking icon 505 is displayed in gray to notify the photographer that further tracking control cannot be performed.
In this way, in the case where the object reliability is low, the CPU 105 serving as the tracking state determination unit 120 determines that tracking control cannot be performed. Further, the CPU 105 also functions as the tracking icon control unit 123 and changes the display of the tracking icon 505, thereby prompting the photographer to frame the camera so as to move the subject toward the center of the image.
Next, another example of a method of notifying the photographer by changing the display of the tracking icon 505 in a case where tracking control cannot be performed due to the object detection state will be described with reference to fig. 15A to 15H. Although tracking control is performed while the object is detected after the object is specified, in a case where the object is lost, the state transitions to an object lost state. The state returns to the object detection state in a case where the object can be detected again during the object lost state; however, if the object lost state continues for a predetermined period, it is determined that tracking control cannot be performed. In this case, an icon is displayed to notify the photographer that tracking control cannot be performed. Further, in the case where the object lost state continues for a longer period, it is determined that there is no possibility that the object will be detected again, the object detection is stopped, and the tracking icon 505 is no longer displayed. The object detection state is determined by the object detection state determination unit 1001 based on detection of the object, loss of the object, or non-detection of the object.
In the same way as the method described in fig. 12A, in the case where the object is specified by the touch operation in fig. 15A, the tracking control is started, and the tracking icon 505 is displayed on the screen.
Fig. 15B is a diagram showing a state after an object is specified. Fig. 15E shows an output of the adder 115 obtained by adding the blur correction amount and the tracking correction amount in the case where the state of the screen displayed by the image pickup apparatus 101 is changed from fig. 15A to fig. 15B, 15C, and 15D with the elapse of time. Further, fig. 15F shows the center shift amount, fig. 15G shows the object detection state, and fig. 15H shows the display state of the tracking icon 505.
Until time T15 at which the subject is specified, the tracking icon 505 is not displayed on the screen; in the case where the subject is specified as shown in fig. 15B, the tracking icon 505 is displayed on the screen. Subsequently, even in a case where the specified object 503 moves to a position deviated from the center of the image, tracking control is performed based on the tracking correction amount calculated by the tracking amount calculation unit 118 so that the object 503 is returned to the center of the image as illustrated in fig. 15B. In these figures, T15 denotes the timing at which the subject is specified and the subject position is detected, and T16 denotes the timing at which the subject is returned to the center of the image by the tracking control.
Subsequently, as shown in fig. 15C, in the case where the subject moves further, the tracking control is performed with a further increased tracking correction amount; however, as shown in fig. 15D, in the case where the tracking target subject moves behind another subject and disappears from the screen, the subject detection unit 117 (subject position detection unit 1000) cannot detect the subject, and an object lost state results (T17). The object lost state is maintained for a predetermined period (T18) after the loss of the object, and in the case where the object 503 is detected again during the object lost state, the state returns to the object detection state. In the case where the predetermined period T18 has elapsed after the transition to the object lost state, it is determined that the object can no longer be detected, and the state transitions to an object detection stop state. Once the object detection stop state is entered, the state does not transition to the object detection state even if the object 503 appears in the screen again; to return to the object detection state from the object detection stop state, the object must be specified again. Since the object may be detected again during the object lost state, the object tracking control is continued for a predetermined period (T19) after the object is lost. The object tracking control is stopped after the predetermined period T19 has elapsed, and the last tracking correction amount obtained from the last sampling data is held as the tracking correction amount output by the tracking amount calculation unit 118. During the period until the end of T18, this last tracking correction amount is maintained; when T18 ends, the tracking control amount gradually approaches 0, and the correction lens returns to the position at which the object tracking control is not performed.
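The timing around subject loss (T17 to T19) can be sketched as a small state machine in C++. The exponential decay used after T18 is an assumption (the text only states that the tracking amount gradually approaches zero), and the handling of T19 is simplified here to holding the last amount.

enum class SubjectState { Detected, Lost, DetectionStopped };

struct LostStateMachine {
    double t18;                  // re-detection window after loss
    SubjectState state = SubjectState::Detected;
    double lostAt = 0.0;
    double held = 0.0;           // last tracking correction amount

    double update(bool found, double liveTracking, double now) {
        switch (state) {
        case SubjectState::Detected:
            if (found) { held = liveTracking; return held; }
            state = SubjectState::Lost; lostAt = now;      // T17: subject lost
            return held;
        case SubjectState::Lost:
            if (found) { state = SubjectState::Detected; return liveTracking; }
            if (now - lostAt >= t18)                       // T18 elapsed
                state = SubjectState::DetectionStopped;
            return held;                                   // hold the last amount
        case SubjectState::DetectionStopped:
        default:
            held *= 0.95;        // assumed decay: gradually approaches zero
            return held;
        }
    }
};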
In the object lost state (T18), the tracking icon 505 is displayed in gray to notify the photographer of the fact that tracking control cannot be performed. Further, when the time period T18 has elapsed from the object lost state and the object detection is stopped, the display of the tracking icon 505 is stopped.
In this way, the CPU 105 serving as the tracking state determination unit 120 determines, according to the object detection state, that tracking control cannot be performed. Further, the CPU 105 also functions as the tracking icon control unit 123 and changes the display of the tracking icon 505 (including stopping the display of the tracking icon), so that the photographer can be prompted to frame the camera to move the subject toward the center of the image and to set the tracking target subject again.
Next, a method of sending a notification to the photographer by changing the tracking icon 505 in a case where tracking control cannot be performed because the magnitude of the shake of the camera 101 exceeds the magnitude that can be corrected by image blur correction will be described with reference to fig. 16A to 16H. When the detected shake amount of the camera 101 is large, it is determined that tracking control cannot be performed. In this case, an icon is displayed to indicate that tracking control cannot be performed.
In the same way as the method described in fig. 12A, in the case where the object is specified by the touch operation in fig. 16A, the tracking control is started, and the tracking icon 505 is displayed on the screen.
Fig. 16B is a diagram showing a state after the subject is specified. Fig. 16D shows the output of the adder 115 obtained by adding the blur correction amount and the tracking correction amount in the case where the state of the screen displayed by the image pickup apparatus 101 changes from fig. 16A to fig. 16B and 16C with the elapse of time. Further, fig. 16E shows the amount of center shift, fig. 16F shows the output of the angular velocity meter 103, fig. 16G shows the blur correction angle as the output of the blur correction angle calculation unit 109, and fig. 16H shows the display state of the tracking icon 505.
Until time T22 at which the subject is specified, the tracking icon 505 is not displayed on the screen; in the case where the subject is specified as shown in fig. 16B, the tracking icon 505 is displayed on the screen. Subsequently, even in a case where the specified object 503 deviates from the center of the image, tracking control is performed based on the tracking correction amount calculated by the tracking amount calculation unit 118 so that the object 503 is returned to the center of the image as illustrated in fig. 16B. In these figures, T22 denotes the timing at which the subject is specified and the subject position is detected, and T23 denotes the timing at which the subject is returned to the center of the image by the tracking control.
Subsequently, in a case where the shake amount of the image pickup apparatus 101 increases, the angular velocity output from the angular velocity meter 103 increases, and the blur correction angle calculated based on that output also increases. In this case, when the blur correction angle exceeds the blur correction limit, the captured image is blurred, and it is impossible to keep the object 503 at the specific position as shown in fig. 16C. The correction lens 108 cannot perform correction beyond the correction limits (Th5, -Th5). Therefore, in the case where the tracking state determination unit 120 determines at T24 that tracking control cannot be performed within the range between Th5 and -Th5, the tracking icon 505 is displayed in gray to notify the photographer that tracking control cannot be performed.
As a method of determining that tracking is impossible due to a large shake amount, a method of determining based on the angular velocity output from the angular velocity meter 103 may be used. The tracking state determination unit 120 determines that tracking is not possible if the period in which the angular velocity exceeds the threshold (above Th6 or below -Th6) within the predetermined period T20 is equal to or longer than a predetermined period. The determination may also be made based on the number of occurrences rather than on the period. In this case, in the case where the number of times the angular velocity detected at the predetermined sampling interval exceeds the threshold is equal to or larger than a predetermined number, it is determined that tracking cannot be performed.
As another method of determining that tracking is impossible due to a large shake amount, a method of performing the determination based on the blur correction angle that is the output of the blur correction angle calculation unit 109 may be used. In this case, the tracking state determination unit 120 determines that tracking is not possible if the period or the number of times during which the blur correction angle exceeds the predetermined threshold (above Th7 or below -Th7) within the predetermined period T21 is equal to or greater than the predetermined period or the predetermined number of times.
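Both the period-based and the count-based variants can be covered by a sliding-window counter like the following C++ sketch; the window implementation is an illustrative assumption.

#include <deque>

// Sketch: within a sliding window (T20 or T21), tracking is judged
// impossible when the number of samples whose magnitude exceeds the
// threshold (Th6 for the angular velocity, Th7 for the blur correction
// angle) reaches a preset count.
struct ShakeJudge {
    double threshold;              // Th6 or Th7
    double windowSec;              // T20 or T21
    int requiredCount;
    std::deque<double> overTimes;  // timestamps of over-threshold samples

    bool trackingImpossible(double value, double now) {
        if (value > threshold || value < -threshold) overTimes.push_back(now);
        while (!overTimes.empty() && now - overTimes.front() > windowSec)
            overTimes.pop_front();
        return static_cast<int>(overTimes.size()) >= requiredCount;
    }
};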
By either of these methods, a state in which large image blur appears continuously and in which image blur correction and tracking control are difficult to perform can be detected. Further, the display of the tracking icon 505 can be changed to notify the photographer that tracking control is not possible.
In this way, in the case where a shake of a magnitude that cannot be corrected by the blur correction of the image pickup apparatus occurs, the CPU 105 serving as the tracking state determination unit 120 determines that tracking control cannot be performed. Further, the CPU 105 also functions as the tracking icon control unit 123 and changes the display of the tracking icon 505 in response to the determination result of the tracking state determination unit 120, so that the photographer can be notified that the image pickup apparatus needs to be held steadily in order to reduce the shake of the camera 101.
As described above, in the present embodiment, the tracking state determination unit 120 determines whether tracking control is possible. In a case where it is determined that tracking is not possible, a warning is sent to the photographer, so that the photographer can be urged to frame the image pickup apparatus to move the subject toward the center of the image and to hold the image pickup apparatus steadily.
In the present embodiment, a method of sending a warning notification using the display of the tracking icon 505 has been described; however, the following methods may be used as other ways of sending a warning.
(1) A warning notification is sent using the tracking icon 505. Specifically, the notification that tracking control cannot be performed is realized by blinking the icon on and off. The state in which tracking control cannot be performed may be classified into several levels, and the level may be notified to the photographer by changing the blinking period according to the level.
(2) As shown in fig. 17, an LED 1101 (light emitting unit) may be provided in the camera 101, and a warning notification may be sent using the LED 1101. Specifically, the notification to the photographer that tracking control cannot be performed is realized by blinking the LED 1101 on and off. The state in which tracking control cannot be performed may be divided into several levels, and the photographer may be notified of the level by changing the blinking period according to the level. Further, in the case where tracking is possible, the LED 1101 may be turned on, and in the case where no subject is specified and tracking control is not performed, the LED 1101 may be turned off.
(3) As shown in fig. 18, a sheet-like actuator 1201 that generates vibration is provided at a position where the photographer often places his or her fingers when holding the camera. The notification to the photographer that tracking control cannot be performed is realized by vibrating the actuator 1201. A piezoelectric actuator such as a piezoelectric element is used as the actuator 1201. The state in which tracking control cannot be performed may be divided into several levels, and the photographer may be notified of the level by changing the magnitude of the vibration according to the level.
In fig. 12A to 12H through fig. 16A to 16H, examples of determining whether tracking control is possible using a single determination method have been described. However, whether tracking control is possible may be determined using a plurality of determination methods in combination. For example, it may be determined that tracking control is not possible in the case where the state in which the correction amount is equal to or larger than the trackable threshold Th1 continues for the predetermined period TL or longer, and it may also be determined that tracking control is not possible in the case where the state in which the positional shift amount is equal to or larger than the threshold Th3 continues for the predetermined period T7 or longer. These determination methods may be combined as appropriate. In this case, the warning display may be changed according to the determination method, or according to the reason why tracking control is not possible (a large correction amount, the subject detection state, or large shake). However, since a plurality of types of warning displays may appear complicated to the photographer, it is preferable to use the same warning display regardless of the determination method and the reason why tracking control cannot be performed. For example, whether the cause is a large correction amount or the subject detection state, it is preferable to use the same warning display. This is because even a single warning display can prompt the photographer to follow the object through a framing operation.
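The combination of the individual judgments then reduces to a logical OR with a single common warning output, as the paragraph above recommends; the function below is a trivial sketch with illustrative names.

// Sketch: OR-combination of the individual determinations; one common
// warning display is used regardless of which test detected the condition.
bool warnTrackingImpossible(bool byCorrectionAmount,  // Th1 / TL test
                            bool byCenterShift,       // Th3 / T7 test
                            bool byReliability,       // Th4 test
                            bool byShake) {           // Th5-Th7 tests
    return byCorrectionAmount || byCenterShift || byReliability || byShake;
}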
In the present embodiment, a so-called optical shift method of moving the correction lens in a plane perpendicular to the optical axis of the imaging optical system is used as the automatic object tracking unit. However, the present invention is not limited to this optical shift method; the following structures may also be adopted.
(1) A structure for moving the image pickup element in a plane perpendicular to the optical axis.
(2) A structure for changing the cut-out position of each captured frame output from the image pickup element.
(3) A structure for rotating a lens barrel including an image pickup element and a photographing lens group.
(4) A structure combined with a rotating pan/tilt head that is provided separately from the image pickup apparatus and that can pan and tilt the image pickup apparatus.
(5) A structure in which a plurality of the above structures are combined.
(other embodiments)
Embodiments of the present invention can also be realized by supplying software (a program) that performs the functions of the above-described embodiments to a system or an apparatus via a network or various storage media, and causing a computer (or a central processing unit (CPU) or micro processing unit (MPU)) of the system or apparatus to read out and execute the program.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
The present application claims the benefits of Japanese patent application No. 2015-244288 filed 12/15/2015, Japanese patent application No. 2015-239747 filed 12/8/2015 and Japanese patent application No. 2016-2016 filed 11/8/2016, which are all incorporated herein by reference.

Claims (13)

1. A control device, comprising:
a correction control unit configured to acquire a blur detection signal detected by a blur detection unit to calculate a correction amount of image blur, and to control an image blur correction unit configured to correct the image blur;
an object detection unit configured to detect a position of an object in a captured image and acquire position information of the object in the captured image; and
a setting unit configured to set a control state regarding a subject selectable mode in which a user can select a desired subject,
wherein the setting unit sets the control state from among a plurality of control states including a first state in which a subject selectable mode in which a user can select a desired subject is set but the desired subject is not selected, and a second state in which the subject selectable mode is set and the desired subject is selected, and
wherein the correction control unit acquires information on the control state set by the setting unit, and controls to change the characteristic for calculating the correction amount based on the control state so that the image blur correction effect in the second state is higher than the image blur correction effect in the first state.
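As an editorial illustration of the characteristic change recited in claim 1 (not the patent's own implementation: the one-pole high-pass filter, the cutoff values, and all names are assumptions), lowering the filter cutoff in the second state strengthens the correction effect:

```python
import math

FIRST_STATE, SECOND_STATE = 0, 1   # mode set / mode set and subject selected

class CorrectionController:
    """Sketch of claim 1's idea: the characteristic (here, a high-pass filter
    cutoff) used to compute the blur-correction amount depends on the control
    state. A lower cutoff passes more low-frequency shake into the correction,
    i.e. a stronger effect in SECOND_STATE. Cutoff values are made up."""

    CUTOFF_HZ = {FIRST_STATE: 1.0, SECOND_STATE: 0.1}

    def __init__(self, dt):
        self.dt = dt
        self.state = FIRST_STATE
        self._prev_x = 0.0
        self._hp = 0.0
        self.correction = 0.0        # integrated correction angle

    def set_control_state(self, state):
        self.state = state           # the "setting unit" hands the state over

    def update(self, gyro_rate):
        fc = self.CUTOFF_HZ[self.state]
        alpha = 1.0 / (1.0 + 2.0 * math.pi * fc * self.dt)   # one-pole HPF gain
        self._hp = alpha * (self._hp + gyro_rate - self._prev_x)
        self._prev_x = gyro_rate
        self.correction += self._hp * self.dt                # rate -> angle
        return self.correction

ctrl = CorrectionController(dt=0.001)   # 1 kHz gyro sampling (assumed)
ctrl.set_control_state(SECOND_STATE)    # desired subject selected
angle = ctrl.update(gyro_rate=0.02)     # rad/s from the blur detection unit
```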
2. The control device according to claim 1, further comprising:
a tracking control unit configured to perform tracking control of a desired subject selected in the subject selectable mode based on the position information of the desired subject acquired by the subject detection unit.
3. The control device according to claim 2,
the correction control unit changes the degree of the image blur correction effect based on the subject detection result obtained by the subject detection unit.
4. The control device according to claim 2,
the tracking control unit performs tracking control of the desired object by moving the position of the desired object based on the position information, acquired by the object detection unit, of the desired object selected in the object selectable mode.
5. The control device according to claim 1,
the correction control unit includes a filter for calculating the correction amount, and changes a degree of an image blur correction effect by changing a characteristic of the filter.
6. The control device according to claim 5,
the characteristic of the filter is a cut-off frequency of the filter.
7. The control device according to claim 6,
the correction control unit improves the image blur correction effect by making the cutoff frequency of the filter used in the second state lower than the cutoff frequency of the filter used in the first state.
8. The control device according to claim 2,
the correction control unit acquires the blur detection signal to determine whether panning or tilting is being performed, changes the characteristic for calculating the correction amount according to the determination result, and controls to change the degree of the image blur correction effect by shifting the correction amount closer to the center of the control range when it is determined that panning or tilting is being performed than when it is determined that it is not.
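A minimal sketch of claim 8's behavior, assuming the recentering is a simple per-cycle decay of the correction amount toward the center of its control range, with a stronger (invented) gain while panning or tilting is detected:

```python
def recenter_correction(correction, is_panning, normal_pull=0.02, pan_pull=0.2):
    """Pull the correction amount toward the center (zero) of the control
    range each control cycle; pull harder during a pan/tilt so the corrector
    is not driven to its end of range by the intentional camera motion.
    The pull gains are illustrative, not values from the patent."""
    pull = pan_pull if is_panning else normal_pull
    return correction * (1.0 - pull)
```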
9. The control device according to claim 4,
the tracking control unit performs tracking control of the object by moving the position of the object in the captured image so as to reduce the distance between the position of the object in the captured image and a predetermined position in the captured image, and
the correction control unit controls so that the image blur correction effect in a case where the distance between the position of the object and the predetermined position is equal to or greater than a threshold value is lower than the image blur correction effect in a case where the distance is less than the threshold value.
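Claim 9 pairs tracking toward a predetermined image position with a distance-dependent blur-correction effect; a hedged sketch, with all gains and names invented, might be:

```python
def track_and_weight(subject_pos, target_pos, dist_threshold,
                     track_gain=0.3, near_gain=1.0, far_gain=0.5):
    """Return a tracking step that moves the subject toward the predetermined
    position, plus a blur-correction weight that is lower while the subject
    is still far (>= dist_threshold) from that position."""
    dx = target_pos[0] - subject_pos[0]
    dy = target_pos[1] - subject_pos[1]
    step = (track_gain * dx, track_gain * dy)      # tracking correction per cycle
    distance = (dx * dx + dy * dy) ** 0.5
    blur_weight = far_gain if distance >= dist_threshold else near_gain
    return step, blur_weight
```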
10. The control device according to claim 1, further comprising:
a reliability determination unit configured to compare the reliability of the detected object with a threshold to determine the reliability of the object,
wherein the correction control unit performs control, based on the determination result obtained by the reliability determination unit, so that the image blur correction effect in a case where the reliability is equal to or greater than the threshold is higher than the image blur correction effect in a case where the reliability is less than the threshold.
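Claim 10's reliability gate reduces to a single comparison; a trivial sketch with invented weight values:

```python
def correction_weight_by_reliability(reliability, threshold, high=1.0, low=0.5):
    """Allow the stronger image blur correction effect only when the subject
    detection reliability is at or above the threshold (claim 10's condition);
    the two weights are illustrative placeholders."""
    return high if reliability >= threshold else low
```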
11. The control device according to claim 1, further comprising:
a determination unit configured to determine whether tracking control of the object is possible; and
a warning instruction unit configured to issue a warning if the determination unit determines that the tracking control of the object is not possible.
12. An image pickup apparatus comprising an image pickup element for obtaining an object image and the control device according to any one of claims 1 to 11.
13. A control method comprising the steps of:
a correction control step of acquiring a blur detection signal detected by a blur detection unit to calculate a correction amount of image blur, and controlling an image blur correction unit configured to correct the image blur;
an object detection step of detecting a position of an object in a captured image and acquiring position information of the object in the captured image; and
a setting step of setting a control state regarding a subject selectable mode in which a user can select a desired subject,
wherein the setting step sets the control state from among a plurality of control states including a first state in which a subject selectable mode in which a user can select a desired subject is set but the desired subject is not selected, and a second state in which the subject selectable mode is set and the desired subject is selected, and
wherein the correction control step acquires information on the control state set by the setting step, and controls to change the characteristic for calculating the correction amount based on the control state so that the image blur correction effect in the second state is higher than the image blur correction effect in the first state.
CN202010223939.XA 2015-12-08 2016-12-08 Control device, image pickup apparatus, and control method Active CN111246117B (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
JP2015239747 2015-12-08
JP2015-239747 2015-12-08
JP2015244288 2015-12-15
JP2015-244288 2015-12-15
JP2016-218297 2016-11-08
JP2016218297A JP6833461B2 (en) 2015-12-08 2016-11-08 Control device and control method, imaging device
CN201611140487.9A CN107018288B (en) 2015-12-08 2016-12-08 Control device and image pickup apparatus

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201611140487.9A Division CN107018288B (en) 2015-12-08 2016-12-08 Control device and image pickup apparatus

Publications (2)

Publication Number Publication Date
CN111246117A CN111246117A (en) 2020-06-05
CN111246117B (en) 2021-11-16

Family

ID=59079658

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010223939.XA Active CN111246117B (en) 2015-12-08 2016-12-08 Control device, image pickup apparatus, and control method
CN201611140487.9A Active CN107018288B (en) 2015-12-08 2016-12-08 Control device and image pickup apparatus

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201611140487.9A Active CN107018288B (en) 2015-12-08 2016-12-08 Control device and image pickup apparatus

Country Status (2)

Country Link
JP (1) JP6833461B2 (en)
CN (2) CN111246117B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6912923B2 (en) * 2017-04-14 2021-08-04 キヤノン株式会社 Image pickup device, drive mechanism of image pickup device, and control method thereof
US11232581B2 (en) 2017-08-02 2022-01-25 Sony Group Corporation Information processing apparatus, information processing method, and recording medium
JP2019062340A (en) * 2017-09-26 2019-04-18 キヤノン株式会社 Image shake correction apparatus and control method
WO2020174911A1 (en) * 2019-02-28 2020-09-03 富士フイルム株式会社 Image display device, image display method, and program
CN114616820A (en) 2019-10-29 2022-06-10 富士胶片株式会社 Imaging support device, imaging system, imaging support method, and program
US11265478B2 (en) 2019-12-20 2022-03-01 Canon Kabushiki Kaisha Tracking apparatus and control method thereof, image capturing apparatus, and storage medium
JP7441040B2 (en) * 2019-12-27 2024-02-29 キヤノン株式会社 Control device, imaging device and control method thereof
KR20210096795A (en) * 2020-01-29 2021-08-06 삼성전자주식회사 Method of automatically photographing an image, image processing device and image photographing system performing the same
JP7209305B2 (en) * 2021-01-27 2023-01-20 株式会社モルフォ Control device, control method, program
CN116998160A (en) * 2021-03-19 2023-11-03 富士胶片株式会社 Image pickup support device, image pickup support method, and program
US20230232102A1 (en) * 2022-01-14 2023-07-20 Canon Kabushiki Kaisha Image blur correction apparatus, control method therefor, imaging apparatus, and storage medium
CN115103126A (en) * 2022-07-22 2022-09-23 维沃移动通信有限公司 Shooting preview method and device, electronic equipment and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4079463B2 (en) * 1996-01-26 2008-04-23 ソニー株式会社 Subject detection apparatus and subject detection method
US20050018051A1 (en) * 2003-07-25 2005-01-27 Nikon Corporation Shooting lens having vibration reducing function and camera system for same
JP5188138B2 (en) * 2007-10-15 2013-04-24 キヤノン株式会社 Optical apparatus having image blur correction device
JP2009222899A (en) * 2008-03-14 2009-10-01 Canon Inc Image blur correction apparatus
JP5159515B2 (en) * 2008-08-26 2013-03-06 キヤノン株式会社 Image processing apparatus and control method thereof
JP6074298B2 (en) * 2013-03-18 2017-02-01 キヤノン株式会社 Imaging apparatus, image processing apparatus, and control method thereof
JP6351321B2 (en) * 2013-05-28 2018-07-04 キヤノン株式会社 Optical apparatus, control method therefor, and control program
JP6494202B2 (en) * 2013-08-07 2019-04-03 キヤノン株式会社 Image shake correction apparatus, control method thereof, and imaging apparatus
JP6056774B2 (en) * 2014-01-17 2017-01-11 ソニー株式会社 Imaging apparatus, imaging method, and program.

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1845593A (en) * 2005-04-07 2006-10-11 索尼公司 Imaging apparatus and method for processing imaging results
CN1893558A (en) * 2005-06-30 2007-01-10 奥林巴斯映像株式会社 Electronic jitter correction device
CN101237529A (en) * 2007-01-31 2008-08-06 富士胶片株式会社 Imaging apparatus and imaging method
CN101459762A (en) * 2007-12-13 2009-06-17 佳能株式会社 Image capturing apparatus, control method therefor, and program
JP2010093362A (en) * 2008-10-03 2010-04-22 Nikon Corp Imaging apparatus and optical apparatus
CN102057665A (en) * 2009-05-07 2011-05-11 松下电器产业株式会社 Electron camera, image processing device, and image processing method
CN102422630A (en) * 2009-05-12 2012-04-18 佳能株式会社 Image pickup apparatus
CN102447832A (en) * 2010-08-18 2012-05-09 佳能株式会社 Tracking apparatus and tracking method
CN104052923A (en) * 2013-03-15 2014-09-17 奥林巴斯株式会社 Photographing apparatus, image display apparatus, and display control method of image display apparatus
CN104349051A (en) * 2013-07-24 2015-02-11 佳能株式会社 Subject detection apparatus and control method of same
CN104469139A (en) * 2013-09-24 2015-03-25 佳能株式会社 Image Capturing Apparatus And Image Capturing Method
JP2015130612A (en) * 2014-01-08 2015-07-16 キヤノン株式会社 Imaging apparatus and control method of the same
JP2015035001A (en) * 2014-11-04 2015-02-19 株式会社ニコン Shake correcting device and optical apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fuzzy self-tuning speed control of sensorless induction motors; Yang Zhiping et al.; Journal of Southwest University (Natural Science Edition); 2014-04-20; full text *

Also Published As

Publication number Publication date
CN111246117A (en) 2020-06-05
CN107018288A (en) 2017-08-04
JP6833461B2 (en) 2021-02-24
CN107018288B (en) 2020-04-07
JP2017111430A (en) 2017-06-22

Similar Documents

Publication Publication Date Title
CN111246117B (en) Control device, image pickup apparatus, and control method
US10659691B2 (en) Control device and imaging apparatus
CN106954007B (en) Image pickup apparatus and image pickup method
JP6486087B2 (en) Image blur correction apparatus, imaging apparatus, and control method
US10321058B2 (en) Image pickup apparatus and motion vector detection method
JP4964807B2 (en) Imaging apparatus and imaging method
US8493493B2 (en) Imaging apparatus, imaging apparatus control method, and computer program
US9912867B2 (en) Image processing apparatus, image capturing apparatus and storage medium storing image processing apparatus
KR20100067407A (en) Photographing control method and apparatus according to motion of digital photographing apparatus
JP2016009998A (en) Imaging apparatus, imaging method and program
US10212364B2 (en) Zoom control apparatus, image capturing apparatus and zoom control method
JP6824710B2 (en) Zoom control device and zoom control method, imaging device
CN106067943B (en) Control device, optical apparatus, image pickup apparatus, and control method
JP2017121042A (en) Subject tracking apparatus, control method of the same, control program, and imaging apparatus
JP2011223174A (en) Imaging device and method for controlling the same
JP6613149B2 (en) Image blur correction apparatus and control method therefor, imaging apparatus, program, and storage medium
JP6330283B2 (en) Subject tracking device, imaging device, and subject tracking program
US9781337B2 (en) Image processing device, image processing method, and recording medium for trimming an image based on motion information
JP5449448B2 (en) Imaging apparatus and control method thereof
JP6702736B2 (en) IMAGING CONTROL DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM
JP2018142981A (en) Subject tracking device, imaging device, and subject tracking program
JP2010206275A (en) Electronic camera
JP2017207554A (en) Imaging device and method for controlling the same, program, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant