JP2012255896A - Imaging apparatus, focus adjustment method therefor and program


Info

Publication number: JP2012255896A
Application number: JP2011128551A
Authority: JP (Japan)
Prior art keywords: focus, evaluation value, step, imaging, degree
Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Japanese (ja)
Other versions: JP5762156B2 (en), JP2012255896A5 (en)
Inventor: Kenji Kimoto (賢志 木本)
Original assignee: Canon Inc (キヤノン株式会社)
Application filed by Canon Inc; priority to JP2011128551A
Publication of JP2012255896A and JP2012255896A5; application granted; publication of JP5762156B2
Current legal status: Active


Abstract

PROBLEM TO BE SOLVED: To perform focusing in a good state even when the proportion of noise contained in an imaging signal increases.
SOLUTION: The imaging apparatus includes: focus evaluation value calculation means for extracting a specific frequency component from the imaging signal and calculating a focus evaluation value indicating the contrast of the imaging signal; focus degree calculation means for calculating, using the focus evaluation value, a focus degree indicating the degree of focus; shape prediction means for predicting the shape of the peak of the focus evaluation value using the amount of change of the focus evaluation value in each fixed period; and focus speed setting means for setting the driving speed of a focus lens based on the focus degree calculated by the focus degree calculation means and the result from the shape prediction means.

Description

  The present invention relates to an imaging apparatus, a focus adjustment method therefor, and a program. In particular, it is suitable for use in an imaging apparatus that performs autofocus.

  As an autofocus method for an imaging apparatus such as a digital camera, a method of focusing on a subject by moving a focus lens based on a luminance signal obtained from an image sensor such as a CCD is widely used. Specifically, a focus evaluation value indicating the contrast in a distance measurement area set in the imaging screen is calculated by integrating the high-frequency components of the signal in that area. The imaging apparatus calculates the focus evaluation value while moving the focus lens in the direction in which the focus evaluation value increases, and detects the position of the focus lens at which the focus evaluation value is highest (that is, the in-focus point). This operation is called a "hill-climbing operation".
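For illustration only (the patent contains no code), the contrast-based focus evaluation value described above can be sketched as a high-frequency extraction followed by integration over the distance measurement area. The second-difference kernel and the ROI layout are our assumptions, standing in for the camera's actual AF filter:

```python
import numpy as np

def focus_evaluation_value(luma, roi):
    """Contrast AF metric: integrate absolute high-frequency responses
    inside the distance measurement (ranging) area.

    luma: 2-D luminance array; roi: (top, left, height, width).
    """
    top, left, h, w = roi
    region = luma[top:top + h, left:left + w].astype(float)
    # Horizontal second-difference high-pass, a stand-in for the AF filter.
    hp = region[:, :-2] - 2.0 * region[:, 1:-1] + region[:, 2:]
    return float(np.abs(hp).sum())
```

A sharply focused edge yields a much larger value than a smoothly blurred one, which is exactly the property the hill-climbing operation exploits.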

  Then, in the vicinity of the in-focus point, the imaging apparatus acquires the focus evaluation value while moving the focus lens back and forth, and confirms that the position of the focus lens is at the maximum of the focus evaluation value (that is, at the apex of the focus evaluation value curve). This operation is called a "wobbling operation". When it detects that the position of the focus lens is not at the position where the focus evaluation value is maximized, the imaging apparatus moves the focus lens to that position. In this way, the imaging apparatus controls the focus lens so as to keep it in focus.

  A technique is also known in which the imaging apparatus calculates the focus degree (that is, the degree of focus) at the position of the focus lens based on the focus evaluation value during the hill-climbing and wobbling operations, and switches the driving conditions of the focus lens based on the calculated focus degree. For example, the lower the focus degree, the higher the moving speed of the focus lens in the hill-climbing operation and the larger the moving amount of the focus lens in the wobbling operation. Conversely, the higher the focus degree, the lower the moving speed in the hill-climbing operation and the smaller the moving amount in the wobbling operation. With such a method, the apparatus can quickly return from a low-focus state to a high-focus state, and can suppress overshoot of the focus lens past the in-focus point in its vicinity.
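The switching of driving conditions by focus degree can be sketched as a simple mapping. The thresholds, speeds, and step sizes below are illustrative assumptions, not values from the patent:

```python
def drive_parameters(focus_degree):
    """Select hill-climbing speed and wobbling amplitude from the focus
    degree (0 = badly defocused, 1 = in focus). Thresholds and units are
    illustrative assumptions, not values from the patent."""
    if focus_degree < 0.3:
        # Low focus degree: move fast and wobble wide to recover quickly.
        return {"climb_speed": 8, "wobble_step": 4}
    if focus_degree < 0.7:
        return {"climb_speed": 4, "wobble_step": 2}
    # High focus degree: fine steps to avoid overshooting the in-focus point.
    return {"climb_speed": 1, "wobble_step": 1}
```

The monotone mapping captures the stated rule: lower focus degree means faster climbing and wider wobbling; higher focus degree means slower, finer movement.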

  However, in such an autofocus method, the focus degree needs to show an appropriate change tendency corresponding to the change in focus. For this reason, when the noise included in the luminance signal increases, the focus degree may not show the expected change tendency. The expected speed control then cannot be realized, and the autofocus operation may become unstable in the vicinity of the in-focus point.

  For example, Patent Document 1 describes, as a method of calculating the focus degree, a configuration that includes means for extracting different frequency components at the same point in an imaging signal, and that divides the high-frequency signal component by a lower-frequency signal component, thereby normalizing the high-frequency component by the low-frequency component. According to Patent Document 1, using the normalized output makes it possible to perform focus adjustment accurately and without malfunction regardless of the state of the subject, such as low contrast.
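A minimal sketch of this normalization, assuming only the division described above (the epsilon guard is our addition, not part of the Patent Document 1 method):

```python
def normalized_focus_degree(eval_h, eval_l, eps=1e-9):
    """Focus degree in the style of Patent Document 1: the high-frequency
    evaluation value divided by a lower-frequency one, so that subject
    contrast largely cancels out. eps avoids division by zero."""
    return eval_h / (eval_l + eps)
```

Because both evaluation values scale with subject contrast, doubling the contrast leaves the normalized degree essentially unchanged, which is the point of the normalization.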

Patent Document 1: JP 05-145827 A
Patent Document 2: JP 06-086143 A

  However, in the method described in Patent Document 1, a high-frequency signal component is divided by a lower-frequency signal component, and the extracted high-frequency component tends to be easily affected by the noise component included in the imaging signal. For example, when the influence of the noise component is stronger than that of the original signal component, as in a low-illuminance scene, the high-frequency signal component generated by the noise may be output at a high value regardless of the actual focus position. In such a scene, therefore, there is a problem that the normalized focus degree does not show a correct change tendency.

  Further, Patent Document 2, for example, describes a method of obtaining first- and second-order differential amounts from a focus evaluation value obtained by extracting a specific frequency component from an imaging signal, and controlling the speed of the focus lens based on those differential amounts. In the conventional method of speed control using the amount of change in the focus evaluation value within a fixed time, the maximum focus evaluation value extracted differs between a low-contrast subject and a high-contrast subject even at the same in-focus point. Therefore, in the case of a low-contrast subject, the lens cannot be decelerated in the vicinity of the in-focus point and may overshoot it more than necessary. In the method of Patent Document 2, by contrast, the speed of the focus lens is controlled based on the change tendency of the first- and second-order differential amounts, so it is not affected by the focus evaluation value at the in-focus point, and the focus lens can be controlled to move quickly when far out of focus and slowly in the vicinity of the in-focus point.
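The first- and second-order differential amounts in question are simply finite differences of the evaluation value sequence. A sketch under that assumption (the fixed sampling period is folded into the units):

```python
def finite_differences(values):
    """First- and second-order differences of a focus evaluation value
    sequence sampled at a fixed period. Near the peak, the first
    difference changes sign and the second difference is negative."""
    d1 = [b - a for a, b in zip(values, values[1:])]
    d2 = [b - a for a, b in zip(d1, d1[1:])]
    return d1, d2
```

For a rising-then-falling sequence such as one straddling the peak, the first difference goes from positive to negative while the second difference stays negative, which is the change tendency Patent Document 2 uses for speed control.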

  However, in the method described in Patent Document 2, while it can clearly be determined whether or not the lens is in the vicinity of the in-focus point, the first- and second-order differential amounts of the focus evaluation value must be calculated, which requires acquiring the focus evaluation value multiple times. In addition, unlike the focus degree described in Patent Document 1, it is difficult to determine from a single acquisition of the focus evaluation value how close to focus the lens is, so there is a problem that the driving speed is difficult to adjust finely in the vicinity of the in-focus point.

  The present invention has been made in view of the above circumstances, and an object thereof is to enable good focusing even when the proportion of noise included in an imaging signal increases.

  The present invention is an imaging apparatus that controls the position of a focus lens based on an imaging signal acquired by an imaging unit imaging a subject, the apparatus comprising: focus evaluation value calculation means for extracting a specific frequency component from the imaging signal and calculating a focus evaluation value indicating the contrast of the imaging signal; focus degree calculation means for calculating, using the focus evaluation value, a focus degree indicating the degree of focus; shape prediction means for predicting the shape of the peak of the focus evaluation value using the amount of change of the focus evaluation value in each fixed period; and focus speed setting means for setting the driving speed of the focus lens based on the focus degree calculated by the focus degree calculation means and the result from the shape prediction means.

  According to the present invention, a stable focus adjustment operation can be performed even when the proportion of noise included in the imaging signal is large, and good focusing can be achieved.

FIG. 1 is a block diagram showing the configuration of an imaging apparatus according to a first embodiment.
FIG. 2 is a flowchart showing a focus adjustment operation according to the embodiment.
FIG. 3 is a flowchart showing a wobbling operation according to the embodiment.
FIG. 4 is a flowchart showing a hill-climbing operation according to the embodiment.
FIG. 5 is a flowchart showing a focus speed setting process according to the embodiment.
FIG. 6 is a flowchart showing an exposure condition determination process according to the embodiment.
FIG. 7 is a flowchart showing a shape prediction process according to the embodiment.
FIG. 8 is a flowchart showing a first-order differential calculation process according to the embodiment.
FIG. 9 is a flowchart showing a second-order differential calculation process according to the embodiment.
FIG. 10 is a flowchart showing a focus degree calculation process according to the embodiment.
FIG. 11 is a flowchart showing an area identification process according to the embodiment.
FIG. 12 is a flowchart showing a focus speed selection process according to the embodiment.
FIG. 13 is a diagram for explaining the relationship among the focus evaluation value, differential output, focus degree, and focus speed at normal illuminance according to the embodiment.
FIG. 14 is a flowchart showing a focus speed selection process according to the embodiment.
FIG. 15 is a diagram for explaining the relationship among the focus evaluation value, differential output, focus degree, and focus speed at low illuminance according to the embodiment.
FIG. 16 is a diagram comparing the relationship between brightness and speed control according to the embodiment.
FIG. 17 is a flowchart showing an exposure condition determination process according to a second embodiment.
FIG. 18 is a flowchart showing a focus adjustment operation according to a third embodiment.

The best mode for carrying out the present invention is as follows.
(First embodiment)
Hereinafter, the present embodiment will be described in detail with reference to the drawings.
This embodiment shows an example in which the present invention is applied to an AF operation performed continuously during live-view (through-image) display.

<Overall configuration of imaging device>
First, the overall configuration of the imaging apparatus 1 according to the present embodiment will be described. FIG. 1 is a block diagram schematically showing the configuration of the imaging apparatus 1 according to the present embodiment.
The system control unit 115 illustrated in FIG. 1 includes, for example, a CPU, a RAM, and a ROM. The system control unit 115 controls the operation of the entire imaging apparatus 1, using the RAM as a work area, according to a program stored in advance in the ROM. Each process described later is mainly executed by the system control unit 115 as a computer program (software).
The system control unit 115 also specifies the in-focus position based on the focus evaluation value calculated by the AF processing unit 105, moves the focus lens by controlling the focus lens control unit 104, and thereby performs automatic focus adjustment (AF) processing. The focus evaluation value is a numerical value serving as an index of the contrast in the distance measurement area.

  As the imaging lens 101, a conventional general imaging lens having a zoom function can be used. The aperture/shutter control unit 102 controls the driving of an aperture and a shutter that control the light amount. The focus lens control unit 104 controls the driving of the focus lens so as to focus light on the image sensor 108. The aperture/shutter control unit 102 and the focus lens control unit 104 include optical elements such as lenses, mechanisms such as the aperture and shutter, and the various devices (none of which are shown) needed to drive them, such as actuators for driving the optical elements and mechanisms, circuits for controlling the actuators, and D/A converters.

The light emitting device (strobe) 106 adjusts the subject brightness by emitting light toward the outside. Upon receiving a "flash on" signal from the system control unit 115, the EF processing unit 107 causes the light emitting device 106 to emit light. When the system control unit 115 determines that the light emitting device 106 needs to emit light, it sends the "flash on" signal to the EF processing unit 107.
The image sensor 108 is a light receiving or photoelectric conversion means that converts incident light into an electric signal. For example, the image sensor 108 is composed of a photoelectric conversion element such as a CCD or CMOS imager, and photoelectrically converts incident light to generate and output an imaging signal (image signal). The imaging processing unit 109 includes a CDS circuit, a non-linear amplification circuit, and an A/D converter. The CDS circuit removes output noise of the image sensor 108 by correlated double sampling. The non-linear amplification circuit performs signal amplification (gain control) on the imaging signal from which noise has been removed by the CDS circuit. The A/D converter converts the imaging signal, which is an analog signal, into a digital signal. The image sensor 108 and the imaging processing unit 109 function as an imaging unit that acquires an imaging signal by imaging a subject.

  The image processing unit 110 performs predetermined image processing such as gamma correction and contour correction of an imaging signal (that is, image data). Further, the image processing unit 110 performs white balance processing of the imaging signal based on the control of the WB processing unit 111. The format conversion unit 112 converts the supplied imaging signal into a format suitable for recording on a recording medium in the image recording unit 114 and display on the operation display unit 117. The DRAM 113 is a high-speed built-in memory (for example, a random access memory). The DRAM 113 is used as a high-speed buffer as a storage unit that can temporarily store an imaging signal. The DRAM 113 is used as a working memory for compressing and decompressing the image signal. The image recording unit 114 includes a recording medium such as a memory card and its interface, and records an imaging signal.

  The AE processing unit 103 calculates a photometric value corresponding to the brightness of the subject based on the imaging signal acquired from the imaging unit (the image sensor 108 and the imaging processing unit 109). In other words, the AE processing unit 103 and the imaging processing unit 109 function as an exposure condition detection unit that detects the exposure conditions during subject imaging. The AE processing unit 103 also determines a signal amplification amount (gain amount) for amplifying the imaging signal so as to maintain proper exposure when the subject luminance is low, that is, for correcting the imaging signal to a proper exposure. The system control unit 115 controls the aperture/shutter control unit 102 and the non-linear amplification circuit of the imaging processing unit 109 based on the photometric value calculated by the AE processing unit 103, thereby automatically adjusting the exposure amount. In other words, the system control unit 115 performs automatic exposure (AE) processing using the exposure conditions detected by the exposure condition detection unit.

The AF processing unit 105 calculates focus evaluation values by extracting a plurality of frequency components from the imaging signal acquired from the imaging unit (the image sensor 108 and the imaging processing unit 109). That is, the AF processing unit 105 functions as focus evaluation value calculation means. In this embodiment, the focus evaluation value calculated by extracting a high-frequency component is referred to as the focus evaluation value H, and the focus evaluation value calculated by extracting a low-frequency component is referred to as the focus evaluation value L.
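The two evaluation values H and L can be sketched as the same integration applied to two frequency bands of one scan line. Both kernels below are illustrative stand-ins for the AF processing unit's actual filters:

```python
import numpy as np

def eval_h_and_l(line):
    """Focus evaluation values H and L from one scan line: H uses a
    narrow high-band kernel, L a wider lower-band kernel. The kernels
    are assumptions, not taken from the patent."""
    x = np.asarray(line, dtype=float)
    h_val = np.abs(np.convolve(x, [-1.0, 2.0, -1.0], mode="valid")).sum()
    l_val = np.abs(np.convolve(x, [-1.0, 0.0, 0.0, 2.0, 0.0, 0.0, -1.0],
                               mode="valid")).sum()
    return float(h_val), float(l_val)
```

A sharp edge carries relatively more energy in the high band than a slowly varying signal does, so the ratio H/L is higher for the edge; this difference in band sensitivity is what makes H and L useful as separate signals.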
A VRAM (image display memory) 116 records the imaging signal and the like. The operation display unit 117 displays images, operation-assistance indications, and the camera state, and can display the shooting screen during shooting. The main switch (main SW) 118 turns the power of the imaging apparatus 1 on and off. The first switch (SW1) 119 is a switch for performing shooting standby (shooting preparation) operations such as AF and AE. The second switch (SW2) 120 is a switch for performing shooting after the first switch 119 has been operated.

<Basic operation>
Next, the basic flow of the focus adjustment operation of this embodiment will be described.
The focus adjustment operation of the present embodiment is continuously performed during moving image recording and standby.
FIG. 2 is a flowchart showing a rough flow of the focus adjustment operation. The focus adjustment operation shown in FIG. 2 is stored as a computer program (software) in the ROM of the system control unit 115 or the like. After the main switch 118 is operated to turn on the power and the imaging apparatus 1 is activated, the operation is executed mainly by the system control unit 115 and the focus lens control unit 104.

The system control unit 115 of the imaging apparatus 1 periodically and continuously acquires focus evaluation values after activation.
First, in step S200, the focus lens control unit 104 performs the wobbling operation. Details of the wobbling operation will be described later. In step S201, the system control unit 115 determines from the result of step S200 whether or not the subject is in focus. If it is determined that the subject is in focus ("Yes"), the process proceeds to step S205. If it is determined that the subject is not in focus ("No"), the process proceeds to step S202.

  In step S202, the system control unit 115 determines the in-focus direction. That is, the system control unit 115 determines whether the in-focus point lies in front of or behind the current position of the focus lens. If the in-focus direction cannot be determined in step S202 ("No"), the process returns to step S200 and wobbling continues. If the in-focus direction can be determined ("Yes"), the process proceeds to step S203.

In step S203, the focus lens control unit 104 performs the hill-climbing operation, in which the focus lens is moved, at a higher speed than in wobbling, in the direction in which the focus evaluation value increases. Details of the hill-climbing operation will be described later.
In step S204, the system control unit 115 determines whether the hill-climbing operation in step S203 has passed the vertex of the focus evaluation value, that is, whether the in-focus point has been detected. If it is not determined that the vertex has been passed ("No"), the process returns to step S203 and the hill-climbing operation continues. If it is determined that the vertex has been passed ("Yes"), the process proceeds to step S200, and the focus lens control unit 104 converges on the in-focus position by the wobbling operation.

If it is determined in step S201 that the subject is in focus ("Yes"), the process proceeds to step S205, in which the system control unit 115 stores the focus evaluation value in the in-focus state in the DRAM 113. The stored focus evaluation value is used for the restart determination of the focus adjustment operation in steps S206 and S207.
In step S206, the system control unit 115 performs the restart determination by comparing the focus evaluation value in the in-focus state stored in step S205 with the latest focus evaluation value. In step S207, the system control unit 115 determines, using the result of step S206, whether the restart condition is satisfied. Specifically, the restart condition for the focus adjustment operation is determined to be satisfied when the difference between the stored in-focus evaluation value and the latest focus evaluation value is at a predetermined level or higher. If it is determined in step S207 that the restart condition is satisfied ("Yes"), the process returns to step S200 and the focus adjustment operation restarts. Otherwise ("No"), the process returns to step S206.
Thereafter, the system control unit 115 repeats the restart determination based on the periodically acquired focus evaluation values; that is, it continues to monitor changes in the focus evaluation value.
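The restart determination of steps S206 and S207 reduces to a threshold comparison. A minimal sketch, assuming a relative threshold (the 20 % figure is our assumption; the patent only says "a predetermined level"):

```python
def restart_needed(stored_value, latest_value, level=0.2):
    """Restart determination (steps S206-S207): restart the focus
    adjustment operation when the latest focus evaluation value differs
    from the value stored at the in-focus point by a predetermined level
    or more. The 20 % relative level is an assumed figure."""
    if stored_value <= 0:
        return False
    return abs(latest_value - stored_value) / stored_value >= level
```

Both drops (the subject blurring) and rises (a new, higher-contrast subject) trigger a restart, since either indicates the scene has changed since focus was achieved.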

  As described above, in the AF operation, the system control unit 115 of the imaging apparatus 1 continuously performs the wobbling operation, the hill-climbing operation, and the restart determination, and thereby controls the focus lens so as to maintain the in-focus state.

<Wobbling operation>
Next, the wobbling operation will be described. The wobbling operation is a low-speed AF operation for identifying the in-focus position and the direction in which it lies. In the wobbling operation, the focus lens is moved intermittently by a predetermined amount, and the in-focus point and the direction in which it exists are identified from the increase or decrease of the acquired focus evaluation value. The wobbling operation corresponds to an example of processing by a first focus adjustment unit. FIG. 3 is a flowchart showing the wobbling operation.

In step S300, the system control unit 115 acquires a focus evaluation value from the AF processing unit 105.
In step S301, the system control unit 115 determines the shape of the mountain (the shape of the curve on the graph) formed by the focus evaluation value acquired in step S300 and the focus evaluation values acquired before it. Specifically, the focus evaluation value acquired this time is compared with the one acquired last time, and it is determined whether there is a predetermined level of increase or decrease. The focus evaluation value used for the shape determination in step S301 is the sum of the focus evaluation value H and the focus evaluation value L (hereinafter referred to as the focus evaluation value in the wobbling operation). The system control unit 115 determines that the inversion condition is satisfied when the focus evaluation value decreases with respect to the current traveling direction of the focus lens, and stores the positions where the inversion condition was satisfied, up to a predetermined number of times. The stored positions are used for the determination in step S302 described later.

  On the other hand, when the focus evaluation value increases continuously, the system control unit 115 counts the number of such increases (a counter value). When the increase in the focus evaluation value continues for a predetermined number of times, the system control unit 115 determines that the peak of the focus evaluation value lies in the current traveling direction; that is, the traveling direction of the focus lens is identified as the current traveling direction. In this way, in step S301, the system control unit 115 determines the shape of the mountain (the shape of the curve on the graph) formed by the focus evaluation values in the wobbling operation.

In step S302, the system control unit 115 determines whether the condition for reversing the traveling direction of the focus lens in the wobbling operation is satisfied. If the inversion condition is satisfied ("Yes"), the process proceeds to step S303, in which the system control unit 115 controls the focus lens control unit 104 to move the focus lens a predetermined distance in the direction opposite to the immediately preceding traveling direction.
If it is determined in step S302 that the inversion condition is not satisfied ("No"), the process proceeds to step S304, in which the system control unit 115 controls the focus lens control unit 104 to move the focus lens a predetermined distance in the same direction as the immediately preceding traveling direction.

In step S305, the system control unit 115 determines whether the focus lens has moved in the same direction a predetermined number of times in succession, using the counter value measured in step S301. If so ("Yes"), the process proceeds to step S307, in which the system control unit 115 determines that the peak value of the focus evaluation value lies in the current traveling direction of the focus lens; that is, the direction of the peak value is identified.
If it is not determined in step S305 that the moving direction has been the same for the predetermined number of times ("No"), the process proceeds to step S306.

In step S306, the system control unit 115 determines whether the focus lens has remained in the same area over a predetermined number of the determinations in step S302, using the position information stored at the time of direction reversal in step S301. If it is not determined that the lens has remained in the same area for the predetermined number of times ("No"), the wobbling operation ends. If it is so determined ("Yes"), the process proceeds to step S308.
In step S308, the system control unit 115 determines that the focus lens is at the in-focus position. This processing handles the case where, near the top of the peak of the focus evaluation value, rises and falls of the focus evaluation value appear at closely spaced timings.
The system control unit 115 performs the wobbling operation as described above.
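The core of one wobbling iteration can be sketched as follows. This is a simplification of the flowchart above: it models only the inversion condition of step S302, omitting the counters of steps S305 and S306 (same-direction count and same-area count):

```python
def wobble_step(prev_value, new_value, direction, step):
    """One iteration of the wobbling loop: keep the current direction
    while the focus evaluation value (H + L) rises; reverse it when the
    value falls (the inversion condition of step S302). Returns the new
    direction and the lens displacement to apply."""
    if new_value < prev_value:
        direction = -direction
    return direction, direction * step
```

Repeating this step makes the lens oscillate across the peak of the evaluation value curve, which is exactly the behavior step S306 then detects as "in focus".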

<Hill-climbing operation>
Next, the hill-climbing operation will be described. The hill-climbing operation is an AF operation, faster than the wobbling operation, for identifying the in-focus position. In the hill-climbing operation, the focus lens is moved continuously at a predetermined speed, and the in-focus position is identified from the increase or decrease of the acquired focus evaluation value. The hill-climbing operation corresponds to an example of processing by a second focus adjustment unit. FIG. 4 is a flowchart showing the hill-climbing operation.

In step S400, the system control unit 115 acquires the lens position corresponding to the focus evaluation value from the AF processing unit 105.
Next, in step S401, the system control unit 115 executes the focus speed setting process. Although details will be described later, the driving speed of the focus lens in the hill-climbing operation is determined and set based on the acquired focus evaluation value and the corresponding lens position.

  In step S402, the system control unit 115 determines the shape of the mountain (the shape of the curve on the graph) formed by the focus evaluation value acquired in step S400 and the focus evaluation values acquired in the past. The focus evaluation value used for the shape determination in the hill-climbing operation in step S402 is the focus evaluation value L, calculated from a lower frequency component (hereinafter referred to as the focus evaluation value in the hill-climbing operation). Specifically, when the acquired focus evaluation value shows a rising tendency continuously for a predetermined number of times and then shows a falling tendency, the system control unit 115 determines that the peak position (vertex) of the mountain has been detected. On the other hand, when the acquired focus evaluation value shows a falling tendency continuously for a predetermined number of times along the traveling direction of the focus lens, the system control unit 115 determines that the lens is moving away from the peak position of the mountain, and in that case determines that the condition for reversing the traveling direction of the focus lens is satisfied. If the number of focus evaluation values necessary for determining the shape of the mountain has not yet been acquired, or if it cannot be clearly determined whether the focus evaluation value is rising or falling, it is determined that the current hill-climbing operation is to be continued (that is, the current state is maintained).

  In step S403, the system control unit 115 determines, based on the result of step S402, whether the condition for reversing the traveling direction of the focus lens is satisfied. If it is satisfied ("Yes"), the process proceeds to step S407, in which the system control unit 115 controls the focus lens control unit 104 to set the traveling direction of the focus lens opposite to the immediately preceding traveling direction and to move the focus lens in that direction.

  On the other hand, when it is determined in step S403 that the reversal condition is not satisfied ("No"), the process proceeds to step S404. In step S404, the system control unit 115 determines whether or not the in-focus point has been detected based on the result of step S402. If the in-focus point has been detected, the process proceeds to step S406. In step S406, the system control unit 115 controls the focus lens control unit 104 based on the determination in step S402 to stop the focus lens.

On the other hand, if it is determined in step S404 that the in-focus point has not been detected, the process proceeds to step S405. In step S405, the system control unit 115 controls the focus lens control unit 104, and the focus lens control unit 104 sets the traveling direction of the focus lens to the same direction as before. In both step S405 and step S407, the system control unit 115 drives the focus lens at the driving speed set in step S401. On the first iteration, the determination in step S402 finds that the in-focus point cannot yet be detected, and the reversal condition in step S403 is not satisfied. Accordingly, in step S405, the focus lens, which has not yet started moving (= stopped), is moved in the traveling direction detected by the wobbling operation. From the second iteration onward, the traveling direction of the moving focus lens is set and the driving speed is changed based on the conditions determined in step S402 and step S403.
As described above, the system control unit 115 performs a hill climbing operation.

<Focus speed setting process>
Next, the focus speed setting process will be described. The focus speed setting process is a series of processes for determining the driving speed of the focus lens so that it can be changed in accordance with the focus state during the above-described hill-climbing operation. FIG. 5 is a flowchart showing the focus speed setting process.

Step S500 is an exposure condition determination that determines whether the subject being photographed is at normal illuminance or low illuminance; the determination result is used in the subsequent processing. Details will be described later.
Next, in step S501, the system control unit 115 stores the lens position corresponding to the focus evaluation value. In step S502, based on the mountain shape (the shape of the curve of the graph) formed by the focus evaluation values and the lens positions, the system control unit 115 divides the range around the peak position (in-focus point) of the mountain into a plurality of regions and determines the region to which the current focus lens position belongs. Details will be described later.

  In step S503, the system control unit 115 calculates the degree of focus, obtained by normalizing the focus evaluation value, in order to determine how close the lens is to focus. Details will be described later. In step S504, the system control unit 115 reads the determination result of the exposure condition determined in step S500. Subsequently, in step S505, the system control unit 115 determines whether the scene is a low illuminance scene based on the determination result read in step S504. If it is determined that the scene is a low illuminance scene ("Yes"), the process proceeds to step S507. In step S507, the system control unit 115 executes the focus speed selection process (low illuminance).

  On the other hand, if it is determined in step S505 that the scene is not a low illuminance scene ("No"), the process proceeds to step S506. In step S506, the system control unit 115 executes the focus speed selection process (normal illuminance). Steps S506 and S507 are switched according to the brightness of the subject, but both are processes that determine the driving speed of the focus lens for the hill-climbing operation using the results of the exposure condition determination, the shape prediction process, and the focus degree calculation process described above. Details will be described later.

After the drive speed is determined in steps S506 and S507, the process proceeds to step S508, and the system control unit 115 stores the determined drive speed. The focus lens driving speed stored here is actually reflected in the above-described hill-climbing process.
As described above, the system control unit 115 performs the focus speed setting process.

<Exposure condition judgment processing>
Next, the exposure condition determination process will be described. FIG. 6 is a flowchart showing the exposure condition determination process.
In step S600, the system control unit 115 acquires the currently set gain amount. This process acquires, from among the current exposure conditions, the signal amplification amount (hereinafter referred to as the gain amount) of the nonlinear amplification circuit in the imaging processing unit 109. The gain amount is controlled by the automatic exposure (AE) process performed by the AE processing unit 103 and the system control unit 115. For example, when the imaging signal can maintain a desired signal level, as in a bright scene, the gain amount is small; when the desired signal level cannot be maintained, as in a dark scene, the gain amount tends to increase.

Next, in step S601, the system control unit 115 determines whether the scene is a low illuminance scene. Specifically, the system control unit 115 compares a gain amount reference value (low illuminance gain threshold) with the gain amount acquired in step S600 and determines whether or not the current gain amount is greater than the low illuminance gain threshold. When the current gain amount is larger than the low illuminance gain threshold ("Yes"), the process proceeds to step S602, and the system control unit 115 stores the exposure condition determination result in the DRAM 113 as "low illuminance". On the other hand, if the current gain amount is not larger than the low illuminance gain threshold in step S601 ("No"), the process proceeds to step S603, and the system control unit 115 stores the exposure condition determination result in the DRAM 113 as "normal illuminance".
As described above, the system control unit 115 performs the exposure condition determination process.
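As a minimal sketch, the gain-based judgment of steps S600 to S603 amounts to a single threshold comparison. The threshold value below is a hypothetical placeholder, not a value from the patent.

```python
LOW_LIGHT_GAIN_THRESHOLD = 18.0  # dB; hypothetical placeholder value

def judge_exposure(gain_db):
    """Return 'low illuminance' when the current AE gain amount exceeds the
    low illuminance gain threshold, else 'normal illuminance'."""
    if gain_db > LOW_LIGHT_GAIN_THRESHOLD:
        return "low illuminance"
    return "normal illuminance"
```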

<Shape prediction process>
Next, the shape prediction process will be described. FIG. 7 is a flowchart showing the shape prediction process. In the shape prediction process, the peak shape of the focus evaluation value is predicted using the amount of change in the focus evaluation value for each fixed period, and the position of the current lens position relative to the peak is determined.
First, in step S700, using the focus evaluation values stored in advance in step S400 of FIG. 4 and the corresponding lens positions, the system control unit 115 obtains the slope of the mountain formed by the focus evaluation values and the lens positions, that is, the first-order differential output. The system control unit 115 stores the obtained slope in the DRAM 113. Details of the processing will be described later.
In step S701, the system control unit 115 obtains the second-order differential output of the focus evaluation value. Specifically, the system control unit 115 further calculates the change in the slope of the mountain based on the first-order differential output obtained in step S700. Details of the processing will be described later.

Next, in step S702, the system control unit 115 performs a region specifying process that determines how close the current lens position is to the in-focus point. In this process, the system control unit 115 predicts the relationship between the current lens position and the in-focus point based on the first- and second-order differential outputs obtained in steps S700 and S701. Although details of the processing will be described later, the relationship between the first- and second-order differential outputs and the focus evaluation value is as shown in the lower part of FIG. 13. That is, the first-order differential output is 0 at the in-focus position (= the position where the focus evaluation value is maximum), while the second-order differential output is 0 at the position where the first-order differential output is maximum, that is, where the slope of the mountain formed by the focus evaluation values is steepest (= in the vicinity of the in-focus point). Therefore, by monitoring changes in the first- and second-order differential outputs, the relationship between the in-focus point and the current lens position can be roughly determined.
As described above, the system control unit 115 performs the shape prediction process.

<First-order differential calculation process>
Next, the first-order differential calculation process will be described. FIG. 8 is a flowchart showing the first-order differential calculation process. Note that the focus evaluation value used in the first-order differential calculation process is the focus evaluation value L calculated from a lower frequency component, that is, the focus evaluation value for the hill-climbing operation.

First, in step S800, the system control unit 115 acquires the latest focus evaluation value (AF1) and the lens position (P1) corresponding to that focus evaluation value. The focus evaluation values and the corresponding lens positions are stored in advance in the DRAM 113 in step S400 of FIG. 4, and as many of the most recent entries as are necessary for calculating the first derivative are retained. The system control unit 115 reads the corresponding information from the DRAM 113.

Next, in step S801, the system control unit 115 acquires, in the same manner as step S800, the focus evaluation value (AF2) from a predetermined time earlier and the lens position (P2) corresponding to it. In step S802, the system control unit 115 divides the difference between the two focus evaluation values acquired in steps S800 and S801 by the difference between the corresponding lens positions. In step S803, the system control unit 115 stores the first-order differential output D1 obtained from this calculation in the DRAM 113.
As described above, the system control unit 115 performs the first-order differential calculation process.

<Second-order differential calculation process>
Next, the second-order differential calculation process will be described. FIG. 9 is a flowchart showing the second-order differential calculation process. First, in step S900, the system control unit 115 acquires the latest first-order differential output (D1_1), stored in advance in the DRAM 113 in step S803 of FIG. 8, and the lens position (P1) corresponding to it.
Similarly, in step S901, the system control unit 115 acquires the first-order differential output (D1_2) from a predetermined time earlier and the lens position (P2) corresponding to it. In step S902, the system control unit 115 divides the difference between the first-order differential outputs acquired in steps S900 and S901 by the difference between the corresponding lens positions. In step S903, the system control unit 115 stores the second-order differential output D2 obtained from this calculation in the DRAM 113.
As described above, the system control unit 115 performs the second derivative calculation process.
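Both calculations reduce to a quotient of differences over lens position; the following is a minimal sketch (function and variable names are illustrative, not from the patent):

```python
def first_derivative(af1, p1, af2, p2):
    """Steps S800-S803: slope of the focus-evaluation curve, i.e. the
    difference of two focus evaluation values divided by the difference of
    the corresponding lens positions."""
    return (af1 - af2) / (p1 - p2)

def second_derivative(d1_1, p1, d1_2, p2):
    """Steps S900-S903: change of that slope between two lens positions,
    computed from two stored first-order differential outputs."""
    return (d1_1 - d1_2) / (p1 - p2)
```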

<Focus degree calculation process>
Next, the focus degree calculation process will be described. FIG. 10 is a flowchart showing the focus degree calculation process. First, in step S1000, the system control unit 115 reads from the DRAM 113 the latest focus evaluation value H, calculated by extracting a high frequency component. Similarly, in step S1001, the system control unit 115 reads from the DRAM 113 the latest focus evaluation value L, calculated by extracting a low frequency component.

Next, in step S1002, the system control unit 115 normalizes the focus evaluation value by dividing the focus evaluation value H read in step S1000 by the focus evaluation value L read in step S1001. In the present embodiment, the value obtained in step S1002 is referred to as the degree of focus. In the example shown in FIG. 13, the normalized output obtained in step S1002 is approximately 1 near the in-focus point, and the value tends to decrease as the lens deviates from it. Therefore, the system control unit 115 can determine whether or not the lens is near the in-focus position by examining the change tendency of this signal.
Next, in step S1003, the system control unit 115 stores the in-focus degree obtained in step S1002 in the DRAM 113. The degree of focus stored here is used in a focus speed setting process described later.
As described above, the system control unit 115 performs the focus degree calculation process.
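The normalization in step S1002 can be sketched as below; the zero-division guard is an added safeguard, not part of the patent's description.

```python
def focus_degree(eval_h, eval_l):
    """Divide the high-frequency evaluation value H by the low-frequency
    evaluation value L; the result approaches 1 near the in-focus point."""
    if eval_l == 0:
        return 0.0  # guard against a blank signal; illustrative choice
    return eval_h / eval_l
```

For example, with H = 95 and L = 100 the degree of focus is 0.95, suggesting the lens is close to the in-focus position.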

<Region specifying process>
Next, the region specifying process will be described. FIG. 11 is a flowchart showing the region specifying process. The purpose of the region specifying process is to identify one of three regions, "region 0", "region 1", and "region 2", according to the change tendency of the first- and second-order differential outputs of the focus evaluation value shown in the lower part of FIG. 13. In FIG. 13, "region 0" is a region far from focus, "region 1" is relatively close to focus, and "region 2" is in the vicinity of the in-focus point.

  First, in step S1100, the system control unit 115 reads the determination result of the exposure condition determination process described above with reference to FIG. 6. Next, in step S1101, the system control unit 115 determines whether the result read in step S1100 is low illuminance. If it is determined in step S1101 that the scene is not low illuminance ("No"), the process proceeds to step S1102. In step S1102, the system control unit 115 selects and sets the threshold values for normal illuminance (D1 threshold and D2 threshold) used when judging the magnitudes of the first- and second-order differential outputs described later.

  On the other hand, if it is determined in step S1101 that the scene is low illuminance ("Yes"), the process proceeds to step S1103. In step S1103, the system control unit 115 selects and sets the threshold values for low illuminance (D1 threshold and D2 threshold). That is, the threshold values used for the determination are switched between the normal illuminance scene and the low illuminance scene. In a low illuminance scene, as shown in FIG. 15, the calculated focus evaluation value tends to change only gradually, because it is easily affected by the noise components included in the imaging signal while the original signal components are reduced. Therefore, the threshold values are switched between the normal illuminance scene and the low illuminance scene in consideration of the influence on the first- and second-order differential outputs calculated from the focus evaluation value.

Next, in step S1104, the system control unit 115 determines whether or not the immediately preceding determination result is "region 0". The immediately preceding determination result is stored in the DRAM 113 and is initialized to "region 0" when the hill-climbing operation is started, before the region specifying process is performed. If the immediately preceding determination result is "region 0", the process proceeds to step S1105.
In step S1105, the system control unit 115 reads the previously obtained first-order differential output D1 and determines whether its absolute value is larger than the threshold value set in step S1102 or step S1103. If it is larger than the threshold value ("Yes"), the process proceeds to step S1106, and the system control unit 115 reads the previously obtained second-order differential output D2 and determines whether its absolute value is larger than the threshold value set in step S1102 or step S1103. If the conditions of steps S1105 and S1106 are both satisfied, in step S1107 the system control unit 115 determines that the current lens position belongs to "region 1" and stores the result. In step S1108, the system control unit 115 counts up and stores a continuation count indicating how long the lens position has remained in "region 1". This continuation count is also taken into account in step S1106: once "region 1" has continued for a certain period or longer, the second-order differential output D2 falling below the threshold value is accepted, because the second-order differential output decreases toward 0 near the peak of the first-order differential output.

  On the other hand, if the value is equal to or less than the threshold value in step S1105 ("No"), the process proceeds to step S1109, and the system control unit 115 determines that the current lens position belongs to "region 0" and stores the result. Likewise, if the value is equal to or less than the threshold value in step S1106 ("No"), the process proceeds to step S1109, and the system control unit 115 determines that the current lens position belongs to "region 0" and stores the result. That is, the system control unit 115 determines "region 0" when the first- and second-order differential outputs are equal to or less than the thresholds, determines "region 1" when they are larger than the thresholds, and also stores the total period during which the lens position has belonged to "region 1". This continuation counter is cleared when the region switches, and is also cleared when it is initialized to "region 0" at the start of the hill-climbing operation, before the region specifying process is executed.

  If it is determined in step S1104 that the immediately preceding determination result is not "region 0" ("No"), that is, if a determination result other than "region 0" has already been obtained in a past determination, the process proceeds to step S1110. In step S1110, the system control unit 115 determines whether the immediately preceding result is "region 1" and the continuation count has continued for a certain period or longer. That is, it determines whether the first- and second-order differential outputs have stably exceeded the threshold values for a certain period; if this condition is satisfied, the process proceeds to step S1111. If the condition is not satisfied, the process proceeds to step S1105, and the system control unit 115 again determines whether the lens position belongs to "region 0" or "region 1".

In step S1111, the system control unit 115 determines whether or not the second-order differential output D2 is 0 or less, using the property that the second-order differential output becomes 0 at the position where the first-order differential output is maximum. When this condition is satisfied, in step S1112 the system control unit 115 stores the determination result as "region 2" and ends the process.
As described above, the system control unit 115 performs the region specifying process and determines whether the lens position during the hill-climbing operation belongs to "region 0", "region 1", or "region 2".
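One iteration of the region decision can be modeled as below. This is an interpretive sketch of FIG. 11; the threshold values, the minimum run length, and the exact tolerance rule for a small |D2| are assumptions, not values from the patent.

```python
def identify_region(d1, d2, prev_region, run_count,
                    d1_thresh, d2_thresh, min_run=5):
    """Return (region, run_count) for the current sample.

    region 0: |D1| or |D2| below threshold (far from focus)
    region 1: both outputs above threshold; run_count tracks how long
              region 1 has persisted, and once it persists a small |D2|
              is tolerated (D2 approaches 0 near the D1 peak)
    region 2: after a stable region-1 run, D2 <= 0 (near focus)
    """
    if prev_region != 0 and run_count >= min_run and d2 <= 0:
        return 2, run_count                    # steps S1111/S1112
    if abs(d1) > d1_thresh and (abs(d2) > d2_thresh or run_count >= min_run):
        return 1, run_count + 1                # steps S1107/S1108
    return 0, 0                                # step S1109; counter cleared
```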

<Focus speed selection process (normal illuminance)>
Next, focus speed selection processing (normal illuminance) will be described. FIG. 12 is a flowchart showing focus speed selection processing (normal illuminance).
First, in step S1200, the system control unit 115 reads from the DRAM 113 the degree of focus calculated and stored in step S503 of FIG. 5. In step S1201, the system control unit 115 reads out the result of the shape prediction process determined in step S502 of FIG. 5.

In step S1202, the system control unit 115 determines whether or not the current lens position belongs to "region 2". If the current lens position belongs to "region 2" ("Yes"), the process proceeds to step S1204, and the system control unit 115 sets the driving speed of the focus lens to a predetermined low speed.
On the other hand, if it is determined in step S1202 that the current lens position does not belong to "region 2" ("No"), the process proceeds to step S1203, where the system control unit 115 determines whether or not the current lens position belongs to "region 1".

If the current lens position belongs to "region 1" in step S1203 ("Yes"), the process proceeds to step S1205, and the system control unit 115 calculates and sets the driving speed of the focus lens according to the degree of focus read in step S1200.
On the other hand, if it is determined in step S1203 that the current lens position does not belong to "region 1" ("No"), the process proceeds to step S1206, and the system control unit 115 sets the driving speed of the focus lens to a predetermined high speed.

  The setting of the driving speed so far will be described with reference to FIG. 13. First, in "region 0" shown in FIG. 13, the shape of the mountain formed by the focus evaluation values hardly changes, and the levels of the first- and second-order differential outputs are low. Meanwhile, when a high-frequency signal component is extracted, the result tends to be easily affected by the noise components included in the imaging signal. In other words, when the influence of the noise components is stronger than that of the original signal components, such as when the image is far out of focus, the high-frequency components generated by the noise can cause the output to be higher than when a low-frequency signal component is extracted. For this reason, the degree of focus obtained by dividing the focus evaluation value H by the focus evaluation value L and normalizing may become high without being linked to the actual focus state (see both ends of the focus degree graph shown in FIG. 13).

  Therefore, in the present embodiment, the determination result of the shape prediction process is prioritized over the degree of focus in "region 0": in this region, regardless of the calculated degree of focus, the driving speed is set to a predetermined high speed, as shown by the focus speed in the upper part of FIG. 13. Next, in "region 1" shown in FIG. 13, the first- and second-order differential outputs are larger than the predetermined threshold values, and a change in the focus evaluation value can be expected to some extent. Therefore, if it is determined that the current position belongs to this region, the driving speed is set according to the degree of focus, as shown by the focus speed in FIG. 13. For example, at or below a predetermined degree of focus (focus degree Low) a predetermined high speed is set, and at or above a predetermined degree of focus (focus degree High) a predetermined low speed is set; between focus degree Low and focus degree High, the system control unit 115 obtains the driving speed by linear interpolation between the high speed and the low speed according to the degree of focus. Finally, since "region 2" is near the in-focus position, the driving speed of the focus lens is set to a predetermined low speed.

In this way, by using both the positional relationship between the lens position and the in-focus point determined from the shape prediction process and the degree of focus obtained from the focus evaluation values, the driving speed of the focus lens is controlled appropriately, the amount by which the lens passes the in-focus point can be suppressed, and a better image can be obtained.
As described above, the system control unit 115 performs the focus speed selection process (normal illuminance).
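The normal-illuminance speed rule, including the linear interpolation between the focus degree Low and High thresholds, can be sketched as follows. All speeds and thresholds are illustrative placeholders, not values from the patent.

```python
def select_speed_normal(region, degree,
                        high=4.0, low=0.5,       # hypothetical speed values
                        deg_low=0.3, deg_high=0.8):
    """FIG. 12: region 0 -> fixed high speed, region 2 -> fixed low speed;
    in region 1 the speed is interpolated from the degree of focus."""
    if region == 2:
        return low
    if region != 1:          # region 0: shape prediction overrides the degree
        return high
    if degree <= deg_low:
        return high
    if degree >= deg_high:
        return low
    t = (degree - deg_low) / (deg_high - deg_low)
    return high + t * (low - high)   # linear interpolation between the two
```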

<Focus speed selection process (low illumination)>
Next, focus speed selection processing (low illumination) will be described. FIG. 14 is a flowchart showing focus speed selection processing (low illumination).
First, in step S1400, the system control unit 115 reads from the DRAM 113 the degree of focus calculated and stored in step S503 of FIG. 5. In step S1401, the system control unit 115 reads out the result of the shape prediction process determined in step S502 of FIG. 5.

In step S1402, the system control unit 115 determines whether or not the current lens position belongs to "region 2". If the current lens position belongs to "region 2" ("Yes"), the process proceeds to step S1403, and the system control unit 115 sets the driving speed of the focus lens to a predetermined low speed.
On the other hand, if it is determined in step S1402 that the current lens position does not belong to "region 2" ("No"), the process proceeds to step S1404, and the system control unit 115 sets the driving speed of the focus lens to a predetermined high speed.

  The setting of the driving speed so far will be described with reference to FIG. 15. The flowchart shown in FIG. 14 is control for a low illuminance scene and differs from the normal illuminance control. A focus evaluation value calculated by extracting a high-frequency signal component is more easily affected by the noise components included in the imaging signal than a focus evaluation value calculated by extracting a lower-frequency signal component. In particular, when the influence of the noise components is stronger than that of the original signal components, as in a low illuminance scene, the high-frequency components created by the noise can cause the high-frequency evaluation value to be output at a high value regardless of the actual focus position. Therefore, as shown in the focus degree graph of FIG. 15, the degree of focus obtained by dividing the focus evaluation value H by the focus evaluation value L and normalizing becomes a high value without being linked to the actual focus state.

Therefore, in the focus speed selection process (low illuminance), when the shape prediction process using the first- and second-order differential outputs determines "region 0" or "region 1", this determination result is given priority over the degree of focus. That is, the system control unit 115 sets the driving speed of the focus lens to a predetermined high speed, as shown in the focus speed graph of FIG. 15. On the other hand, when "region 2" is determined, the system control unit 115 likewise prioritizes the determination result of the shape prediction process and sets the driving speed of the focus lens to a predetermined low speed.
As described above, the system control unit 115 performs the focus speed selection process (low illuminance).

As described above, in the present embodiment, the priority order between the degree of focus and the prediction result of the shape prediction process, both of which are criteria for setting the driving speed of the focus lens, is switched according to the exposure condition. FIG. 16 compares the differences in the focus speed setting process between normal illuminance and low illuminance in the present embodiment: FIG. 16A shows the normal illuminance case and FIG. 16B the low illuminance case. At normal illuminance, the driving speed of the focus lens is set based on both the determination result of the shape prediction process and the degree of focus, which are either highly reliable in themselves or judged reliable in light of the other determination results. At low illuminance, on the other hand, since the reliability of the degree of focus may be lowered, the driving speed of the focus lens is set using the relatively reliable determination result of the shape prediction process.
Therefore, according to the present embodiment, even when the luminance signal obtained from the image sensor contains many noise components and the focus evaluation value is strongly influenced by them, a stable focus adjustment operation is possible because the driving speed of the focus lens is set based on the information that is most reliable at that time.

(Second Embodiment)
Next, a second embodiment will be described with reference to the flowchart of FIG.
In the first embodiment, whether the scene is a normal illuminance scene or a low illuminance scene is determined based on the gain amount of the nonlinear amplification circuit in the imaging processing unit 109, but it may also be determined based on other exposure conditions. The second embodiment shows an example that uses the subject brightness, which indicates the brightness of the subject and is calculated by the AE processing unit 103.

First, in step S1700, the system control unit 115 acquires the current subject brightness from the AE processing unit 103. In this process, the subject brightness indicating the brightness of the subject is obtained from the exposure conditions calculated as a result of the exposure control performed by the AE processing unit 103.
Next, in step S1701, the system control unit 115 determines whether it is a low-light scene. Specifically, the system control unit 115 compares the subject brightness reference value with the subject brightness acquired in step S1700, and determines whether or not the current subject brightness is smaller than the reference value. When the current subject brightness is smaller than the reference value (in the case of “Yes”), the process proceeds to step S1702, and the system control unit 115 stores the exposure condition determination result as “low illuminance” in the DRAM 113.

On the other hand, if the current subject brightness is not smaller than the reference value in step S1701 (in the case of “No”), the process proceeds to step S1703, and the system control unit 115 sets the exposure condition determination result to “normal illuminance”. Is stored in the DRAM 113.
As described above, the system control unit 115 performs the exposure condition determination process of the second embodiment.
As described above, any information that can be used to determine whether the scene is a low illuminance scene or a normal illuminance scene may be used in the exposure condition determination process, which is not limited to the methods of the first and second embodiments. For example, the determination may be based on both the gain amount and the subject brightness.
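Such a combined judgment can be sketched as follows; both thresholds are hypothetical placeholders, not values from the patent.

```python
def judge_exposure_combined(gain_db, subject_brightness,
                            gain_thresh=18.0, brightness_thresh=50.0):
    """Judge low illuminance from either a high AE gain amount or a low
    subject brightness, combining the criteria of the first and second
    embodiments."""
    if gain_db > gain_thresh or subject_brightness < brightness_thresh:
        return "low illuminance"
    return "normal illuminance"
```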

(Third embodiment)
Next, a third embodiment will be described.
In the first embodiment, the focus speed selection process sets the driving speed to a predetermined low speed in "region 2", that is, near the in-focus point. This is to suppress the amount by which the lens passes the in-focus point during the hill-climbing operation.
The third embodiment further suppresses the amount by which the lens passes the in-focus point. Specifically, in the focus speed selection processes shown in FIGS. 12 and 14, when the determination result is "region 2" and the driving speed would be set to a predetermined low speed, the predetermined low speed is changed to 0 (= stop).

Further, the flowchart of the focus adjustment operation shown in FIG. 2 is changed to the flowchart shown in FIG. The main differences between the flowcharts of FIG. 2 and FIG. 18 are Step S1804, Step S1808, and Step S1809.
Hereinafter, processing of related parts from step S1803 will be described.
First, in step S1803, the focus lens control unit 104 performs a hill climbing operation. The hill-climbing operation is an operation for moving the focus lens at a higher speed than wobbling in a direction in which the focus evaluation value increases.
In step S1804, the system control unit 115 determines whether the in-focus point has been detected by passing the vertex of the focus evaluation value during the hill-climbing operation of step S1803. If it is determined in step S1804 that the vertex of the focus evaluation value has been passed ("Yes"), the process proceeds to step S1800, and the focus lens control unit 104 converges on the in-focus position by a wobbling operation controlled with a fine movement amount.

On the other hand, if it is not determined in step S1804 that the vertex of the focus evaluation value has been passed ("No"), the process proceeds to step S1808. In step S1808, the system control unit 115 determines whether the drive speed has been set to 0 (= stop) by the focus lens speed selection process performed in step S1803. If the drive speed is other than 0, the process returns to step S1803 and the hill-climbing operation continues. If the drive speed is 0, that is, if the region was determined to be "region 2" in the shape prediction process and the lens is highly likely to be near the in-focus point, the process proceeds to step S1809 and the system control unit 115 stops the focus lens. The process then returns to step S1800, where the focus lens control unit 104 starts the wobbling operation.

Accordingly, in the focus lens speed selection process shown in FIGS. 12 and 14, when it is detected that the lens belongs to "region 2", the hill-climbing operation stops just before the lens passes the in-focus point, and from there the lens converges to the in-focus point by the wobbling operation. A stable focus adjustment operation near the in-focus point can thereby be performed in both normal-illumination and low-illumination scenes.
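Under a hypothetical lens-control interface, the FIG. 18 flow described above could be sketched as:

```python
def focus_adjustment(lens):
    """Sketch of the FIG. 18 flow: hill-climb until either the
    evaluation-value peak is passed (S1804) or the speed selection
    returns 0 near the peak (S1808), then stop (S1809) and converge
    by wobbling (S1800). The lens interface is an assumption."""
    while True:
        speed = lens.hill_climb_step()   # S1803: hill-climbing + speed selection
        if lens.passed_peak():           # S1804: vertex of evaluation value passed?
            break                        # -> converge by wobbling
        if speed == 0:                   # S1808: "region 2", near in-focus point
            lens.stop()                  # S1809: stop before passing the peak
            break
    lens.wobble()                        # S1800: fine-movement wobbling operation
```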

Although the present invention has been described in detail based on preferred embodiments thereof, the present invention is not limited to these specific embodiments, and various forms within the scope of the present invention are also included therein. Parts of the above-described embodiments may be combined as appropriate.

The present invention also covers the case where a software program that realizes the functions of the above-described embodiments is supplied, from a recording medium or directly via wired/wireless communication, to a system or apparatus having a computer capable of executing the program, and the program is executed. Accordingly, the program code itself, supplied to and installed in the computer in order to implement the functional processing of the present invention on that computer, also realizes the present invention; that is, the computer program for realizing the functional processing of the present invention is itself included in the present invention. In this case, the program may take any form having the functions of a program, such as object code, a program executed by an interpreter, or script data supplied to an OS. As a recording medium for supplying the program, for example, a magnetic recording medium such as a hard disk or magnetic tape, an optical/magneto-optical storage medium, or a nonvolatile semiconductor memory may be used. As a program supply method, the computer program forming the present invention may be stored on a server on a computer network, and a connected client computer may download and execute it.

  101: photographing lens 104: focus lens control unit 105: AF processing unit 108: imaging device 109: imaging processing unit 115: system control unit

Claims (9)

  1. An imaging device that controls the position of a focus lens based on an imaging signal acquired by an imaging unit by imaging a subject,
    A focus evaluation value calculating means for extracting a specific frequency component from the imaging signal and calculating a focus evaluation value indicating a contrast of the imaging signal;
    A focus degree calculating means for calculating a focus degree indicating a degree of focus using the focus evaluation value;
    Shape prediction means for predicting the shape of the peak of the focus evaluation value using the amount of change of the focus evaluation value for each fixed period;
    An imaging apparatus comprising: a focus speed setting unit that sets a drive speed of the focus lens based on a focus degree calculated by the focus degree calculation unit and a result obtained by the shape prediction unit.
  2. The imaging apparatus according to claim 1, further comprising exposure condition detection means for detecting an exposure condition when imaging the subject,
    wherein the focus speed setting means sets the drive speed of the focus lens by switching, according to the exposure condition detected by the exposure condition detection means, the priority between the focus degree and the result obtained by the shape prediction means that is used as the determination criterion when setting the drive speed.
  3. The imaging apparatus according to claim 1, wherein the focus degree calculating means calculates the focus degree by normalization using a high-frequency signal component and a low-frequency signal component extracted from the imaging signal by the focus evaluation value calculating means.
  4. The imaging apparatus according to claim 1, wherein the shape prediction means obtains first and second derivative outputs of the focus evaluation value and predicts a focus state based on a change in each derivative output.
  5. The imaging apparatus according to any one of claims 2 to 4, wherein the exposure condition detected by the exposure condition detection means is a gain amount for keeping the level of the imaging signal constant,
    and wherein the focus speed setting means sets the drive speed of the focus lens by giving priority to the result obtained by the shape prediction means over the focus degree when the gain amount is larger than a predetermined value.
  6. The imaging apparatus according to any one of claims 2 to 4, wherein the exposure condition detected by the exposure condition detection means is a subject brightness indicating the brightness of the subject,
    and wherein the focus speed setting means sets the drive speed of the focus lens by giving priority to the result obtained by the shape prediction means over the focus degree when the subject brightness becomes darker than a predetermined brightness.
  7. The imaging apparatus according to claim 1, further comprising:
    first focus adjustment means for acquiring a focus evaluation value after moving the focus lens intermittently, and detecting an in-focus point;
    second focus adjustment means for acquiring a focus evaluation value while moving the focus lens continuously, and detecting an in-focus point; and
    switching means for switching to a focus adjustment operation by the first focus adjustment means when the drive speed is set to 0 by the focus speed setting means while the in-focus point is being detected by the second focus adjustment means.
  8. A focus adjustment method for an imaging apparatus that controls the position of a focus lens based on an imaging signal acquired by an imaging unit by imaging a subject,
    A focus evaluation value calculating step of extracting a specific frequency component from the imaging signal and calculating a focus evaluation value indicating a contrast of the imaging signal;
    A focus degree calculating step for calculating a focus degree indicating a degree of focus using the focus evaluation value;
    A shape prediction step for predicting a shape of a focus evaluation value peak using a change amount of the focus evaluation value for each fixed period;
    and a focus speed setting step of setting a drive speed of the focus lens based on the focus degree calculated in the focus degree calculating step and the result of the shape prediction step.
  9. A program for focus adjustment of an imaging device that controls the position of a focus lens based on an imaging signal acquired by an imaging unit by imaging a subject,
    A focus evaluation value calculating step of extracting a specific frequency component from the imaging signal and calculating a focus evaluation value indicating a contrast of the imaging signal;
    A focus degree calculating step for calculating a focus degree indicating a degree of focus using the focus evaluation value;
    A shape prediction step for predicting a shape of a focus evaluation value peak using a change amount of the focus evaluation value for each fixed period;
    A program for causing a computer to execute a focus speed setting step of setting a drive speed of the focus lens based on a focus degree calculated by the focus degree calculation step and a result of the shape prediction step.
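As an illustration of the claimed calculations, a minimal sketch of the focus degree normalization (claim 3) and the first and second derivative outputs used for shape prediction (claim 4) might look like the following (the function names and the exact form of the normalization are assumptions; the patent does not give formulas):

```python
def focus_degree(high_freq, low_freq, eps=1e-9):
    """Claim 3 sketch: normalize the high-frequency component of the
    imaging signal by the low-frequency component, so the degree of
    focus is less sensitive to overall brightness. The epsilon guard
    against division by zero is an implementation assumption."""
    return high_freq / (low_freq + eps)

def derivatives(values):
    """Claim 4 sketch: first and second discrete differences of the
    focus evaluation value sequence (one sample per fixed period),
    whose changes can be used to predict the shape of the peak."""
    first = [b - a for a, b in zip(values, values[1:])]
    second = [b - a for a, b in zip(first, first[1:])]
    return first, second
```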
JP2011128551A 2011-06-08 2011-06-08 Imaging device, focus adjustment method, and program Active JP5762156B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011128551A JP5762156B2 (en) 2011-06-08 2011-06-08 Imaging device, focus adjustment method, and program


Publications (3)

Publication Number Publication Date
JP2012255896A true JP2012255896A (en) 2012-12-27
JP2012255896A5 JP2012255896A5 (en) 2014-07-17
JP5762156B2 JP5762156B2 (en) 2015-08-12

Family

ID=47527519

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011128551A Active JP5762156B2 (en) 2011-06-08 2011-06-08 Imaging device, focus adjustment method, and program

Country Status (1)

Country Link
JP (1) JP5762156B2 (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02280579A (en) * 1989-04-21 1990-11-16 Canon Inc Automatic focusing device
JPH05145827A (en) * 1991-05-02 1993-06-11 Canon Inc Automatic focusing controller
JPH0686143A (en) * 1992-08-31 1994-03-25 Canon Inc Autofocus controller
JPH07143387A (en) * 1993-11-19 1995-06-02 Fuji Photo Film Co Ltd Auto focusing device
JP2001255457A (en) * 2000-03-10 2001-09-21 Sony Corp Image pickup device
JP2003279845A (en) * 2002-03-22 2003-10-02 Ricoh Co Ltd Imaging device
JP2004070037A (en) * 2002-08-07 2004-03-04 Matsushita Electric Ind Co Ltd Auto-focusing device
JP2005345948A (en) * 2004-06-07 2005-12-15 Canon Inc Automatic focusing device
JP2009163063A (en) * 2008-01-08 2009-07-23 Canon Inc Focus adjustment device, optical equipment using the same, and control method
JP2009198574A (en) * 2008-02-19 2009-09-03 Canon Inc Focusing apparatus and method for controlling the same

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014188884A1 (en) * 2013-05-22 2014-11-27 株式会社 日立産業制御ソリューションズ Imaging device and back focus adjustment method
JP2014228695A (en) * 2013-05-22 2014-12-08 株式会社 日立産業制御ソリューションズ Imaging apparatus and back focus adjustment method
JP2015040922A (en) * 2013-08-20 2015-03-02 キヤノン株式会社 Imaging apparatus and its control method, program and storage medium
JP2015079157A (en) * 2013-10-17 2015-04-23 キヤノン株式会社 Image capturing device and control method therefor
JP2015094859A (en) * 2013-11-12 2015-05-18 キヤノン株式会社 Focus adjustment device, imaging device, focus adjustment method and program
JP2016197215A (en) * 2015-04-06 2016-11-24 オリンパス株式会社 Focus adjustment device and method for controlling focus adjustment device
US10367991B2 (en) 2015-04-06 2019-07-30 Olympus Corporation Focus adjustment device and control method of focus adjustment device

Also Published As

Publication number Publication date
JP5762156B2 (en) 2015-08-12

Similar Documents

Publication Publication Date Title
US9277112B2 (en) Auto focusing apparatus and auto focusing method, and image sensing apparatus
US8416338B2 (en) Imaging device and imaging method
US7869704B2 (en) Focus adjusting device, image pickup apparatus, and focus adjustment method
US8106965B2 (en) Image capturing device which corrects a target luminance, based on which an exposure condition is determined
JP4976160B2 (en) Imaging device
US7778539B2 (en) Optical apparatus
JP5088118B2 (en) Focus adjustment device
US9883095B2 (en) Focus adjusting device and focus adjusting program with control unit to guide a light image based upon detected distributions
JP4858849B2 (en) Imaging apparatus and program thereof
KR101369062B1 (en) Motion information assisted 3a techniques
US9049363B2 (en) Digital photographing apparatus, method of controlling the same, and computer-readable storage medium
JP4957943B2 (en) Imaging apparatus and program thereof
JP5486317B2 (en) Operation of double lens camera to expand image
US10027877B2 (en) Image pickup apparatus to perform scanning of focus lens
US8107806B2 (en) Focus adjustment apparatus and focus adjustment method
CN1940700B (en) Camera with autofocus system
JP6091228B2 (en) Image processing apparatus and imaging apparatus
JP5954336B2 (en) Image processing apparatus, image processing method, and recording medium
JP5221931B2 (en) Imaging apparatus and control method thereof
US10162248B2 (en) Automatic focusing apparatus and control method therefor
US10349028B2 (en) Image pickup apparatus that displays image based on signal output from image pickup device, method of controlling the same, and storage medium
JP5051812B2 (en) Imaging apparatus, focusing method thereof, and recording medium
CN104519276A (en) Image capturing apparatus and control method thereof
JP5144487B2 (en) Main face selection device, control method thereof, imaging device, and program
JP2008052225A (en) Camera, focus control method, and program

Legal Events

Date      Code  Title (Description)
20140530  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
20140530  A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
20150227  A977  Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
20150303  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
20150420  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
          TRDD  Decision of grant or rejection written
20150512  A01   Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
20150609  A61   First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)