JP2004246160A - Autofocus camera - Google Patents

Autofocus camera

Info

Publication number
JP2004246160A
Application number
JP2003036824A
Authority
JP (Japan)
Prior art keywords
focus
step
threshold
camera
focus area
Legal status
Granted; published as JP3949067B2; now Expired - Fee Related
Inventor
Kazuhiko Arii (有井 和彦)
Original Assignee
Sanyo Electric Co Ltd (三洋電機株式会社)

Abstract

PROBLEM TO BE SOLVED: To provide an autofocus camera that can adjust the focus correctly regardless of the attitude of the camera.
SOLUTION: When deciding which of the focus areas 0 to 4 arranged in the object scene is to be used as the effective focus area for focus adjustment, the maximum focus evaluation values obtained in the focus areas 0 to 4 are compared with thresholds assigned respectively to the focus areas 0 to 4. At this time, the assignment destinations of the thresholds are changed based on the attitude of the camera. Specifically, the focus areas 0 to 4 are arranged radially around the center of the object scene, and the assignment destinations of the thresholds are rotated according to the camera attitude. By changing the assignment destinations of the thresholds based on the camera attitude in this manner, the autofocus camera can adjust the focus correctly regardless of its attitude.
COPYRIGHT: (C)2004,JPO&NCIPI

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to an autofocus camera, and more particularly, to an autofocus camera that adjusts a focus based on, for example, one of a plurality of focus areas assigned to an object scene.
[0002]
[Prior art]
As a conventional autofocus camera of this type, there is one that performs so-called hill-climbing AF for each of a plurality of focus areas assigned to the object field, specifies, among the focus areas whose maximum focus evaluation value exceeds a threshold, the focus area on the closest side, and sets the focus lens at the in-focus position of the specified focus area.
[0003]
[Problems to be solved by the invention]
However, in the related art, since the thresholds are fixed, the focus may, depending on the attitude of the camera, be adjusted based on a focus area in which no main subject exists.
[0004]
Therefore, a main object of the present invention is to provide an autofocus camera capable of accurately adjusting the focus regardless of the attitude of the camera.
[0005]
[Means for Solving the Problems]
An autofocus camera according to the present invention is an autofocus camera that determines a focus area to be used for focus adjustment by comparing a plurality of focus evaluation values obtained in a plurality of focus areas arranged in an object scene with a plurality of thresholds respectively assigned to the plurality of focus areas. The camera includes a detection unit that detects the camera attitude and a first changing unit that changes the assignment destinations of the plurality of thresholds based on the detection result of the detection unit.
[0006]
[Action]
When a focus area to be used for focus adjustment is determined from among the plurality of focus areas arranged in the object scene, the plurality of focus evaluation values obtained in the plurality of focus areas are compared with the plurality of thresholds respectively assigned to those focus areas. Here, the assignment destinations of the plurality of thresholds are changed by the first changing unit based on the detection result of the detecting unit that detects the camera posture.
[0007]
By changing the assignment destinations of the plurality of thresholds based on the camera posture, the focus can be adjusted accurately regardless of the posture of the camera.
[0008]
Preferably, the detecting means detects the posture of the camera at the time when the focus adjustment is instructed. As a result, it is not necessary to always detect the camera attitude, and the processing load can be reduced.
[0009]
Preferably, the plurality of focus areas are radially arranged with reference to the center of the object scene. At this time, the first changing unit rotates the assignment destination of the plurality of thresholds according to the posture of the camera.
[0010]
More preferably, the plurality of focus areas are arranged with a predetermined number in the vertical direction and a predetermined number in the horizontal direction, the detecting means determines whether the camera posture is the upright state or a 90° tilted state, and the first changing means rotates the assignment destinations of the plurality of thresholds by 90°. By arranging the focus areas in this way, the thresholds can be assigned accurately even if the detecting means can only distinguish the upright state from the 90° tilted state.
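The following sketch illustrates the rotation of threshold assignments described above, using the area layout of the embodiment below (0 = center, 1 = left, 2 = right, 3 = bottom, 4 = top of the upright frame). The function name, the posture labels, and the concrete rotation mapping are illustrative assumptions chosen so that the result reproduces the setting tables of FIGS. 3 to 5; they are not taken verbatim from the patent.

```python
# Sketch of rotating threshold assignments with the camera posture.
# Area layout of the embodiment: 0 = center, 1 = left, 2 = right,
# 3 = bottom, 4 = top of the object scene (upright state).

def rotate_thresholds(base, posture):
    """Return thresholds C[0]..C[4] re-assigned for the detected posture.

    base:    dict {area: threshold} defined for the upright state
    posture: "upright", "right90" or "left90" (illustrative labels)
    """
    if posture == "upright":
        return dict(base)
    # The center area 0 keeps its own threshold; the thresholds of the
    # peripheral areas 1-4 are rotated. The mappings below (new area <- old
    # area) are assumptions chosen to match FIGS. 3-5.
    if posture == "right90":
        mapping = {1: 4, 2: 3, 3: 1, 4: 2}
    else:  # "left90"
        mapping = {1: 3, 2: 4, 3: 2, 4: 1}
    return {0: base[0], **{new: base[old] for new, old in mapping.items()}}
```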
[0011]
Preferably, the numerical value indicated by each of the plurality of thresholds is changed by a second changing unit in accordance with the result of a judging unit that judges the object scene. This enables accurate focus adjustment according to the object scene.
[0012]
【The invention's effect】
According to the present invention, the assignment destinations of the plurality of thresholds are changed based on the camera attitude, so that the focus can be adjusted accurately regardless of the attitude of the camera.
[0013]
The above and other objects, features and advantages of the present invention will become more apparent from the following detailed description of embodiments with reference to the drawings.
[0014]
【Example】
Referring to FIG. 1, a digital camera 10 according to this embodiment includes a focus lens 12 and an aperture mechanism 14. The optical image of the object scene enters the light receiving surface of the CCD type image sensor 16 via these members. On the light receiving surface, electric charges corresponding to the optical image are generated by photoelectric conversion.
[0015]
When the LCD 40 is activated, a through-image display process is executed. First, the CPU 24 instructs a TG (Timing Generator) 18 to perform whole thinning-out reading. The TG 18 generates the vertical synchronizing signal Vsync once every 1/30 second and reads out charges from the image sensor 16 in a thinned-out manner at a cycle of 30 fps. From the image sensor 16, a low-resolution raw image signal corresponding to the entire object scene is output at a rate of one frame per 1/30 second.
[0016]
The CDS/AGC circuit 26 performs well-known noise removal and level adjustment on the raw image signal of each frame output from the image sensor 16. The raw image signal thus processed is converted into digital raw image data by the A/D converter 28, and the converted raw image data is subjected to a series of processes such as color separation, white balance adjustment, gamma correction, and YUV conversion by the signal processing circuit 30, which outputs YUV data. The YUV data of each frame is written to the SDRAM 34 by the memory controller 32 and thereafter read from the SDRAM 34 by the same memory controller 32. The read YUV data is converted by the video encoder 38 into a composite image signal of the NTSC system, and the converted composite image signal is provided to the LCD 40. As a result, a real-time moving image (through image) of the subject is displayed on the LCD screen.
[0017]
Of the YUV data generated by the signal processing circuit 30, the Y data is also input to the AE/AF evaluation circuit 42. The AE/AF evaluation circuit 42 integrates the input Y data for each frame to calculate luminance evaluation values Iy[i] (i: block numbers 0 to 255) representing the brightness of the subject, and also integrates the high-frequency component of the input Y data for each frame to calculate focus evaluation values Ih[j] (j: focus area numbers 0 to 4) indicating the degree of focusing of the focus lens 12.
[0018]
Specifically, as shown in FIG. 2, the AE/AF evaluation circuit 36 divides the object scene, that is, the screen, into 16 parts in each of the horizontal and vertical directions, assigns block numbers "0" to "255" to the divided blocks in raster-scan order, and integrates the Y data for each block to calculate 256 luminance evaluation values Iy[0] to Iy[255]. The AE/AF evaluation circuit 36 also assigns five focus areas to the object scene, assigns focus area numbers "0" to "4" to them, and integrates the high-frequency component of the Y data for each focus area to calculate focus evaluation values Ih[0] to Ih[4].
[0019]
Note that the focus area 0 is formed by the eight blocks "103", "104", "119", "120", "135", "136", "151", and "152" located substantially at the center of the object field. The focus area 1 is formed by the eight blocks "99", "100", "115", "116", "131", "132", "147", and "148" located on the left side of the object field. The focus area 2 is formed by the eight blocks "107", "108", "123", "124", "139", "140", "155", and "156" located on the right side of the object field. The focus area 3 is formed by the eight blocks "182" to "185" and "198" to "201" located in the lower part of the object field, and the focus area 4 is formed by the eight blocks "54" to "57" and "70" to "73" located in the upper part of the object field.
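As an illustration only (in the camera this computation is performed in hardware by the AE/AF evaluation circuit), the following sketch shows how the 256 luminance evaluation values Iy[i] and the five focus evaluation values Ih[j] could be derived from one frame of Y data. The function names, the use of NumPy, and the simple horizontal difference used as a stand-in for the high-frequency component are assumptions for the example; the block membership follows the description above.

```python
# Illustrative computation of Iy[0]..Iy[255] and Ih[0]..Ih[4] from one frame.
import numpy as np

FOCUS_AREAS = {
    0: [103, 104, 119, 120, 135, 136, 151, 152],        # center
    1: [99, 100, 115, 116, 131, 132, 147, 148],         # left
    2: [107, 108, 123, 124, 139, 140, 155, 156],        # right
    3: list(range(182, 186)) + list(range(198, 202)),   # lower part
    4: list(range(54, 58)) + list(range(70, 74)),       # upper part
}

def evaluate(y):
    """y: 2-D array of Y (luminance) samples for one frame."""
    h, w = y.shape
    bh, bw = h // 16, w // 16
    iy = np.empty(256)
    hf_blocks = np.empty(256)
    hf = np.abs(np.diff(y.astype(np.int32), axis=1))  # crude high-frequency proxy
    for i in range(256):
        r, c = divmod(i, 16)                          # raster-scan block numbering
        iy[i] = y[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw].sum()
        hf_blocks[i] = hf[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw - 1].sum()
    ih = {j: float(sum(hf_blocks[b] for b in blocks))
          for j, blocks in FOCUS_AREAS.items()}
    return iy, ih
```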
[0020]
When the shutter button 44 is half-pressed, the attitude of the digital camera 10 is determined based on the output of the tilt sensor 46. As a result, it is specified whether the digital camera 10 is in the upright state, the 90 ° tilted state to the right, or the 90 ° tilted state to the left.
[0021]
Subsequently, the luminance evaluation values Iy[0] to Iy[255] output from the AE/AF evaluation circuit 42 are fetched by the CPU 24, and the optimal exposure period Ts and the optimal aperture amount As for the main exposure are obtained based on them. In addition, the driving method of the image sensor 16 is determined according to the brightness of the object scene, and an AF exposure period matching the determined driving method is obtained: the AF exposure period Taf1 when the whole thinning-out reading is maintained, and the AF exposure period Taf2 when the driving method is changed from the whole thinning-out reading to the partial readout.
[0022]
In the whole thinning-out reading, as described above, the entire light receiving surface of the image sensor 16 is used as the reading area, and the charges are read out in a thinned-out manner at a cycle of 30 fps. In the partial readout, on the other hand, the central area shown in FIG. 2 of the light receiving surface of the image sensor 16 is used as the reading area, and the charges are read out at a cycle of 60 fps. In the partial readout, the vertical synchronization signal Vsync is output from the TG 18 once every 1/60 second.
[0023]
Further, thresholds C [0] to C [4] assigned to focus areas 0 to 4 shown in FIG. 2 are determined based on the selection state of the scene selection key 48 and the output of the tilt sensor 46. The six scenes that can be selected by the scene selection key 48 are “portrait scene”, “sports scene”, “landscape scene”, “sunset scene”, “night scene”, and “default scene”.
[0024]
FIGS. 3A to 3C show the setting states of the thresholds C[0] to C[4] when the "portrait scene" is selected. In the upright state shown in FIG. 3A, the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[1], C[2], and C[4], and the threshold THd is set as the threshold C[3]. In the tilted state shown in FIG. 3B, the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[2], C[3], and C[4], and the threshold THd is set as the threshold C[1]. In the tilted state shown in FIG. 3C, the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[1], C[3], and C[4], and the threshold THd is set as the threshold C[2].
[0025]
FIGS. 4A to 4C show the setting states of the thresholds C[0] to C[4] when the "sports scene" is selected. In the upright state shown in FIG. 4A, the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[1], C[2], and C[4], and the threshold THc is set as the threshold C[3]. In the tilted state shown in FIG. 4B, the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[2], C[3], and C[4], and the threshold THc is set as the threshold C[1]. In the tilted state shown in FIG. 4C, the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[1], C[3], and C[4], and the threshold THc is set as the threshold C[2].
[0026]
FIGS. 5A to 5C show the setting states of the thresholds C[0] to C[4] when the "landscape scene", "evening scene", or "night scene" is selected. In the upright state shown in FIG. 5A, the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[1] and C[2], the threshold THc is set as the threshold C[4], and the threshold THd is set as the threshold C[3]. In the tilted state shown in FIG. 5B, the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[3] and C[4], the threshold THc is set as the threshold C[2], and the threshold THd is set as the threshold C[1]. In the tilted state shown in FIG. 5C, the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[3] and C[4], the threshold THc is set as the threshold C[1], and the threshold THd is set as the threshold C[2].
[0027]
FIGS. 6A to 6C show the setting states of the thresholds C[0] to C[4] when the "default scene" is selected. In any of the upright state, the 90° left tilt state, and the 90° right tilt state, the threshold THa is set as the threshold C[0], and the threshold THb is set as the thresholds C[1], C[2], C[3], and C[4].
[0028]
Note that a relationship of THa <THb <THc <THd holds between the threshold values THa, THb, THc, and THd.
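A compact way to express the setting tables of FIGS. 3 to 6 is an upright-state table per scene that is then rotated with the camera posture, as in the hedged sketch below. The numeric values of THa to THd are placeholders that only respect THa < THb < THc < THd, the scene keys are illustrative, and rotate_thresholds() refers to the earlier sketch.

```python
# Upright-state threshold tables per scene (FIGS. 3-6); placeholder values.
THa, THb, THc, THd = 10, 20, 30, 40

UPRIGHT_THRESHOLDS = {
    "portrait":  {0: THa, 1: THb, 2: THb, 3: THd, 4: THb},
    "sports":    {0: THa, 1: THb, 2: THb, 3: THc, 4: THb},
    "landscape": {0: THa, 1: THb, 2: THb, 3: THd, 4: THc},  # also evening/night
    "default":   {0: THa, 1: THb, 2: THb, 3: THb, 4: THb},
}

def thresholds_for(scene, posture):
    base = UPRIGHT_THRESHOLDS[scene]
    if scene == "default":
        return dict(base)                   # FIG. 6: same in every posture
    return rotate_thresholds(base, posture)
```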
[0029]
When the setting of the thresholds C[0] to C[4] is completed, the focus driver 22 moves the focus lens 12 stepwise in the optical axis direction, and the focus evaluation values Ih[0] to Ih[4] obtained from the object field photographed at each step are taken in by the CPU 24. When the image sensor 16 is driven by the whole thinning-out reading method, the focus evaluation values Ih[0] to Ih[4] are taken into the CPU 24 once every 1/30 second; when the image sensor 16 is driven by the partial readout method, they are taken into the CPU 24 once every 1/60 second.
[0030]
The CPU 24 determines any one of the focus areas 0 to 4 as the effective focus area Zc based on the focus evaluation values Ih [0] to Ih [4] obtained in each step. Note that a focus area other than the effective focus area is defined as “invalid focus area”.
[0031]
Specifically, the maximum values of the focus evaluation values Ih[0] to Ih[4] obtained at the respective lens positions are saved as the register values Ih[0]max to Ih[4]max, and the position information fpos of the focus lens 12 obtained when each maximum value is reached is saved as the register values f[0] to f[4].
[0032]
The saved register values Ih[0]max to Ih[4]max are compared with the above-described thresholds C[0] to C[4]. Among the focus areas satisfying the condition Ih[j]max > C[j], the focus area having the largest register value f[j], that is, the focus area whose in-focus position is closest, is determined as the effective focus area Zc. The subject within the determined effective focus area Zc is then brought into focus.
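The selection rule just described can be summarized by the following sketch; the function and argument names are illustrative, and a return value of None stands for the case in which no area passes (handled as "NG" in the flowcharts later).

```python
# Effective-focus-area decision: among the areas whose peak focus evaluation
# value exceeds its threshold, pick the one whose in-focus lens position is
# largest (closest side).
def select_effective_area(ih_max, f, c):
    """ih_max, f, c: dicts indexed by focus area numbers 0-4."""
    passed = [j for j in ih_max if ih_max[j] > c[j]]
    if not passed:
        return None                          # no area usable for focusing
    return max(passed, key=lambda j: f[j])   # nearest in-focus position wins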
[0033]
For example, when the portrait scene is selected and the shutter button 44 is half-pressed in a state where the person Hm1 is captured as shown in FIG. 7A, the focus area 0 is determined as the effective focus area Zc. When the portrait scene is selected and the shutter button 44 is half-pressed in a state where the person Hm2 located far away and the person Hm3 located nearby are captured as shown in FIG. 7B, the focus area 2 is determined as the effective focus area Zc.
[0034]
When the sports scene is selected and the shutter button 44 is half-pressed in a state where the person Hm5 running far away and the person Hm6 running nearby are captured as shown in FIG. 8, the focus area 1 is determined as the effective focus area Zc.
[0035]
When the landscape scene, the evening scene, or the night scene is selected and the shutter button 44 is half-pressed in a state where the house Hs located far away and the person Hm4 located nearby are captured as shown in FIG. 9, the focus area 2 is determined as the effective focus area Zc.
[0036]
When the effective focus area Zc is determined, the CPU 24 displays a frame indicating the effective focus area Zc on the LCD screen. The operator can thus visually grasp which focus area is currently set as the effective focus area Zc.
[0037]
When the image sensor 16 is driven by the partial readout method, the reading of the YUV data stored in the SDRAM 34 is continued, but the writing to the SDRAM 34 of the YUV data based on the raw image signal output from the image sensor 16 is suspended. As a result, the display on the LCD 40 transitions from the through image to a frozen image.
[0038]
When the effective focus area Zc is determined based on the focus evaluation values Ih[0] to Ih[4], the driving method of the image sensor 16 is returned from the partial readout to the whole thinning-out reading, and the writing to the SDRAM 34 of the YUV data based on the raw image signal output from the image sensor 16 is restarted. The display on the LCD 40 thereby returns from the frozen image to the through image.
[0039]
When the shutter button 44 is kept half-pressed, an automatic tracking process is performed so that the main subject moving within the object field is kept in focus. The automatic tracking operation is performed in the following manner.
[0040]
First, the luminance evaluation values Iy[0] to Iy[255] are fetched by the CPU 24 once every several frames, and the luminance differences ΔIy[0] to ΔIy[255] between the currently fetched luminance evaluation values Iy[0] to Iy[255] and the previously fetched luminance evaluation values (register values) Iy[0]' to Iy[255]' are calculated. The luminance change rates E[0] to E[255] of the individual blocks are then calculated from the register values Iy[0]' to Iy[255]' and the luminance differences ΔIy[0] to ΔIy[255], and thereafter the luminance change rates Ef[0] to Ef[4] of the individual focus areas are calculated.
[0041]
Whether or not the main subject has moved from the effective focus area Zc to another focus area is determined based on the calculated luminance change rates Ef[0] to Ef[4]. That is, since the brightness of a focus area changes when the subject moves, the movement of the main subject is determined by focusing on the luminance change rates Ef[j] of the focus areas other than the effective focus area Zc.
[0042]
Specifically, if the focus area 0 is the effective focus area Zc, the focus area having the largest luminance change rate Ef[j] among the invalid focus areas 1 to 4 is determined as the expected movement destination area Zt, and the focus area located on the opposite side of the expected movement destination area Zt across the focus area 0 is determined as the monitoring target area Zm. A threshold K to be compared with the luminance change rate Ef[j] of the expected movement destination area Zt is then determined in the following manner according to the attitude of the digital camera 10. The threshold L to be compared with the luminance change rate Ef[j] of the monitoring target area Zm is a fixed value.
[0043]
Referring to FIGS. 10A and 10B, when the digital camera 10 is in the upright state, the main subject Pr present on the focus area 0 is considered to move toward the focus area 1 or 2 rather than toward the focus area 3 or 4. Conversely, if the digital camera 10 is tilted by 90°, the main subject Pr present on the focus area 0 is considered to move toward the focus area 3 or 4 rather than toward the focus area 1 or 2. Therefore, when the digital camera 10 is in the upright state, the threshold K assigned to the focus area 1 or 2 is set lower than the threshold K assigned to the focus area 3 or 4, and when the digital camera 10 is tilted by 90°, the threshold K assigned to the focus area 3 or 4 is set lower than the threshold K assigned to the focus area 1 or 2.
[0044]
On the other hand, if any one of the focus areas 1 to 4 is the effective focus area Zc, the three invalid focus areas close to the effective focus area Zc are determined as candidates for the expected movement destination area Zt. That is, if the focus area 1 or 2 is the effective focus area Zc, the focus areas 0, 3, and 4 become candidates, and if the focus area 3 or 4 is the effective focus area Zc, the focus areas 0, 1, and 2 become candidates. The focus area having the largest luminance change rate Ef[j] among the three candidate focus areas is then determined as the expected movement destination area Zt.
[0045]
It should be noted that, as described above, the focus area located on the opposite side of the expected movement destination area Zt across the focus area 0 is determined as the monitoring target area Zm, and the threshold L to be compared with the luminance change rate Ef[j] of the monitoring target area Zm is a fixed value. However, the threshold K to be compared with the luminance change rate Ef[j] of the expected movement destination area Zt is determined in the following manner.
[0046]
Referring to FIGS. 11A and 11B, when the digital camera 10 is in the upright state, the main subject Pr present on the focus area 1 is considered to move to any of the focus areas 0, 3, and 4. When the digital camera 10 is tilted by 90°, the main subject Pr present on the focus area 3 is considered to move to any of the focus areas 0, 1, and 2.
[0047]
On the other hand, referring to FIGS. 12(A) and 12(B), when the digital camera 10 is in the upright state, the main subject Pr present on the focus area 4 is considered to move toward the focus area 1 or 2 rather than toward the focus area 0. When the digital camera 10 is tilted by 90°, the main subject Pr present on the focus area 1 is considered to move toward the focus area 3 or 4 rather than toward the focus area 0.
[0048]
For this reason, when the main subject is present on a focus area arranged on the right or left side of the object field, the lower threshold K is assigned to all three focus areas that are candidates for the expected movement destination area Zt. On the other hand, when the main subject is present on a focus area arranged above or below the object field, the higher threshold K is assigned to the focus area 0 among the three candidate focus areas, and the lower threshold K is assigned to the remaining two focus areas.
[0049]
In the movement determination, if the luminance change rate Ef[j] of the expected movement destination area Zt is equal to or larger than the threshold K and the luminance change rate Ef[j] of the monitoring target area Zm is smaller than the threshold L, it is determined that the main subject has moved. On the other hand, if the luminance change rate Ef[j] of the expected movement destination area Zt is less than the threshold K, or if the luminance change rates Ef[j] of the expected movement destination area Zt and the monitoring target area Zm are equal to or larger than the thresholds K and L respectively, it is determined that the main subject has not moved.
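A minimal sketch of this first movement determination follows, assuming the per-area thresholds K have already been chosen according to the camera posture; the function and argument names are illustrative, not taken from the patent.

```python
# First movement determination: Zt (expected destination) and Zm (monitoring
# target) act as movement-detection areas; k_table holds the per-area
# thresholds K, fixed_l is the fixed threshold L.
def first_movement_decision(ef, zt, zm, k_table, fixed_l):
    """ef: luminance change rates Ef[j]; zt, zm: area numbers."""
    if ef[zt] >= k_table[zt] and ef[zm] < fixed_l:
        return True                   # main subject judged to have moved toward Zt
    # Either the change at Zt is too small, or both Zt and Zm changed,
    # which is attributed to panning/tilting rather than subject movement.
    return False
```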
[0050]
Note that when the luminance change rate Ef[j] of the expected movement destination area Zt is equal to or larger than the threshold K and the luminance change rate Ef[j] of the monitoring target area Zm is equal to or larger than the threshold L, it is determined that the main subject has not moved, because such a change is considered to be caused by panning or tilting of the camera.
[0051]
However, when the moving speed of the main subject is slow, the luminance change rate Ef[j] of the expected movement destination area may not reach the threshold K even though the main subject is moving toward the expected movement destination area Zt. For this reason, when it is determined in the above-described movement determination that the main subject has not moved, whether or not the main subject has moved is further determined based on the integrated values (register values) S[0] to S[4] of the luminance change rates Ef[0] to Ef[4].
[0052]
Specifically, if any one of the focus areas 1 to 4 is the effective focus area Zc, the integrated value S[0] is set as the integrated value X. If the focus area 0 is the effective focus area Zc, one of the integrated values S[1] to S[4] is set as the integrated value X according to the attitude of the digital camera 10. That is, if the digital camera 10 is in the upright state, the larger of the integrated values S[1] and S[2] is set as the integrated value X, and if the digital camera 10 is tilted by 90°, the larger of the integrated values S[3] and S[4] is set as the integrated value X.
[0053]
Therefore, as shown in FIG. 10A, when the digital camera 10 is in the upright state and the main subject Pr is on the focus area 0, the larger of the integrated values S[1] and S[2] is set as the integrated value X. When the digital camera 10 is tilted and the main subject Pr is on the focus area 0 as shown in FIG. 10B, the larger of the integrated values S[3] and S[4] is set as the integrated value X. On the other hand, in the situations shown in FIGS. 11A, 11B, 12A, and 12B, the effective focus area Zc is one of the focus areas 1 to 4, and the integrated value S[0] is set as the integrated value X.
[0054]
If the integrated value X is less than the threshold L, it is determined that the main subject has not moved; if the integrated value X is equal to or greater than the threshold L, it is determined that the main subject has moved.
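The second (slow-movement) determination can be sketched as follows, with the integrated change rates S[j] assumed to have been accumulated elsewhere; the posture labels and function name are assumptions for the example.

```python
# Second movement determination: the accumulated change rate X of the area
# picked from the effective focus area and the posture is compared with L.
def second_movement_decision(s, zc, posture, fixed_l):
    """s: integrated change rates S[j]; zc: effective focus area number."""
    if zc != 0:
        x = s[0]                      # watch the center area
    elif posture == "upright":
        x = max(s[1], s[2])           # watch the left/right areas
    else:                             # 90-degree tilted state
        x = max(s[3], s[4])           # watch the top/bottom areas
    return x >= fixed_l               # True: the main subject has moved
```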
[0055]
When the determination result "moved" is obtained, the effective focus area Zc is changed to the expected movement destination area Zt, and the focus is adjusted on the changed effective focus area Zc. When the determination result "not moved" is obtained, the current effective focus area Zc is maintained.
[0056]
When performing the above-described automatic tracking process, the aperture mechanism 14 is opened by the aperture driver 20. This is for reducing the influence of light diffraction at the opening of the aperture mechanism 14 and improving the tracking accuracy.
[0057]
When the shutter button 44 shifts from the half-pressed state to the fully-pressed state, the optimal aperture amount As and the optimal exposure period Ts are set in the aperture mechanism 14 and the TG 18, respectively, and the recording process is executed. First, the CPU 24 instructs the TG 18 to perform main exposure of one frame and all-pixel reading, and instructs the signal processing circuit 30 to perform compression processing. The TG 18 performs the main exposure according to the optimal exposure period Ts and reads out all the electric charges generated by the main exposure, that is, one frame of a high-resolution raw image signal, from the image sensor 16.
[0058]
The read raw image signal is input to the signal processing circuit 30 via the CDS / AGC circuit 26 and the A / D converter 28, and is converted into YUV data by the above-described series of processing. The converted YUV data is written to the SDRAM 34 by the memory controller 32. Thereafter, the signal processing circuit 30 reads the YUV data from the SDRAM 34 through the memory controller 32, applies JPEG compression to the read YUV data, and records the compressed image data generated by the JPEG compression on the memory card 36 in a file format.
[0059]
An example of the automatic tracking operation will be described with reference to FIGS. 13A to 13C. Assume that, from the state shown in FIG. 13A in which the main subject Pr is present in the focus area 1 and the effective focus area Zc is set to the focus area 1, the main subject Pr moves across the object field to the states shown in FIGS. 13B and 13C. Assume also that a secondary subject Se, which moves across the object field toward the digital camera 10, does not exist in the object field at the time of FIG. 13A, enters the object field at the time of FIG. 13B, and leaves the object field at the time of FIG. 13C.
[0060]
At the time of FIG. 13A, the focus area 0 is set as the expected movement destination area Zt, and the focus area 2 is set as the monitoring target area Zm. Whether or not the main subject Pr has moved to the focus area 0 is then determined based on the luminance change rates Ef[0] and Ef[2] and the integrated value of the luminance change rate Ef[0]. At the time of FIG. 13B, the focus area 2 is set as the expected movement destination area Zt, and the focus area 1 is set as the monitoring target area Zm. Whether or not the main subject Pr has moved to the focus area 2 is then determined based on the luminance change rates Ef[2] and Ef[1] and the integrated value of the luminance change rate Ef[2].
[0061]
As described above, the expected movement destination area Zt and the monitoring target area Zm serve as movement detection areas for detecting whether or not the main subject Pr has moved, and whether or not the main subject Pr has moved is determined based on the luminance change rates Ef[j] of the expected movement destination area Zt and the monitoring target area Zm. Therefore, the determination of the presence or absence of movement is not disturbed by the secondary subject Se.
[0062]
When the shutter button 44 is half-pressed in a state where the automatic tracking function is activated by operating the mode key 50, the CPU 24 executes processing according to the flowcharts shown in FIGS. The control programs corresponding to these flowcharts are stored in the flash memory 24a.
[0063]
Referring to FIG. 14, first, in a step S1, a posture detection process using the tilt sensor 46 is performed, and it is thereby determined whether the digital camera 10 is in the 90° rightward tilted state, the 90° leftward tilted state, or the upright state. The posture coefficient SLP is set to "1" when the camera is determined to be tilted 90° to the right, to "2" when it is determined to be tilted 90° to the left, and to "3" when it is determined to be in the upright state.
[0064]
In a succeeding step S3, an AE/AF control process is executed. The optimal exposure period Ts and the optimal aperture amount As are obtained by the AE control process. The AF control process determines whether or not the object scene is one in which focus adjustment can be performed; if the determination result is affirmative, the effective focus area Zc is determined and the subject within the determined effective focus area Zc is brought into focus. If focus adjustment is possible, a determination result of "Good" is obtained, and if focus adjustment is impossible, a determination result of "NG" is obtained.
[0065]
In a step S5, the determination result obtained by the AF control process is checked, and in a step S7, it is determined whether or not the shutter button 44 is half-pressed. If the determination result is "NG", or if the determination result is "Good" but the shutter button 44 is not half-pressed, the optimal exposure period Ts and the optimal aperture amount As for the main exposure are set in the TG 18 and the aperture mechanism 14 in a step S25, and the process then ends. Therefore, when the shutter button 44 shifts from the half-pressed state to the fully-pressed state, the recording process is executed irrespective of whether or not the subject is in focus.
[0066]
If the result of the determination is “Good” and the half-pressing state of the shutter button 44 is continued, the processing after step S9 is executed in order to automatically track the main subject.
[0067]
First, in step S9, variables p, q, r and register values S [0] to S [4] are set to "0". Here, the variable p is a variable for counting the number of frames required until the Y data is stabilized after the shutter button 44 is half-pressed. The variable q is a variable for counting the period of taking in the luminance evaluation values Iy [0] to Iy [255]. The variable r is a variable for counting the number of integrated frames of the luminance change rates Ef [0] to Ef [4]. Register values S [0] to S [4] are integrated values of luminance change rates Ef [0] to Ef [4], respectively.
[0068]
In step S11, it is determined whether or not the aperture mechanism 14 is open. When the determination is YES, the process directly proceeds to step S19, but when the determination is NO, the process proceeds to step S19 via the processes of steps S13 to S17.
[0069]
In step S13, the optimal aperture amount A (= As) obtained in the AE control process is retracted as a register value As', and in step S15, the aperture mechanism 14 is opened. In step S17, the AF exposure period Taf1 is calculated, and the calculated AF exposure period Taf1 is set to TG18.
[0070]
In a step S19, it is determined whether or not the vertical synchronization signal Vsync has been generated. If YES, it is determined in a step S21 whether or not the shutter button 44 is still half-pressed. If the half-pressed state has been released, the process ends after the main exposure setting in the step S25.
[0071]
If the half-pressed state of the shutter button 44 is maintained, the luminance evaluation values Iy[0] to Iy[255] are fetched from the AE/AF evaluation circuit 36 in a step S23, and it is determined in a step S27 whether or not the variable p has reached the upper limit P (= 3). If the variable p has not yet reached the upper limit P, the variable p is incremented in a step S29, the current luminance evaluation values Iy[0] to Iy[255] are saved as the register values Iy[0]' to Iy[255]' in a step S31, and the process returns to the step S19.
[0072]
When the variable p has reached the upper limit P, the process proceeds from the step S27 to a step S33 to determine whether or not the variable q has reached the upper limit Q (= 3). If the variable q has not yet reached the upper limit Q, the variable q is incremented in a step S35, and the process returns to the step S19. When the variable q has reached the upper limit Q, the variable q is returned to "0" in a step S37, and a luminance change detection process is executed in a step S39. Through the luminance change detection process, the luminance change rates Ef[0] to Ef[4] of the focus areas 0 to 4 are obtained.
[0073]
In a step S41, a first movement determination process is performed. The first movement determination process determines the expected movement destination area Zt and the monitoring target area Zm, and determines whether or not the main subject has moved based on the luminance change rates Ef[j] of the expected movement destination area Zt and the monitoring target area Zm. In a step S43, it is determined whether the result of the first movement determination process is "moved" or "not moved".
[0074]
If NO is determined here, the second movement determination processing is executed in step S45. In the second movement determination process, the luminance change rates Ef [0] to Ef [4] are integrated, and the presence or absence of movement of the main subject is determined based on the integrated values S [0] to S [4]. In step S47, it is determined whether the determination result of the second movement determination processing is “with movement” or “without movement”. If NO is determined here, it is determined that the main subject has not moved, and the process returns to step S19.
[0075]
On the other hand, if YES is determined in either of the steps S43 and S47, the AF restart process is executed in a step S49. By the AF restart process, the effective focus area Zc is updated, and the focus is adjusted in the updated effective focus area Zc. In a step S51, the register values S[0] to S[4] are returned to "0", and in a step S53, the current luminance evaluation values Iy[i] are saved as the register values Iy[i]'. Upon completion of the saving process, the process returns to the step S19.
[0076]
The posture detection process in step S1 shown in FIG. 14 follows a subroutine shown in FIG. First, in steps S61 and S63, the attitude of the digital camera 10 is determined based on the output of the tilt sensor 46. If the output of the inclination sensor 46 indicates the “erect state”, YES is determined in the step S61, and the attitude coefficient SLP is set to “3” in a step S65. If the output of the inclination sensor 46 indicates “90 ° inclination to the right”, YES is determined in the step S63, and the attitude coefficient SLP is set to “1” in a step S67. If the output of the tilt sensor 46 indicates “90 ° tilt to the left”, NO is determined in the step S63, and the attitude coefficient SLP is set to “2” in a step S69.
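For illustration, the mapping performed in the steps S61 to S69 can be written as follows; the sensor output labels are assumed placeholders for whatever the tilt sensor 46 actually reports.

```python
# Posture detection of steps S61-S69: map the tilt sensor output to the
# posture coefficient SLP.
def detect_posture(tilt_sensor_output):
    if tilt_sensor_output == "upright":
        return 3     # SLP = 3: erect state
    if tilt_sensor_output == "right90":
        return 1     # SLP = 1: tilted 90 degrees to the right
    return 2         # SLP = 2: tilted 90 degrees to the left
```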
[0077]
The AE/AF control process in the step S3 shown in FIG. 14 follows a subroutine shown in FIGS. First, in a step S71, the 256 luminance evaluation values Iy[0] to Iy[255] are fetched from the AE/AF evaluation circuit 42. In a step S73, the optimal exposure period Ts is calculated based on the fetched luminance evaluation values Iy[0] to Iy[255], and in a step S75, the calculated optimal exposure period Ts is saved as a register value Ts'. In a step S77, the optimal aperture amount As is calculated based on the fetched luminance evaluation values Iy[0] to Iy[255], and the aperture driver 20 is controlled to set the optimal aperture amount As in the aperture mechanism 14. The calculation of the optimal exposure period Ts and the optimal aperture amount As is performed in accordance with the currently valid photometric method (multi-segment photometry, center-weighted photometry, spot photometry, etc.).
[0078]
Upon completion of the setting, a first freeze process is performed in a step S79. As a result, the driving method of the image sensor 16 is determined to be either the whole thinning-out reading or the partial readout: if the object field is bright, the whole thinning-out reading is maintained, and if the object field is dark, the partial readout is selected. When the partial readout is selected, the memory controller 32 is instructed to stop writing data.
[0079]
In a step S81, a threshold setting process is performed. As a result, the thresholds C[0] to C[4] are individually assigned to the focus areas 0 to 4. The specific numerical values of the thresholds C[0] to C[4] are determined based on the scene selected by the scene selection key 48 and the attitude of the digital camera 10.
[0080]
In a step S83, the focus driver 22 is driven to set the focus lens 12 to the infinity position, and in a step S85, the focus position information fpos is set to "0" corresponding to the infinity position. Subsequently, the register values Ih[0]max to Ih[4]max are set to "0" in a step S87, and it is determined whether or not the vertical synchronization signal Vsync has been generated in a step S89.
[0081]
When the vertical synchronization signal Vsync is generated, the variable j is set to "0" in a step S91, and the focus evaluation value Ih[j] is fetched from the AE/AF evaluation circuit 42 in a step S93. In a step S95, the fetched focus evaluation value Ih[j] is compared with the register value Ih[j]max. If Ih[j] < Ih[j]max, the process proceeds directly to a step S101. If Ih[j] ≧ Ih[j]max, the focus evaluation value Ih[j] is set as the register value Ih[j]max in a step S97, the current focus position information fpos is saved as the register value f[j] in a step S99, and the process then proceeds to the step S101.
[0082]
In the step S101, it is determined whether or not the variable j has reached "4". If the variable j has not reached "4", the variable j is incremented in a step S103, and the process returns to the step S93. When the variable j has reached "4", the focus position information fpos is compared with a predetermined value NEAR corresponding to the nearest position in a step S105. If fpos < NEAR, the focus lens 12 is considered not to have reached the nearest position, so the focus lens 12 is moved toward the near side by one step in a step S107, and the focus position information fpos is incremented in a step S109. Upon completion of the process in the step S109, the process returns to the step S89.
[0083]
By repeating the processing of steps S89 to S109 as described above, the register values f [0] to f [4] indicate the in-focus positions of the focus lens 12 in the focus areas 0 to 4, respectively.
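The scan of the steps S83 to S109 amounts to the hill-climbing loop sketched below; move_lens() and read_focus_values() are hypothetical stand-ins for the focus driver 22 and the AE/AF evaluation circuit, and the step count is an assumed parameter.

```python
# Hill-climbing scan: step the lens from infinity toward the nearest
# position, remembering for each area the largest focus evaluation value
# and the lens position at which it occurred.
def hill_climb_scan(move_lens, read_focus_values, near_steps):
    ih_max = {j: 0 for j in range(5)}
    f = {j: 0 for j in range(5)}
    for fpos in range(near_steps + 1):      # fpos = 0 corresponds to infinity
        move_lens(fpos)
        ih = read_focus_values()            # Ih[0]..Ih[4] at this lens step
        for j in range(5):
            if ih[j] >= ih_max[j]:
                ih_max[j] = ih[j]
                f[j] = fpos                 # candidate in-focus position
    return ih_max, f
```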
[0084]
When the focus lens 12 reaches the close position, YES is determined in the step S105, and a second freeze process is executed in a step S111. By the second freeze process, the driving method of the image sensor 16 is returned to the entire thinning-out reading.
[0085]
In a step S113, the register values Ih[0]max to Ih[4]max are compared with the thresholds C[0] to C[4], respectively, to detect focus areas satisfying the condition Ih[j]max ≧ C[j]. In a step S115, it is determined whether or not a focus area satisfying the condition has been detected, and the process proceeds to either a step S117 or a step S127 according to the determination result.
[0086]
If a focus area satisfying the condition is detected, the optimal register value f[j] is specified in the step S117. Specifically, among the register values f[j] corresponding to the focus areas satisfying the condition, the one with the largest numerical value, that is, the one corresponding to the closest in-focus position, is specified. In a step S119, the focus area corresponding to the optimal register value f[j] is determined as the effective focus area Zc, and a frame indicating the effective focus area Zc is displayed on the LCD 40. In a step S121, the optimal register value f[j] is set as the focus position information fpos; in a step S123, the focus lens 12 is moved to the position indicated by the focus position information fpos; and in a step S125, the determination result "Good" is validated.
[0087]
On the other hand, if no focus area that satisfies the condition is detected, a fixed value is set in the focus position information fpos in step S127. In step S129, the focus lens 12 is moved to the position indicated by the focus position information fpos, and in step S131, the determination result “NG” is validated. Upon completion of the process in the step S125 or S131, the process returns to the routine in the upper hierarchy.
[0088]
The first freeze process follows a subroutine shown in FIG. First, in a step S141, a luminance level y_level is calculated based on the luminance evaluation values Iy[0] to Iy[255] acquired in the step S71; the calculation method follows the currently valid photometric method. In a step S143, a luminance level y_level_max corresponding to the longest settable exposure period Tmax is calculated based on the optimal exposure period Ts calculated in the step S73, the longest settable exposure period Tmax, and the luminance level y_level calculated in the step S141. Specifically, the calculation of Equation 1 is performed.
[0089]
(Equation 1)
y_level_max = y_level × Tmax / Ts
In a step S145, the calculated luminance level y_level_max is compared with a target luminance level y_target. If y_level_max > y_target, the darkness ratio night_ratio is set to "0" in a step S147, and if y_level_max ≦ y_target, the darkness ratio night_ratio is calculated according to Equation 2 in a step S149.
[0090]
(Equation 2)
night_ratio = 100 × (1 − y_level_max / y_target)
When the process of the step S147 or S149 is completed, the darkness ratio night_ratio is compared with a threshold B in a step S151. If night_ratio ≦ B, the object field is considered to be sufficiently bright, and the process proceeds to a step S163. In the step S163, the AF exposure period Taf1 suited to the current driving method, the whole thinning-out reading, is calculated, and the calculated exposure period Taf1 is set in the TG 18. When the setting is completed, the process returns to the routine in the upper hierarchy.
[0091]
On the other hand, if night_ratio > B, the brightness of the object field is considered to be insufficient, and the generation of the vertical synchronization signal Vsync is awaited in a step S153. When the vertical synchronization signal Vsync is generated, the memory controller 32 is instructed to stop writing data in a step S155; the memory controller 32 then performs only data reading, and the display on the LCD 40 transitions from the through image to a frozen image (still image). In a step S157, the TG 18 is instructed to perform the partial readout, and in a step S159, the generation of the vertical synchronization signal Vsync is awaited. When the vertical synchronization signal Vsync is generated, the process proceeds to a step S161 to calculate the AF exposure period Taf2 suited to the partial readout method and to set the calculated exposure period Taf2 in the TG 18. When the setting is completed, the process returns to the routine in the upper hierarchy.
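Reading the comparison as described above (a small night_ratio means a bright object field), the first freeze process reduces to the following sketch; the function name and the returned labels are illustrative.

```python
# First freeze process: derive the darkness ratio and choose the sensor
# driving method for the AF scan.
def choose_drive_method(y_level, ts, t_max, y_target, b):
    y_level_max = y_level * t_max / ts                        # Equation 1
    if y_level_max > y_target:
        night_ratio = 0.0
    else:
        night_ratio = 100.0 * (1.0 - y_level_max / y_target)  # Equation 2
    if night_ratio <= b:
        return "whole_thinning_readout"    # bright: 30 fps, whole field
    return "partial_readout"               # dark: 60 fps, central area only
```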
[0092]
The threshold setting process in the step S81 shown in FIG. 17 follows a subroutine shown in FIG. First, in a step S171, the selection state of the scene selection key 48 is detected, and in steps S173, S177, and S179, it is determined which scene is selected. If the default scene is selected, the process proceeds from the step S173 to a step S175, where the threshold THa is set as the threshold C[0] and the threshold THb is set as the thresholds C[1] to C[4]. Upon completion of the process in the step S175, the process returns to the routine in the upper hierarchy.
[0093]
If a portrait scene has been selected, “YES” is determined in the step S177, and a first threshold value determining process is executed in a step S181. If a sports scene has been selected, YES is determined in the step S179, and a second threshold value determining process is executed in a step S183. If a landscape scene, an evening scene, or a night scene is selected, NO is determined in the step S179, and a third threshold value determining process is executed in a step S185. Upon completion of the process in the step S181, S183, or S185, the process returns to a routine in a higher hierarchy.
[0094]
The first threshold determination process in the step S181 follows a subroutine shown in FIG. First, the posture coefficient SLP is determined in steps S201 and S205. If the posture coefficient SLP is "1", the process proceeds from the step S201 to a step S203, where the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[1], C[3], and C[4], and the threshold THd is set as the threshold C[2]. If the posture coefficient SLP is "2", the process proceeds from the step S205 to a step S207, where the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[2], C[3], and C[4], and the threshold THd is set as the threshold C[1].
[0095]
If the posture coefficient SLP is "3", the process proceeds from the step S205 to a step S209, where the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[1], C[2], and C[4], and the threshold THd is set as the threshold C[3]. Upon completion of the process in the step S203, S207, or S209, the process returns to the routine in the upper hierarchy.
[0096]
The second threshold determination process in the step S183 follows a subroutine shown in FIG. First, the posture coefficient SLP is determined in steps S211 and S215. If the posture coefficient SLP is "1", the process proceeds from the step S211 to a step S213, where the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[1], C[3], and C[4], and the threshold THc is set as the threshold C[2]. If the posture coefficient SLP is "2", the process proceeds from the step S215 to a step S217, where the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[2], C[3], and C[4], and the threshold THc is set as the threshold C[1].
[0097]
If the posture coefficient SLP is "3", the process proceeds from the step S215 to a step S219, where the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[1], C[2], and C[4], and the threshold THc is set as the threshold C[3]. Upon completion of the process in the step S213, S217, or S219, the process returns to the routine in the upper hierarchy.
[0098]
The third threshold determination process in the step S185 follows a subroutine shown in FIG. First, the posture coefficient SLP is determined in steps S221 and S225. If the posture coefficient SLP is "1", the process proceeds from the step S221 to a step S223, where the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[3] and C[4], the threshold THc is set as the threshold C[1], and the threshold THd is set as the threshold C[2]. If the posture coefficient SLP is "2", the process proceeds from the step S225 to a step S227, where the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[3] and C[4], the threshold THd is set as the threshold C[1], and the threshold THc is set as the threshold C[2].
[0099]
If the posture coefficient SLP is "3", the process proceeds from the step S225 to a step S229, where the threshold THa is set as the threshold C[0], the threshold THb is set as the thresholds C[1] and C[2], the threshold THd is set as the threshold C[3], and the threshold THc is set as the threshold C[4]. Upon completion of the process in the step S223, S227, or S229, the process returns to the routine in the upper hierarchy.
[0100]
The second freeze process in step S111 shown in FIG. 19 follows a subroutine shown in FIG. First, in step S231, it is determined whether or not the memory controller 32 has stopped writing data. If NO is determined here, the process directly returns to the upper-level routine.
[0101]
On the other hand, if YES is determined, the TG 18 is instructed in a step S233 to perform the whole thinning-out reading. When the vertical synchronization signal Vsync is generated, the process proceeds from a step S235 to a step S237 to calculate the AF exposure period Taf1 suited to the whole thinning-out reading and to set the calculated exposure period Taf1 in the TG 18. When the vertical synchronization signal Vsync is generated again, the process proceeds from a step S239 to a step S241 and instructs the memory controller 32 to restart data writing. When the data writing is restarted, the display on the LCD 40 transitions from the frozen image to the through image. Upon completion of the process in the step S241, the process returns to the routine in the upper hierarchy.
[0102]
The luminance change detection process in the step S39 shown in FIG. 15 follows a subroutine shown in FIG. First, in a step S251, the luminance evaluation values Iy[0] to Iy[255] captured in the step S23 in FIG. 14 and the register values Iy[0]' to Iy[255]' previously saved in the step S31 in FIG. 14 are substituted into Equation 3 to obtain the luminance differences ΔIy[0] to ΔIy[255] of the individual blocks.
[0103]
[Equation 3]
ΔIy [i] = | Iy [i] −Iy [i] ′ |
where i = 0 to 255
In a step S253, the luminance differences ΔIy[0] to ΔIy[255] calculated according to Equation 3 and the register values Iy[0]' to Iy[255]' are substituted into Equation 4 to calculate the luminance change rates E[0] to E[255] of the individual blocks.
[0104]
(Equation 4)
E [i] = (ΔIy [i] / Iy [i] ′) × 100
where i = 0 to 255
In a step S255, the luminance change rates E[i] of the blocks belonging to the focus area 0 are averaged to obtain the luminance change rate Ef[0]; similarly, the luminance change rates E[i] of the blocks belonging to the focus areas 1, 2, 3, and 4 are averaged to obtain the luminance change rates Ef[1], Ef[2], Ef[3], and Ef[4], respectively. Upon completion of the process in the step S255, the process returns to the routine in the upper hierarchy.
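Equations 3 and 4 together with the averaging of the step S255 can be sketched as follows; the function name is illustrative, focus_areas maps area numbers to block numbers as in the earlier FOCUS_AREAS sketch, and the previously saved values are assumed to be nonzero.

```python
# Luminance change detection: per-block change rates (Equations 3 and 4),
# then averaged over the blocks of each focus area (step S255).
def luminance_change_rates(iy, iy_prev, focus_areas):
    """iy, iy_prev: current and saved Iy[0..255]; returns Ef[j] per area."""
    e = [abs(iy[i] - iy_prev[i]) / iy_prev[i] * 100.0 for i in range(256)]
    return {j: sum(e[b] for b in blocks) / len(blocks)
            for j, blocks in focus_areas.items()}
```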
[0105]
The first movement determination process in the step S41 shown in FIG. 15 follows a subroutine shown in FIGS. First, in a step S261, it is determined whether or not the effective focus area Zc is set to the focus area 0. If YES, the process proceeds to a step S263, and the focus area having the largest luminance change rate Ef[j] among the invalid focus areas 1 to 4 is determined as the expected movement destination area Zt. In a succeeding step S265, the posture coefficient SLP is determined.
[0106]
If the posture coefficient SLP is "3", the digital camera 10 is considered to be in the upright state, and the threshold K is determined in a step S267: the threshold K of the focus area 1 or 2 is set to a predetermined value k, and the threshold K of the focus area 3 or 4 is set to twice the predetermined value k. On the other hand, if the posture coefficient SLP is "1" or "2", the digital camera 10 is considered to be in a 90° tilted state, and the threshold K is determined in a step S269: the threshold K of the focus area 1 or 2 is set to twice the predetermined value k, and the threshold K of the focus area 3 or 4 is set to the predetermined value k.
[0107]
Upon completion of the process in the step S267 or S269, the process proceeds to a step S271 to compare the luminance change rate Ef[j] of the expected movement destination area Zt with the threshold K. If Ef[j] < K, it is determined in a step S279 that the main subject has not moved, and the process returns to the routine in the upper hierarchy. On the other hand, if Ef[j] ≧ K, the monitoring target area Zm is determined in a step S273. Specifically, when the focus area 1 is the expected movement destination area Zt, the focus area 2 is set as the monitoring target area Zm; when the focus area 2 is the expected movement destination area Zt, the focus area 1 is set as the monitoring target area Zm; when the focus area 3 is the expected movement destination area Zt, the focus area 4 is set as the monitoring target area Zm; and when the focus area 4 is the expected movement destination area Zt, the focus area 3 is set as the monitoring target area Zm.
[0108]
In step S275, the luminance change rate Ef [j] in the determined monitoring target area Zm is compared with the threshold L. If Ef [j] <L, it is determined in step S277 that the main subject is “moving”. On the other hand, if Ef [j] ≧ L, it is determined in step S279 that the main subject is “no movement”. Upon completion of the process in the step S277 or S279, the process returns to a routine in a higher hierarchy.
[0109]
If NO is determined in the step S261, candidates for the expected movement destination area Zt are determined in a step S281. Specifically, if the focus area 1 or 2 is the effective focus area Zc, the invalid focus areas 0, 3, and 4 are set as candidates for the expected movement destination area Zt; if the focus area 3 or 4 is the effective focus area Zc, the invalid focus areas 0, 1, and 2 are set as candidates. In a step S283, the focus area having the largest luminance change rate Ef[j] among the determined candidates is determined as the expected movement destination area Zt.
[0110]
In step S285, the tilt coefficient SLP is determined. If the tilt coefficient SLP is "3", the process proceeds to step S287, and if it is "1" or "2", the process proceeds to step S289. In step S287, the setting destination of the effective focus area Zc is determined. If the effective focus area Zc is set to the focus area 1 or 2, the threshold values K of the candidates for the expected movement destination area Zt are determined in step S291; if the effective focus area Zc is set to the focus area 3 or 4, the threshold values K of the candidates for the expected movement destination area Zt are determined in step S293.
[0111]
Also in step S289, the setting destination of the effective focus area Zc is determined in the same manner as in step S287. If the effective focus area Zc is set to the focus area 1 or 2, the threshold values K of the candidates for the expected movement destination area Zt are determined in step S295; if the effective focus area Zc is set to the focus area 3 or 4, the threshold values K of the candidates for the expected movement destination area Zt are determined in step S297.
[0112]
In step S291, the threshold value K of each of the focus areas 0, 3 and 4 is set to a predetermined value k. In step S293, the threshold value K of the focus area 0 is set to twice the predetermined value k, and the threshold value K of the focus area 1 or 2 is set to the predetermined value k. In step S295, the threshold value K of the focus area 0 is set to twice the predetermined value k, and the threshold value K of the focus area 3 or 4 is set to the predetermined value k. In step S297, the threshold value K of each of the focus areas 0, 1 and 2 is set to the predetermined value k.
[0113]
When the processing of steps S291 to S297 is completed, the processing of steps S299 to S307 is executed. However, since these processings are the same as the above-described steps S271 to S279, duplicate description will be omitted.
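For the branch in which the effective focus area Zc is one of the focus areas 1 to 4 (steps S281 to S297), the candidate areas and their thresholds K can be summarized by the following Python sketch. The function name is hypothetical; the subsequent comparison (steps S299 to S307) then proceeds exactly as in the sketch given above for steps S271 to S279.
(Python sketch)
def destination_thresholds(Zc, SLP, k):
    # Candidates for the expected movement destination area Zt (S281)
    candidates = (0, 3, 4) if Zc in (1, 2) else (0, 1, 2)
    # Thresholds K of the candidates (S287-S297)
    if SLP == 3:                              # upright
        K = {0: k, 3: k, 4: k} if Zc in (1, 2) else {0: 2 * k, 1: k, 2: k}
    else:                                     # 90-degree tilt
        K = {0: 2 * k, 3: k, 4: k} if Zc in (1, 2) else {0: k, 1: k, 2: k}
    return candidates, K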
[0114]
The second movement determination processing in step S45 shown in FIG. 15 follows a subroutine shown in FIGS. First, a change rate integration process is performed in step S311. By the change rate integration process, the luminance change rates Ef[j] of the focus areas specified according to the effective focus area Zc and the attitude of the digital camera 10 among the focus areas 0 to 4 are integrated. The integration results of the luminance change rates Ef[0] to Ef[4] are held as register values S[0] to S[4]. Further, the variable r is incremented by the change rate integration process.
[0115]
In step S313, it is determined whether or not the variable r has reached the upper limit value R. If the variable r has not reached the upper limit value R, the determination result of "no movement" is validated in step S339, and then the process returns to the routine in the upper hierarchy.
[0116]
When the variable r has reached the upper limit value R, the variable r is returned to "0" in step S315, and it is determined in step S317 whether or not the focus area 0 is the effective focus area Zc. If any one of the focus areas 1 to 4 is the effective focus area Zc, the register value S[0] is set as the integrated value X in step S319. On the other hand, if the focus area 0 is the effective focus area Zc, the process advances to step S321 to determine the attitude of the digital camera 10 based on the tilt coefficient SLP.
[0117]
If the tilt coefficient SLP is "3", the digital camera 10 is considered to be in the upright state, and the register values S[1] and S[2] are compared with each other in step S329. If S[1] ≧ S[2], the register value S[1] is set as the integrated value X in step S331; if S[1] < S[2], the register value S[2] is set as the integrated value X.
[0118]
On the other hand, if the tilt coefficient SLP is "1" or "2", the digital camera 10 is considered to be in the 90° tilt state, and the register values S[3] and S[4] are compared with each other. If S[3] ≧ S[4], the register value S[3] is set as the integrated value X in step S325; if S[3] < S[4], the register value S[4] is set as the integrated value X in step S327.
[0119]
When the integrated value X is determined, the integrated value X is compared with a threshold value M in step S335. If the integrated value X is equal to or larger than the threshold value M, the determination result of "moving" is validated in step S337, and the process returns to the routine in the upper hierarchy. Otherwise, the determination result of "no movement" is validated, and the process then returns to the routine in the upper hierarchy.
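The decision made in steps S313 to S339 can be sketched in Python as follows. This is an illustration only; the function name and the returned tuple (result, reset counter) are conventions chosen for the sketch, not part of the embodiment.
(Python sketch)
def second_movement_check(S, r, R, Zc, SLP, M):
    # S[0]-S[4]: integrated luminance change rates, r: integration counter
    if r < R:                       # S313
        return "no movement", r     # S339
    r = 0                           # S315
    if Zc != 0:                     # S317 -> S319
        X = S[0]
    elif SLP == 3:                  # upright: larger of S[1], S[2]
        X = max(S[1], S[2])
    else:                           # 90-degree tilt: larger of S[3], S[4]
        X = max(S[3], S[4])
    return ("moving", r) if X >= M else ("no movement", r)   # S335-S339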
[0120]
The change rate integration process in step S311 follows a subroutine shown in FIGS. First, in step S341, the setting destination of the effective focus area Zc is determined. If the focus area 0 is the effective focus area Zc, the tilt coefficient SLP is determined in step S343. When the tilt coefficient SLP is "3", the process proceeds to step S345, and the luminance change rate Ef[1] of the focus area 1 and the luminance change rate Ef[2] of the focus area 2 are individually integrated. Specifically, an operation according to Equation 5 is performed.
[0121]
(Equation 5)
S [1] = S [1] + Ef [1]
S [2] = S [2] + Ef [2]
If the tilt coefficient SLP is "1" or "2", the process proceeds to step S347, and the luminance change rate Ef[3] of the focus area 3 and the luminance change rate Ef[4] of the focus area 4 are individually integrated. Specifically, an operation according to Equation 6 is performed.
[0122]
(Equation 6)
S [3] = S [3] + Ef [3]
S [4] = S [4] + Ef [4]
When the calculation of Equation 5 or Equation 6 is completed, the variable r is incremented in step S349, and the process returns to the routine in the upper hierarchy.
[0123]
If any one of the focus areas 1 to 4 is the effective focus area Zc, the tilt coefficient SLP is determined in a step S351. If the tilt coefficient SLP is "3", the setting destination of the effective focus area Zc is determined in step S355. If the focus area 3 or 4 is the effective focus area Zc, the process directly proceeds to step S349. If the focus area 1 or 2 is the effective focus area Zc, the luminance change rate Ef[0] of the focus area 0 is integrated according to Equation 7 in step S357, and then the process proceeds to step S349.
[0124]
(Equation 7)
S [0] = S [0] + Ef [0]
When the tilt coefficient SLP is "1" or "2", the setting destination of the effective focus area Zc is determined in step S353. If the focus area 1 or 2 is the effective focus area Zc, the process directly proceeds to step S349. If the focus area 3 or 4 is the effective focus area Zc, the process proceeds to step S349 via step S357.
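The change rate integration of steps S341 to S349, including Equations 5 to 7, may be summarized by the Python sketch below; it is not part of the embodiment, the function name is hypothetical, and the in-place update of the list S mirrors the register values S[0] to S[4].
(Python sketch)
def integrate_change_rates(S, Ef, Zc, SLP, r):
    if Zc == 0:
        if SLP == 3:            # upright: Equation 5
            S[1] += Ef[1]
            S[2] += Ef[2]
        else:                   # 90-degree tilt: Equation 6
            S[3] += Ef[3]
            S[4] += Ef[4]
    elif (SLP == 3 and Zc in (1, 2)) or (SLP != 3 and Zc in (3, 4)):
        S[0] += Ef[0]           # Equation 7 (S357)
    return r + 1                # S349: increment the counter in every case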
[0125]
The AF restart processing in step S49 shown in FIG. 15 follows a subroutine shown in FIG. First, in step S361, the effective focus area Zc is changed to the current expected movement destination area Zt, and a frame indicating the changed effective focus area Zc is displayed on the LCD 40. In step S363, the focus is adjusted in the changed effective focus area Zc. When the adjustment is completed, the variable p is returned to "0" in step S365, and the process returns to the routine in the upper hierarchy. In the focus adjustment in step S363, the image sensor 16 performs thinning-out reading of the entire light-receiving surface.
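The AF restart of steps S361 to S365 amounts to the short sequence sketched below in Python; the camera object and its method names are hypothetical placeholders for the corresponding operations of the embodiment.
(Python sketch)
def af_restart(camera):
    camera.Zc = camera.Zt               # S361: adopt the expected destination area
    camera.draw_focus_frame(camera.Zc)  # show the new area frame on the LCD
    camera.adjust_focus(camera.Zc)      # S363: refocus (thinned-out full readout)
    camera.p = 0                        # S365: reset the variable p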
[0126]
As can be understood from the above description, when the effective focus area Zc used for the focus adjustment is first determined from among the focus areas 0 to 4 arranged in the object scene, the maximum focus evaluation values Ih[0]max to Ih[4]max obtained in the focus areas 0 to 4 are compared with the thresholds C[0] to C[4] assigned to the focus areas 0 to 4, respectively. Here, the allocation destination of the thresholds THa, THb, THc and THd set as the thresholds C[0] to C[4] is changed by the CPU 24 based on the detection result of the tilt sensor 46, that is, the camera attitude. In other words, the focus areas 0 to 4 are radially arranged with reference to the center of the object field, and the allocation destination of the thresholds THa, THb, THc and THd is rotated according to the camera attitude.
[0127]
As described above, by changing the assignment destination of the thresholds THa, THb, THc and THd based on the camera attitude, the focus can be accurately adjusted regardless of the camera attitude. Further, three of the focus areas 0 to 4 are arranged in the vertical direction and three in the horizontal direction. The tilt sensor 46 determines whether the camera attitude is the upright state or the 90° tilt state, and the CPU 24 rotates the assignment destination of the thresholds THa, THb, THc and THd by 90°. Therefore, even if the tilt sensor 46 is a low-performance sensor that can only distinguish the upright state from the 90° tilt state, accurate allocation of the thresholds THa, THb, THc and THd is possible.
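The rotation of the threshold assignment can be pictured with the following Python sketch. It assumes, purely for illustration, that the focus areas 1 and 2 form the horizontal arm and the focus areas 3 and 4 the vertical arm of the radial arrangement in the upright posture; the pairing used for the 90° rotation is likewise an assumption, not taken from the embodiment.
(Python sketch)
def rotate_threshold_assignment(C_upright, SLP):
    # C_upright: threshold (THa-THd) assigned to each focus area for the upright posture
    if SLP == 3:                # upright: keep the assignment as it is
        return dict(C_upright)
    # 90-degree tilt: exchange the assignments of the two arms; area 0 keeps its threshold
    return {0: C_upright[0],
            1: C_upright[3], 2: C_upright[4],
            3: C_upright[1], 4: C_upright[2]}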
[0128]
The electric charge corresponding to the optical image of the object scene is periodically read from the entire light receiving surface of the image sensor 16. The YUV data based on the charge read from the image sensor 16 is written in the SDRAM 34. The LCD 40 displays an image based on the YUV data stored in the SDRAM 34. However, when the shutter button 44 is operated, the brightness of the object scene is determined. If the brightness is sufficient, the CPU 24 enables a specific setting state in which charges are read from the central area of the light receiving surface of the image sensor 16 and writing of YUV data to the SDRAM 34 is prohibited.
[0129]
The focus is adjusted based on the charge read from the image sensor 16 after the specific setting state is validated. When the focus adjustment is completed, the specific setting state is released by the CPU 24. As a result, charges are periodically read from the entire light receiving surface of the image sensor 16, and YUV data based on the read charges is written to the SDRAM 34. The display on the LCD 40 transitions from the freeze image to a real-time moving image of the object scene.
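As a rough Python sketch of this control flow (every object and method name here is hypothetical and merely stands in for the hardware operations described above):
(Python sketch)
def on_shutter_press(camera):
    bright = camera.scene_is_bright()
    if bright:
        # specific setting state: centre-area readout, monitor frozen
        camera.read_center_area_only(True)
        camera.allow_yuv_write_to_sdram(False)
    camera.adjust_focus()
    if bright:
        # release the specific setting state: full-surface readout, through image resumes
        camera.read_center_area_only(False)
        camera.allow_yuv_write_to_sdram(True)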
[0130]
By reading the charges from the central area of the image sensor 16, the read cycle is shortened. Further, by prohibiting the writing of the YUV data based on the charge read from the central area into the SDRAM 34, the display of the LCD 40 shifts from a through image of the entire object scene to a frozen image. Shortening the read cycle speeds up the focus adjustment. Moreover, although the image is frozen, an image of the entire object scene remains displayed on the LCD 40, so that a decrease in operability is suppressed.
[0131]
Since the focus adjustment is sped up, the time during which the freeze image is displayed is short, and the operator does not feel significant discomfort. Further, since the specific setting state is activated in response to the operator's manual operation, the monitor display does not transition from the through image to the freeze image at a timing not intended by the operator.
[0132]
When the half-pressed state of the shutter button 44 continues, the automatic tracking function for the main subject is activated. At this time, the CPU 24 individually calculates the luminance change rates Ef[j] for the plurality of invalid focus areas other than the effective focus area Zc among the focus areas 0 to 4, so as to evaluate the movement of subjects existing in these invalid focus areas. The CPU 24 further determines whether or not the main subject has moved based on the calculated luminance change rates Ef[j] and the plurality of thresholds K respectively assigned to the plurality of invalid focus areas. When the main subject has moved, the effective focus area Zc is changed, and the focus is adjusted based on the changed effective focus area Zc. Here, the numerical value indicated by each of the plurality of thresholds K is changed according to the position of the effective focus area Zc so as to reflect the likelihood of movement of the main subject.
[0133]
Specifically, if the focus area 3 or 4 is the effective focus area Zc when the digital camera 10 is in the upright state, the threshold K of the focus area 1 or 2 is set lower than the threshold K of the focus area 0. This takes into account that, when the main subject is located at the upper center or lower center of the object scene, the main subject is more likely to move in an oblique direction than in the vertical direction of the object scene.
[0134]
If the focus area 1 or 2 is the effective focus area Zc when the digital camera 10 is in the upright state, the thresholds K of the focus areas 0, 3 and 4 are set to the same numerical value. This takes into account that, when the main subject is located on the center-right or center-left of the object scene, the main subject may move in either the horizontal direction or an oblique direction of the object scene.
[0135]
Further, if the focus area 0 is the effective focus area Zc when the digital camera 10 is in the upright state, the threshold K of the focus area 1 or 2 is set lower than the threshold K of the focus area 3 or 4. This takes into account that, when the main subject is located at the center of the object scene, the main subject is more likely to move in the horizontal direction than in the vertical direction of the object scene.
[0136]
Thus, by changing the value of the threshold K in consideration of the likelihood of movement of the main subject, the accuracy of determining whether the main subject has moved is improved, and the focus can be reliably adjusted to the main subject.
[0137]
In this embodiment, only the focus lens 12 is moved in the optical axis direction for focus adjustment. Instead, only the image sensor 16, or both the focus lens 12 and the image sensor 16, may be moved in the optical axis direction.
[0138]
Further, in this embodiment, the scene is selected according to the operation of the scene selection key 48, but the scene may be automatically determined.
[0139]
Further, the tilt sensor 46 of this embodiment can determine only whether the camera is in the upright state, the right 90° tilt state or the left 90° tilt state, but a higher-precision sensor may be used. In that case, by arranging a large number of focus areas extending radially from the center of the object field, it becomes possible to rotate the numerical values indicated by the thresholds C[0] to C[4] in small increments.
[Brief description of the drawings]
FIG. 1 is a block diagram showing one embodiment of the present invention.
FIG. 2 is an illustrative view showing one example of a distribution state of focus areas 0 to 4 and a center area formed on a screen;
FIG. 3A is an illustrative view showing one example of a setting state of the thresholds C[0] to C[4] when a portrait scene is selected; FIG. 3B is an illustrative view showing another example of the setting state of the thresholds C[0] to C[4] when a portrait scene is selected; and FIG. 3C is an illustrative view showing yet another example of the setting state of the thresholds C[0] to C[4] when a portrait scene is selected.
FIG. 4A is an illustrative view showing one example of a setting state of the thresholds C[0] to C[4] when a sports scene is selected; FIG. 4B is an illustrative view showing another example of the setting state of the thresholds C[0] to C[4] when a sports scene is selected; and FIG. 4C is an illustrative view showing yet another example of the setting state of the thresholds C[0] to C[4] when a sports scene is selected.
FIG. 5A is an illustrative view showing one example of a setting state of the thresholds C[0] to C[4] when a landscape scene, an evening scene or a night scene is selected; FIG. 5B is an illustrative view showing another example of the setting state of the thresholds C[0] to C[4] when a landscape scene, an evening scene or a night scene is selected; and FIG. 5C is an illustrative view showing yet another example of the setting state of the thresholds C[0] to C[4] when a landscape scene, an evening scene or a night scene is selected.
FIG. 6A is an illustrative view showing one example of a setting state of the thresholds C[0] to C[4] when a default scene is selected; FIG. 6B is an illustrative view showing another example of the setting state of the thresholds C[0] to C[4] when a default scene is selected; and FIG. 6C is an illustrative view showing yet another example of the setting state of the thresholds C[0] to C[4] when a default scene is selected.
FIG. 7A is an illustrative view showing one example of a scene shot in a state where a portrait scene is selected, and FIG. 7B is an illustrative view showing another example of a scene shot in a state where a portrait scene is selected.
FIG. 8 is an illustrative view showing one example of a scene shot in a state where a sports scene is selected;
FIG. 9 is an illustrative view showing one example of a scene photographed in a state where a landscape scene, a sunset scene or a night scene is selected;
FIG. 10A is an illustrative view showing one example of a tracking operation of a main subject, and FIG. 10B is an illustrative view showing another example of a tracking operation of a main subject;
FIG. 11A is an illustrative view showing another example of the tracking operation of the main subject; FIG. 11B is an illustrative view showing still another example of the tracking operation of the main subject;
FIG. 12A is an illustrative view showing another example of the tracking operation of the main subject; FIG. 12B is an illustrative view showing another example of the tracking operation of the main subject;
FIG. 13A is an illustrative view showing a part of a series of operations for tracking the main subject; FIG. 13B is an illustrative view showing another part of the series of operations for tracking the main subject; and FIG. 13C is an illustrative view showing yet another part of the series of operations for tracking the main subject.
FIG. 14 is a flowchart showing a part of the operation of the embodiment in FIG. 1;
FIG. 15 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 16 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 17 is a flowchart showing yet another portion of the operation of the embodiment in FIG. 1;
FIG. 18 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 19 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 20 is a flowchart showing yet another portion of the operation of the embodiment in FIG. 1;
FIG. 21 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 22 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 23 is a flowchart showing yet another portion of the operation of the embodiment in FIG. 1;
FIG. 24 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 25 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 26 is a flowchart showing yet another portion of the operation of the embodiment in FIG. 1;
FIG. 27 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 28 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 29 is a flowchart showing yet another portion of the operation of the embodiment in FIG. 1;
FIG. 30 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 31 is a flowchart showing another portion of the operation of the embodiment in FIG. 1;
FIG. 32 is a flowchart showing yet another portion of the operation of the embodiment in FIG. 1;
[Explanation of symbols]
10 Digital camera
12 Focus lens
14 Aperture mechanism
24 CPU
42 AE/AF evaluation circuit
46 Tilt sensor
48 Scene selection key

Claims (5)

  1. An autofocus camera that determines a focus area to be used for focus adjustment by comparing a plurality of focus evaluation values respectively obtained in a plurality of focus areas arranged in an object scene with a plurality of thresholds assigned to the plurality of focus areas, the autofocus camera comprising:
    a detection unit that detects a camera attitude; and a first changing unit that changes an assignment destination of the plurality of thresholds based on a detection result of the detection unit.
  2. The autofocus camera according to claim 1, wherein the detection unit detects the posture of the camera when focus adjustment is instructed.
  3. The plurality of focus areas are arranged radially with respect to the center of the object scene,
    The autofocus camera according to claim 1, wherein the first changing unit rotates the assignment destination of the plurality of thresholds according to the posture of the camera.
  4. The plurality of focus areas are arranged in a predetermined number in a vertical direction and a predetermined number in a horizontal direction,
    The detection unit determines whether the camera attitude is an upright state or a 90° tilt state,
    The autofocus camera according to claim 3, wherein the first changing unit rotates the assignment destination of the plurality of thresholds by 90 °.
  5. The autofocus camera according to claim 1, further comprising: a determination unit that determines an object scene; and a second changing unit that changes a numerical value indicated by each of the plurality of thresholds according to a determination result of the determination unit.
JP2003036824A 2003-02-14 2003-02-14 Auto focus camera Expired - Fee Related JP3949067B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003036824A JP3949067B2 (en) 2003-02-14 2003-02-14 Auto focus camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003036824A JP3949067B2 (en) 2003-02-14 2003-02-14 Auto focus camera

Publications (2)

Publication Number Publication Date
JP2004246160A true JP2004246160A (en) 2004-09-02
JP3949067B2 JP3949067B2 (en) 2007-07-25

Family

ID=33021811

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003036824A Expired - Fee Related JP3949067B2 (en) 2003-02-14 2003-02-14 Auto focus camera

Country Status (1)

Country Link
JP (1) JP3949067B2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008004996A (en) * 2006-06-20 2008-01-10 Casio Comput Co Ltd Imaging apparatus and program thereof
JP2008009263A (en) * 2006-06-30 2008-01-17 Casio Comput Co Ltd Imaging device and program therefor
US8284256B2 (en) 2006-06-30 2012-10-09 Casio Computer Co., Ltd. Imaging apparatus and computer readable recording medium
KR101336235B1 (en) 2007-02-05 2013-12-03 삼성전자주식회사 Method and apparatus of detecting attitude of digital image processing device using automatic focusing
JP2015148783A (en) * 2014-02-10 2015-08-20 オリンパス株式会社 Focus adjustment device
JP2016081019A (en) * 2014-10-22 2016-05-16 株式会社 日立産業制御ソリューションズ Focus control device, imaging apparatus, and focus control method

Also Published As

Publication number Publication date
JP3949067B2 (en) 2007-07-25

Similar Documents

Publication Publication Date Title
US9681040B2 (en) Face tracking for controlling imaging parameters
US9137445B2 (en) Efficient display and selection of images generated during bracketed imaging
TWI399082B (en) Display control device, display control method and program
US7646420B2 (en) Digital camera with a number of photographing systems
US7978248B2 (en) Data processing apparatus and data processing method for displaying image capture mode candidates
US7720369B2 (en) Image taking apparatus
US8106995B2 (en) Image-taking method and apparatus
JP4898475B2 (en) Imaging control apparatus, imaging apparatus, and imaging control method
JP4582212B2 (en) Imaging apparatus and program
CN100437332C (en) Automatic focusing device and method, camera device
CN101013470B (en) Face importance level determining apparatus and method, and image pickup apparatus
KR100933416B1 (en) Camera device, imaging method and computer program recording medium
KR101342477B1 (en) Imaging apparatus and imaging method for taking moving image
JP4025865B2 (en) Electronic camera
JP4961965B2 (en) Subject tracking program, subject tracking device, and camera
US8264585B2 (en) Imaging apparatus capable of readily specifying undesirable photographing parameters form auto photographing parameters
US7791668B2 (en) Digital camera
JP4725802B2 (en) Imaging apparatus, focusing method, and focusing program
US8760568B2 (en) Image pickup and focus detection apparatus, control method, and storage medium
JP3867687B2 (en) Imaging device
US8514296B2 (en) Imaging apparatus capable of recognizing photographic scene and method for the same
US8466977B2 (en) Image data management apparatus and method, and recording medium
JP4840848B2 (en) Imaging apparatus, information processing method, and program
JP4040613B2 (en) Imaging device
TWI389557B (en) Imaging apparatus having focus control function

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20060714

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20060725

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20060920

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20061024

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20061222

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20070219

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20070320

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20070417

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110427

Year of fee payment: 4

LAPS Cancellation because of no payment of annual fees