US20090022486A1 - Image apparatus and evaluation method - Google Patents
Image apparatus and evaluation method
- Publication number
- US20090022486A1 (application US12/060,961)
- Authority
- US
- United States
- Prior art keywords
- image
- image information
- focus
- information
- detection position
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/02—Mountings, adjusting means, or light-tight connections, for optical elements for lenses
- G02B7/04—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
- G02B7/08—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/635—Region indicators; Field of view indicators
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/675—Focus control based on electronic image sensor signals comprising setting of focusing regions
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2101/00—Still video cameras
Description
- The present invention relates to an image apparatus and an evaluation method.
- Known autofocusing systems combine an autofocusing (AF) function with an image tracking function. The AF function detects defocus amounts of a photographic lens in a plurality of focus detection areas set within a shooting screen and drives the photographic lens to an in-focus position based on the defocus amount in one of those areas. The image tracking function stores an image of a subject (object) to be tracked, taken from a captured image, as a template image (reference image), and tracks that object by searching each of the repeatedly captured images for the position of an image that is the same as or similar to the template image. The focus detection area at the in-screen position obtained from the image tracking result is then selected, and the photographic lens is driven to the in-focus position based on the defocus amount in that area.
- Japanese Unexamined Patent Application Publication No. 2005-215040 discloses such an autofocusing system.
- The contents of this publication are incorporated herein by reference in their entirety.
- However, a photographic lens is not always focused on the target object. With a lens having a long focal length, for example, even a slight difference between the distance set by the range ring of the lens and the actual shooting distance causes the shooting target to be photographed in a substantially blurred state. In that case, a blurred image of the shooting target is employed as the template image. As a result, the target object may be lost during tracking, or an object other than the target may be falsely recognized as the tracking target, so that tracking performance deteriorates.
- An image apparatus includes a screen, a detecting unit, a determining unit, an acquiring unit and a control unit.
- The screen is configured to display an image formed by an optical system and at least one detection position superimposed onto the image.
- The detecting unit is configured to detect information regarding focus of the optical system at the detection position.
- The determining unit is connected to the detecting unit and configured to determine whether the information regarding the focus is within a predetermined range.
- The acquiring unit is connected to the determining unit and configured to acquire first image information at the detection position when the determining unit determines that the information regarding the focus is within the predetermined range.
- The control unit is connected to the acquiring unit and configured to evaluate arbitrary second image information in the screen using the first image information.
- An evaluation method includes displaying in a screen an image formed by an optical system, displaying at least one detection position superimposed onto the image, detecting information regarding focus of the optical system at the at least one detection position, acquiring first image information at the at least one detection position when the detected information is within a predetermined range, acquiring arbitrary image information in the screen as second image information, and evaluating the second image information using the first image information.
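- As a concrete illustration of the claimed evaluation method, the following sketch acquires first image information at a detection position only when the focus information is within a predetermined range, and then evaluates arbitrary second image information against it. All names, the patch-based representation, and the difference measure are assumptions for illustration; the patent does not specify an implementation.

```python
def acquire_template(image, position, defocus, threshold, size=4):
    """Acquire first image information (a size-by-size patch) at the
    detection position only when the focus information (here, a defocus
    amount) is within the predetermined range; otherwise return None."""
    if abs(defocus) > threshold:          # focus information out of range
        return None
    r, c = position
    return [row[c:c + size] for row in image[r:r + size]]

def evaluate(first_info, second_info):
    """Evaluate second image information using the first: total sum of
    per-pixel absolute differences (smaller means more similar)."""
    return sum(abs(a - b)
               for row_a, row_b in zip(first_info, second_info)
               for a, b in zip(row_a, row_b))
```

Because the first image information is acquired only in an in-focus state, later evaluations compare candidates against a sharp reference.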
- FIG. 1 is a schematic view showing the construction of an image-taking apparatus (single-lens reflex digital still camera) equipped with an object tracking device according to one embodiment of the present invention.
- FIG. 2 is a block diagram showing the detailed construction of a control device.
- FIG. 3 is a front view showing the detailed construction of a second image-sensing device.
- FIG. 4 illustrates the pixel arrangement of the second image-sensing device.
- FIG. 5 is an explanatory view for explaining an object tracking method according to the embodiment.
- FIGS. 6A and 6B are explanatory views for explaining the object tracking method according to the embodiment.
- FIG. 7 is an explanatory view for explaining the object tracking method according to the embodiment.
- FIGS. 8A and 8B are flowcharts showing an object tracking process according to the embodiment.
- FIG. 9 is a flowchart showing the object tracking process according to the embodiment.
- FIG. 10 is a flowchart showing the object tracking process according to the embodiment.
- FIGS. 11A and 11B are timing charts showing an autofocusing (AF) operation and an image tracking operation according to the embodiment.
- FIGS. 12A to 12D are timing charts showing an autofocusing (AF) operation and an image tracking operation according to a modification.
- An embodiment of the present invention is described below as applied to an image-taking apparatus (single-lens reflex digital still camera) equipped with an image tracking device. The device has an autofocusing (AF) function of detecting defocus amounts of a photographic lens in a plurality of focus detection areas set within a shooting screen and driving the photographic lens to an in-focus position based on the defocus amount in one of those areas, and an image tracking function of storing an image of an object (subject) to be tracked, taken from a captured image, as a template image (reference image) and tracking that object by searching each of the repeatedly captured images for the position of an image that is the same as or similar to the template image (i.e., by performing template matching). The focus detection area at the in-screen position obtained from the image tracking result is selected, and the photographic lens is driven to the in-focus position based on the defocus amount in that area.
- FIG. 1 shows the construction of an image-taking apparatus (single-lens reflex digital still camera) 100 equipped with an object tracking device according to the embodiment.
- An interchangeable lens 11 is mounted to a camera body 10 in an interchangeable manner.
- The camera body 10 includes a first image-sensing device 12 for capturing an object image and recording the image.
- The first image-sensing device 12 can be constituted by, e.g., a CCD or a CMOS sensor.
- During shooting, the quick return mirror 13 is retracted to a position, indicated by solid lines, outside the photographic optical path, and the shutter 14 is released so that the object image is focused by the photographic lens 15 on the light receiving surface of the first image-sensing device 12.
- A focus detection optical system 16 and a ranging device 17 for cooperatively detecting a focus adjusted state of the photographic lens 15 are disposed at the bottom of the camera body 10.
- The illustrated embodiment uses a focus detection method based on a pupil-division phase difference detection technique.
- During focus detection, the quick return mirror 13 is set to the position, indicated by broken lines, within the photographic optical path. Paired beams of focus detection light from the photographic lens 15 pass through a half-mirror portion of the quick return mirror 13 and, after being reflected by a sub-mirror 18, are introduced to the focus detection optical system 16 and the ranging device 17.
- The focus detection optical system 16 introduces the paired beams of focus detection light having entered through the photographic lens 15 to a light receiving surface of the ranging device 17, thereby focusing a pair of optical images thereon.
- The ranging device 17 includes a pair of CCD line sensors, for example, and outputs focus detection signals corresponding to the pair of optical images.
- A finder optical system is disposed at the top of the camera body 10.
- When the quick return mirror 13 is set in the position, indicated by broken lines, within the photographic optical path, light from the object, having entered through the photographic lens 15, is introduced to a focus plate 20, and an image of the object is focused on the focus plate 20.
- A liquid crystal display device 21 displays not only information, such as focus detection area marks, in a superimposing relation to the object image focused on the focus plate 20, but also various items of shooting information, such as an exposure value, at a position outside the object image.
- The object image on the focus plate 20 is introduced to an ocular window 24 through a pentagonal roof prism 22 and an ocular lens 23 such that the photographer can visually confirm the object image.
- The finder optical system disposed at the top of the camera body 10 also includes a second image-sensing device 25 for capturing the object image for the purposes of object tracking and photometry.
- The second image-sensing device 25 will be described in detail later.
- The object image focused on the focus plate 20 is refocused on a light receiving surface of the second image-sensing device 25 through the pentagonal roof prism 22, a prism 26, and an imaging lens 27.
- The second image-sensing device 25 outputs an image signal corresponding to the object image.
- The camera body 10 further includes an operating member 28, a control device 29, and a lens driving device 30.
- The operating member 28 includes switches and selectors for operating the camera 100, such as a shutter button and a focus detection area select switch.
- The control device 29 is constituted by a CPU and peripheral components, and it executes various kinds of control and processing necessary for the camera 100.
- The lens driving device 30 is constituted by a motor and a drive circuit, and it performs focus adjustment of the photographic lens 15 through a lens driving mechanism (not shown).
- The interchangeable lens 11 includes a lens ROM 31 for storing lens information, such as the focal length, the full F-number and the aperture value (f-stop number) of the photographic lens 15, along with a conversion coefficient between an amount of image deviation and a defocus amount.
- The control device 29 can read the lens information from the lens ROM 31 through a contact (not shown) provided in the lens mount.
- FIG. 2 shows the detailed configuration of the control device 29. It is to be noted that control functions not directly related to the embodiment of the present invention are not shown and their description is omitted here.
- The control device 29 includes various control sections implemented as software running on a microcomputer.
- A CCD control section 31 controls the accumulation and readout of charges in the second image-sensing device 25.
- An A/D conversion section 32 converts an analog image signal output from the second image-sensing device 25 to a digital image signal that represents image information.
- An exposure processing section 33 calculates an exposure value based on the image signal captured by the second image-sensing device 25.
- A focus detection processing section 34 detects a focus adjusted state, i.e., a defocus amount in this embodiment, of the photographic lens 15 based on the focus detection signals corresponding to the pair of optical images, which are output from the ranging device 17.
- The focus detection areas are set, as focus detection positions, at a plurality of positions within the shooting screen formed by the photographic lens 15.
- The ranging device 17 outputs the focus detection signals corresponding to the pair of optical images for each of the focus detection areas, and the focus detection processing section 34 detects the defocus amount based on those signals for each area.
- A lens drive amount processing section 35 converts the detected defocus amount to a lens drive amount. In accordance with the lens drive amount, the lens driving device 30 drives a focusing lens (not shown) in the photographic lens 15 to perform the focus adjustment.
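- As a minimal sketch of this conversion chain: the image deviation measured by the ranging device is converted to a defocus amount with the coefficient stored in the lens ROM, and the defocus amount is then converted to a drive amount. The linear drive model and the `drive_per_defocus` gain are assumptions; the patent only states that a conversion coefficient between image deviation and defocus amount is read from the lens ROM.

```python
def defocus_amount(image_deviation, conversion_coefficient):
    """Convert the image deviation measured by the ranging device 17 to a
    defocus amount using the coefficient read from the lens ROM 31."""
    return image_deviation * conversion_coefficient

def lens_drive_amount(defocus, drive_per_defocus=1.0):
    """Convert a defocus amount to a focusing-lens drive amount;
    `drive_per_defocus` is an assumed lens-dependent gain (the patent
    does not specify the actual conversion)."""
    return defocus * drive_per_defocus
```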
- A tracking control section 36 stores, as the template image, the part of an object image captured by the second image-sensing device 25 that corresponds to a tracking target position manually designated by the photographer or automatically set by the camera 100, in a storage section 37 (described later). The tracking control section 36 also commands the focus detection processing section 34 to detect the defocus amount of the photographic lens 15 at the tracking target position. Further, the tracking control section 36 searches images repeatedly captured thereafter for an image area that matches or is similar to the template image, thus recognizing the target position, and detects the defocus amount of the photographic lens 15 at the focus detection position corresponding to that image area.
- The storage section 37 stores not only information such as the template image being tracked by the tracking control section 36 and the defocus amounts, but also the lens information, such as the focal length, the full F-number and the aperture value (f-stop number) of the photographic lens 15, along with the conversion coefficient between an amount of image deviation and a defocus amount, which are read from the lens ROM 31 in the interchangeable lens 11.
- FIG. 3 is a front view showing the detailed construction of the second image-sensing device 25.
- Each pixel 40 is divided into three portions 40a, 40b and 40c.
- Primary color filters in red (R), green (G) and blue (B) are disposed respectively in the three portions 40a, 40b and 40c. Therefore, each pixel 40 can output RGB signals of the object image.
- FIGS. 5-7 are explanatory views for explaining the object tracking operation according to the embodiment.
- FIGS. 8-10 are flowcharts showing an object tracking process according to the embodiment.
- FIGS. 11A and 11B are timing charts showing an automatic focusing (AF) operation and an image tracking operation.
- The control device 29 starts the object tracking operation when the photographer half-presses the shutter button of the operating member 28, after the photographer has manually designated the tracking target position within the object image captured by the second image-sensing device 25 or after the camera 100 has automatically set the tracking target position.
- When the quick return mirror 13 is set in the position, indicated by broken lines in FIG. 1, within the photographic optical path, the light from the object, having entered through the photographic lens 15, is focused on the focus plate 20.
- The object image on the focus plate 20 is introduced to the second image-sensing device 25 through the pentagonal roof prism 22, the prism 26, and the imaging lens 27. In such a manner, the object image is repeatedly output from the second image-sensing device 25.
- The focus detection areas are set at a plurality of positions within the shooting screen formed by the photographic lens 15, and the liquid crystal display device 21 displays area marks in a superimposing relation to the object image on the focus plate 20.
- Focus detection areas 45a-45k are set at eleven positions within the shooting screen. Further, when an arbitrary area is selected by the focus detection area select switch of the operating member 28, the mark corresponding to the selected area is displayed in an illuminated state.
- Assume that the focus detection area 45g corresponding to the image of the object to be tracked is selected in step 1 of FIG. 8A by the focus detection area select switch of the operating member 28, and that the mark corresponding to the focus detection area 45g is illuminated (indicated by the solid black mark in the illustrated embodiment).
- When the shutter button of the operating member 28 is half-pressed in this state, the object to be tracked is designated, and the control device 29 starts the object tracking process in response to the half-pressing operation.
- Alternatively, the camera 100 may automatically set the tracking target object.
- The automatic setting can be performed using a face recognition technique, i.e., by detecting a face portion of the object from the captured image and setting the detected face portion as the tracking target, or by setting as the tracking target the object in the focus detection area that indicates the smallest of the defocus amounts detected in the plurality of focus detection areas within the shooting screen.
- In step 2, the pair of optical images corresponding to the selected focus detection area (45g in this embodiment) are obtained by the ranging device 17, and the focus adjusted state, i.e., the defocus amount, of the photographic lens 15 in the selected focus detection area (45g) is detected based on the pair of optical images.
- In step 3, the defocus amount detected in the selected focus detection area (45g) is converted to a lens drive amount, and the photographic lens 15 is driven by the lens driving device 30 for focus adjustment.
- In the next step 4, the pair of optical images corresponding to the selected focus detection area (45g) are obtained again by the ranging device 17, and the defocus amount of the photographic lens 15 in the selected focus detection area (45g) is detected based on them.
- In step 5, it is determined whether the photographic lens 15 is in focus on the tracking target object in the selected focus detection area (45g). Stated another way, if the defocus amount detected in the selected focus detection area (45g) after the focus adjustment is within a preset threshold for making the in-focus determination, the photographic lens 15 is determined to be in focus on the tracking target object in the selected focus detection area (45g).
- The threshold for making the in-focus determination may be set to the same value as the threshold used in the automatic focusing (AF) function.
- Alternatively, the threshold for making the in-focus determination may be set to an absolute value larger than that of the threshold used in the AF function, so as to widen the range in which in-focus is determined.
- With such a larger threshold, however, the shooting target may be photographed in a significantly blurred state, and a blurred image of the shooting target would then be employed as the template image.
- Accordingly, the threshold may instead be set to a smaller value so that a sharper image of the tracking target is obtained.
- Further, an optimum threshold may be set depending on the selected shooting mode. For example, when a sport shooting mode is selected to photograph a quickly moving object, the image is more apt to blur due to shake of the hand-held camera body and motion of the object itself. In such a case, therefore, the threshold may be set to a smaller value than that used in a normal shooting mode, so that an image in focus on the tracking object is obtained.
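- The mode-dependent threshold selection described above can be sketched as follows. The numeric values and mode names are hypothetical; the patent gives no concrete figures, only that the sport-mode threshold is smaller and that the template-capture threshold may differ from the AF threshold.

```python
# Hypothetical values: the patent gives no concrete thresholds.
AF_THRESHOLD = 0.05   # threshold used by the AF function itself

def in_focus_threshold(mode):
    """Choose the in-focus determination threshold for template capture:
    a smaller value in sport mode (sharper template), a wider one in
    normal mode."""
    return 0.5 * AF_THRESHOLD if mode == "sport" else 1.5 * AF_THRESHOLD

def is_in_focus(defocus, mode="normal"):
    """In focus when the defocus amount is within the mode's threshold."""
    return abs(defocus) <= in_focus_threshold(mode)
```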
- If it is determined in step 5 that the defocus amount in the selected focus detection area (45g) is larger than the preset threshold and the photographic lens 15 is not in focus on the tracking target object, the control flow returns to step 3 to repeat the focus detection and focus adjustment described above. If the photographic lens 15 is determined to be in focus, the control flow proceeds to step 6, in which the liquid crystal display device 21 displays that the designated tracking target object is in focus and a voice guide is provided through a speaker (not shown). Step 6, i.e., displaying the in-focus state of the tracking target object and providing the voice guide, may be dispensed with.
- In step 7, a tracking initial image is obtained from the second image-sensing device 25.
- In step 8, the initial process for the tracking control, shown in FIG. 9, is executed.
- In step 101 of FIG. 9, the image at the position corresponding to the selected focus detection area 45g in the tracking initial image, which is obtained from the second image-sensing device 25, is stored as object color information.
- In step 102, a same-color information region having color information similar to the object color information is detected in the peripheral part around the selected focus detection area (45g), as shown in FIG. 6A.
- The same-color information region is set as an initial tracking object region 47.
- In step 104, an image of the tracking object region 47 within the initial tracking image is stored in the storage section 37 as a template image 48 (see FIG. 6B) used in the image tracking process.
- In step 105, a region obtained by enlarging the tracking object region 47 by a predetermined number of pixels (two in the illustrated embodiment) in the up-down and left-right directions, with the region 47 at the center, is set as a search region 49. Thereafter, the control flow returns to step 9 of FIG. 8A.
- The tracking object region 47 may be decided based on the object color information, as described above, or it may instead be decided based on brightness information.
- The region size may be fixed at 4×4 pixels for simplicity of the process, or may be decided depending on the focal length information of the photographic lens 15.
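- The initial tracking process of steps 101-105 can be sketched as follows, assuming single-channel pixel values and a simple 4-connected region-growing step in place of the unspecified same-color detection; all names and the tolerance value are assumptions.

```python
def same_color_region(image, seed, tol=10):
    """Grow a region of pixels whose values are within `tol` of the value
    at the seed position (steps 101-102, simplified to 4-connectivity)."""
    rows, cols = len(image), len(image[0])
    base = image[seed[0]][seed[1]]
    region, stack = set(), [seed]
    while stack:
        r, c = stack.pop()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if abs(image[r][c] - base) > tol:
            continue
        region.add((r, c))
        stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return region

def bounding_box(region):
    """Tracking object region 47 as (r0, c0, r1, c1)."""
    rs = [r for r, _ in region]
    cs = [c for _, c in region]
    return min(rs), min(cs), max(rs), max(cs)

def search_region(box, margin=2):
    """Enlarge the tracking object region by `margin` pixels on each side
    (two in the illustrated embodiment) to obtain search region 49."""
    r0, c0, r1, c1 = box
    return r0 - margin, c0 - margin, r1 + margin, c1 + margin
```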
- In step 9 of FIG. 8A, after the initial process for the tracking control, it is confirmed whether the shutter button of the operating member 28 has been fully pressed, i.e., whether the shutter release operation has been performed. If so, the control flow proceeds to step 16, in which the mirror 13 is lifted up and the shutter 14 is released, thereby shooting an image with the first image-sensing device 12. Otherwise, the control flow proceeds to step 10, in which the next image information is obtained from the second image-sensing device 25 and the pair of optical images for focus detection are obtained by the ranging device 17 for each of the focus detection areas 45a-45k.
- In step 11, the tracking computation process shown in FIG. 10 is executed.
- In step 201 of FIG. 10, regions having the same size as the template image 48 are successively cut out from the search region 49 in the tracking next image.
- For each cut-out image, the difference in image information from the template image 48 is calculated per corresponding pixel, and the total sum of the differences over all pixels is obtained; the cut-out region giving the smallest total sum is taken as the new tracking object region 47.
- The differences can be calculated using the raw image signals output from the second image-sensing device 25; pre-processing of the raw image signals, such as white balancing and filtering, is not required, which simplifies the tracking process.
- Alternatively, a color difference per pixel may be calculated.
- As a further alternative, a brightness difference per pixel may be calculated.
- In step 203, a region obtained by enlarging the new tracking object region 47 by the predetermined number of pixels (two in the illustrated embodiment) in the up-down and left-right directions, with the region 47 at the center, is set as a new search region 49. Thereafter, the control flow returns to step 12 of FIG. 8B.
- The region set as the search region 49 may be an arbitrary region within the screen of the second image-sensing device 25. Such a region is preferably larger than the tracking object region 47 and, more preferably, includes the focus detection area 45g in the tracking object region 47.
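- The tracking computation of steps 201-203 amounts to template matching by a sum of absolute differences over the search region. A minimal sketch, assuming single-channel raw values and an exhaustive search (all names are illustrative):

```python
def cut_out(image, r, c, h, w):
    """Cut an h-by-w region out of the image at top-left (r, c), step 201."""
    return [row[c:c + w] for row in image[r:r + h]]

def difference_sum(a, b):
    """Total sum of per-pixel absolute differences between two regions."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def track(image, template, search):
    """Slide the template over the search region (r0, c0, r1, c1) and
    return the top-left position with the smallest total difference,
    i.e. the new tracking object region (raw values, no pre-processing)."""
    h, w = len(template), len(template[0])
    r0, c0, r1, c1 = search
    best = None
    for r in range(r0, r1 - h + 2):
        for c in range(c0, c1 - w + 2):
            score = difference_sum(cut_out(image, r, c, h, w), template)
            if best is None or score < best[0]:
                best = (score, (r, c))
    return best[1]
```

Color or brightness differences would simply replace the per-pixel difference measure.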
- The control flow may additionally include a process of updating the image information of the template image 48 using the image information of the newly decided tracking object region 47.
- By adding, for example, 20% of the image information of the new tracking object region 47 to 80% of the image information of the template image 48, the template image is updated little by little with the latest image information, so that the tracking operation can more easily follow changes of the tracking object.
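- The 20%/80% update described above is a simple exponential blend; a sketch under the assumption of single-channel pixel values:

```python
def update_template(template, new_region, new_weight=0.2):
    """Blend 20% of the newly decided tracking object region into 80% of
    the stored template (the 20/80 split is the example given in the text),
    so the template slowly follows changes of the tracking object."""
    w = new_weight
    return [[(1 - w) * t + w * n for t, n in zip(t_row, n_row)]
            for t_row, n_row in zip(template, new_region)]
```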
- The tracking computation method may be modified as follows. Image information in the surroundings of the tracking object region is stored along with the image information of the designated tracking object region. From each of the repeatedly captured images, a plurality of candidate regions for the new tracking object region are detected, namely those having relatively small total sums of differences with respect to the template image.
- This modified method can prevent erroneous detection of the tracking target object and can improve tracking accuracy.
- In step 12 of FIG. 8B, the focus adjusted state, i.e., the defocus amount, of the photographic lens 15 in each of the set focus detection areas 45a-45k is detected based on the pairs of optical images obtained in step 10.
- In step 13, based on the defocus amounts of all the focus detection areas, a search is performed to find focus detection areas whose defocus amounts are close to the defocus amount of the focus detection area employed in the previous focus adjustment.
- Suppose that the focus detection areas 45b, 45f and 45j provide defocus amounts close to that of the focus detection area employed in the previous focus adjustment.
- In step 14, the area where the focus adjustment is to be performed is decided based on both the new tracking object region obtained as the image tracking result in step 11 and the focus detection areas obtained as the area search result in steps 12-13. Comparing the focus detection area 45f corresponding to the new tracking object region 47 obtained as the image tracking result with the focus detection areas 45b, 45f and 45j obtained as the area search result, as shown in FIG. 7, the focus detection area 45f is consistent with the position of the tracking object in both processes and is therefore decided as the area where the focus adjustment is to be performed (i.e., the focus adjustment area).
- In step 15, the defocus amount detected in the focus detection area 45f is converted to a lens drive amount, and the photographic lens 15 is driven by the lens driving device 30 for focus adjustment.
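- Steps 13-14 can be sketched as follows. The tolerance value, the dictionary representation of per-area defocus amounts, and the behaviour when the two results disagree are assumptions; the patent does not specify a tie-break.

```python
def search_close_areas(defocus_by_area, previous_defocus, tol):
    """Step 13: find focus detection areas whose defocus amounts are
    close to the one used in the previous focus adjustment."""
    return {area for area, d in defocus_by_area.items()
            if abs(d - previous_defocus) <= tol}

def decide_focus_area(tracking_area, close_areas):
    """Step 14: keep the area only when the image tracking result and the
    defocus-amount search agree; return None otherwise so the caller can
    retain the previous focus adjustment area (an assumed policy)."""
    return tracking_area if tracking_area in close_areas else None
```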
- The processing in step 14 may be modified as follows. Whether the tracking target object is moving closer to or away from the camera along the optical axis is determined from the difference between the defocus amount used in the previous focus adjustment and the currently detected defocus amount in the focus detection area found by the search. If the object is determined to be moving closer to the camera, the template image is enlarged; if it is determined to be moving away, the template image is reduced. This modification provides a template image better adapted to motions of the tracking target object.
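- A sketch of this template resizing follows. The sign convention linking the defocus change to approaching or receding is an assumption; the patent only states that the template is enlarged when the object approaches and reduced when it recedes.

```python
def resize_nearest(template, new_h, new_w):
    """Nearest-neighbour resize of a 2-D list template."""
    h, w = len(template), len(template[0])
    return [[template[r * h // new_h][c * w // new_w] for c in range(new_w)]
            for r in range(new_h)]

def adapt_template(template, previous_defocus, current_defocus, step=1):
    """Enlarge the template when the object is judged to be approaching,
    reduce it when receding (assumed convention: a growing defocus amount
    means the object is approaching)."""
    h, w = len(template), len(template[0])
    if current_defocus > previous_defocus:        # assumed: approaching
        return resize_nearest(template, h + step, w + step)
    if current_defocus < previous_defocus:        # assumed: receding
        return resize_nearest(template, max(1, h - step), max(1, w - step))
    return template
```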
- Until the shutter button is fully pressed, the processing of steps 10-15 is repeatedly executed. When the shutter button is fully pressed, the control flow proceeds to step 16, in which the above-described shooting process is executed.
- The automatic focusing (AF) operation and the image tracking operation according to the embodiment will now be described with reference to FIGS. 11A and 11B.
- When the object tracking process shown in FIGS. 8A and 8B starts, the AF operation is executed first: charge accumulation in the ranging device 17 (AF accumulation) and the focus detecting computation (AF computation) are performed, and the focus adjustment of the photographic lens 15 is carried out based on the AF computation result, i.e., the defocus amount, in the selected focus detection area.
- In the illustrated example, the AF processing of steps 2-4 in FIG. 8A is repeated three times. When in-focus with the tracking target object is determined in step 5 in the third cycle of the AF processing, the image tracking operation is started.
- When the threshold for making the in-focus determination in step 5 is set larger than the threshold used in the AF function, an in-focus determination signal and a lens driving signal are both output.
- In the image tracking operation, a tracking initial image is first accumulated in the second image-sensing device 25 (image accumulation; step 7 of FIG. 8A), and the tracking initial process is then executed (step 8 of FIG. 8A).
- Next, a tracking next image is accumulated in the second image-sensing device 25 (image accumulation; step 10 of FIG. 8B), and the tracking computation process is executed (step 11 of FIG. 8B).
- Next, the AF operation is executed again with the image tracking result reflected in it. More specifically, the defocus amount is detected for each of the focus detection areas, and the area having a defocus amount close to that used in the previous focus adjustment is searched for. The focus adjustment area is decided based on the area search result and the image tracking result, and the focus adjustment is performed based on the defocus amount of the decided focus adjustment area. Thereafter, while the shutter button is held half-pressed, the image tracking operation and the AF operation based on the image tracking result are executed repeatedly.
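- The repeated cycle described above can be summarized by the following loop skeleton, in which all camera-dependent operations are injected as stand-in callables; this is an illustrative structure, not the patent's implementation.

```python
def tracking_af_loop(frames, track_fn, defocus_fn, adjust_fn, initial_area):
    """One possible shape of the repeated tracking + AF cycle: for each
    captured frame, run the tracking computation (step 11), detect the
    defocus amount of the resulting area (step 12), and perform the focus
    adjustment (step 15). The callables stand in for camera hardware."""
    area = initial_area
    history = []
    for frame in frames:
        area = track_fn(frame, area)       # image tracking result
        defocus = defocus_fn(frame, area)  # focus detection in that area
        adjust_fn(defocus)                 # drive the photographic lens
        history.append((area, defocus))
    return history
```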
- According to the embodiment, in the image tracking device having the image tracking function of comparing the object images, which are formed by the photographic lens 15 and repeatedly captured by the second image-sensing device 25, with the template image to detect the position of the tracking target in the screen of the photographic lens 15, and the autofocusing (AF) function of detecting the focus of the photographic lens 15 in the focus detection areas set within the screen and performing the focus adjustment of the photographic lens 15 based on the focus detection result, it is determined whether or not the focus detection result with the AF function is in focus, and the focus detection and the focus adjustment based on the AF function are performed in the focus detection area corresponding to the position of the tracking target in the screen.
- Further, the image at the position of the tracking target in the object image captured when in-focus is determined is obtained as the template image. Therefore, a sharp target image can be obtained as the reference image, and the tracking performance can be improved.
- In addition, the tracking process can be executed efficiently while keeping the image tracking operation synchronized with the autofocusing (AF) function.
- In the embodiment described above, the charge accumulation in the ranging device 17 for the AF operation and the charge accumulation in the second image-sensing device 25 for the image tracking operation are successively executed in a synchronous relation, as shown in FIGS. 11A and 11B. However, the two charge accumulations may instead be executed in an asynchronous relation.
- FIGS. 12A, 12B, 12C and 12D are timing charts showing the autofocusing (AF) operation and the image tracking operation according to a modification of the embodiment, in which the charge accumulation in the ranging device 17 for the AF operation and the charge accumulation in the second image-sensing device 25 for the image tracking operation are executed in an asynchronous relation. Note that the following description covers primarily the points of difference, while the description of points similar to the synchronous operation shown in FIGS. 11A and 11B is omitted.
- In the asynchronous operation, the charge accumulation for each of the AF operation and the image tracking operation is always performed at a predetermined time interval, even when the computation process based on the charge accumulation result is not executed.
- In the synchronous operation, the tracking control section 36 executes the tracking initial process after the in-focus determination (after a lapse of time T1) by employing, as the tracking initial image, the tracking object image accumulated in the second image-sensing device 25.
- In the asynchronous operation, by contrast, the tracking control section 36 executes the tracking initial process immediately after the in-focus determination by employing, as the tracking initial image, the tracking object image accumulated in the second image-sensing device 25 immediately before the in-focus determination (time T2 earlier).
- This is because, in the asynchronous operation, the time (T2) from the end of the charge accumulation completed immediately before the in-focus determination to the arrival of the signal sent upon the in-focus determination is shorter than the time (T1) from the in-focus determination to the start of the next charge accumulation.
- In other words, an image obtained from the charge accumulation completed the time T2 before the in-focus determination is accumulated at a timing closer to the in-focus determination than an image obtained from the charge accumulation started the time T1 after the in-focus determination.
- The tracking control section 36 sets T1 as the period from the time at which the signal is sent upon the in-focus determination to the time at which the image accumulation performed after the in-focus determination is started. Also, the tracking control section 36 sets T2 as the period from the time at which the signal is sent upon the in-focus determination to the time at which the image accumulation performed before the in-focus determination is ended. Additionally, the tracking control section 36 may decide the timing closer to the in-focus determination based on the respective periods from the middle of the period during which the AF accumulation is performed to the start time, the end time and the middle time of the image accumulation.
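The T1/T2 comparison above reduces to a simple rule: use the already-accumulated image when its end time is closer to the in-focus determination than the start of the next accumulation would be. A minimal sketch, with hypothetical names:

```python
def pick_tracking_initial_image(t1_to_next_start, t2_since_last_end):
    """Decide which accumulated image to use as the tracking initial image.

    t1_to_next_start: time from the in-focus signal to the START of the
        next image accumulation (T1 in the text).
    t2_since_last_end: time from the END of the accumulation finished just
        before the in-focus signal to that signal (T2 in the text).
    Returns "previous" when the already-accumulated image is closer in
    time to the in-focus determination, else "next".
    """
    return "previous" if t2_since_last_end < t1_to_next_start else "next"
```

In the asynchronous modification T2 is typically shorter than T1, so the previously accumulated image is chosen.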
- In this manner, the tracking performance can be improved by using a sharper template image that is better in focus with the tracking target.
- The present invention can be realized with any type of image-taking apparatus so long as it is able to capture images on a time-series basis.
- The present invention is therefore applicable not only to the single-lens reflex digital still camera but also to a consumer digital camera, a video camera used as an image-taking apparatus that captures moving images, and so on.
- The present invention can also be applied to the case where the tracking target object is changed from one object to another during the image tracking operation for some reason. In that case, a new template image is obtained after confirming the in-focus state of the new tracking target object.
Abstract
An image apparatus includes a screen, a detecting unit, a determining unit, an acquiring unit and a control unit. The screen is configured to display an image formed by an optical system and at least one detection position superimposed onto the image. The detecting unit is configured to detect information regarding focus of the optical system at the detection position. The determining unit is connected to the detecting unit and configured to determine whether the information regarding the focus is within a predetermined range. The acquiring unit is connected to the determining unit and configured to acquire first image information at the detection position when the determining unit determines that the information regarding the focus is within the predetermined range. The control unit is connected to the acquiring unit and configured to evaluate arbitrary second image information in the screen using the first image information.
Description
- The present application claims priority under 35 U.S.C. §119 to Japanese Patent Application No. 2007-140114, filed May 28, 2007. The contents of this application are incorporated herein by reference in their entirety.
- 1. Field of the Invention
- The present invention relates to an image apparatus and an evaluation method.
- 2. Discussion of the Background
- There is known an autofocusing system having an autofocusing (AF) function of detecting defocus amounts of a photographic lens in a plurality of focus detection areas set within a shooting screen and driving the photographic lens to an in-focus position based on the defocus amount in any one of the focus detection areas, and an image tracking function of storing an image of a subject (an object), which is included in a captured image and is to be tracked, as a template image (reference image) and tracking the object, i.e., the tracking target, while performing a search to find a position of an image, which is the same as or similar to the template image, in repeatedly captured images, wherein the focus detection area at an in-screen position obtained by an image tracking result is selected and the photographic lens is driven to an in-focus position based on the defocus amount in the selected focus detection area.
- Japanese Unexamined Patent Application Publication No. 2005-215040 discloses such an autofocusing system. The contents of this publication are incorporated herein by reference in their entirety.
- However, when a photographer instructs the start of tracking of a shooting target, the photographic lens is not always focused on the target object. In the case of a lens having a long focal length, for example, even a slight difference between the setting distance set by the range ring of the lens and the actual shooting distance causes the shooting target to be photographed in a substantially blurred state. When the tracking is started in such a state, a blurred image of the shooting target is employed as the template image. As a result, the target object is lost during the tracking, or an object other than the target is falsely recognized as the tracking target, whereby tracking performance deteriorates.
- According to one aspect of the present invention, an image apparatus includes a screen, a detecting unit, a determining unit, an acquiring unit and a control unit. The screen is configured to display an image formed by an optical system and at least one detection position superimposed onto the image. The detecting unit is configured to detect information regarding focus of the optical system at the detection position. The determining unit is connected to the detecting unit and configured to determine whether the information regarding the focus is within a predetermined range. The acquiring unit is connected to the determining unit and configured to acquire first image information at the detection position when the determining unit determines that the information regarding the focus is within the predetermined range. The control unit is connected to the acquiring unit and configured to evaluate arbitrary second image information in the screen using the first image information.
- According to another aspect of the present invention, an evaluation method includes displaying in a screen an image formed by an optical system, displaying at least one detection position superimposed onto the image, detecting information regarding focus of the optical system at the at least one detection position, acquiring first image information at the at least one detection position when the detected information is within a predetermined range, acquiring arbitrary image information in the screen as second image information, and evaluating the second image information using the first image information.
- A more complete appreciation of the invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
- FIG. 1 is a schematic view showing the construction of an image-taking apparatus (single-lens reflex digital still camera) equipped with an object tracking device according to one embodiment of the present invention;
- FIG. 2 is a block diagram showing the detailed construction of a control device;
- FIG. 3 is a front view showing the detailed construction of a second image-sensing device;
- FIG. 4 illustrates the pixel arrangement of the second image-sensing device;
- FIG. 5 is an explanatory view for explaining an object tracking method according to the embodiment;
- FIGS. 6A and 6B are explanatory views for explaining the object tracking method according to the embodiment;
- FIG. 7 is an explanatory view for explaining the object tracking method according to the embodiment;
- FIGS. 8A and 8B are flowcharts showing an object tracking process according to the embodiment;
- FIG. 9 is a flowchart showing the object tracking process according to the embodiment;
- FIG. 10 is a flowchart showing the object tracking process according to the embodiment;
- FIGS. 11A and 11B are timing charts showing an autofocusing (AF) operation and an image tracking operation according to the embodiment; and
- FIGS. 12A to 12D are timing charts showing an autofocusing (AF) operation and an image tracking operation according to a modification.
- The embodiments will now be described with reference to the accompanying drawings, wherein like reference numerals designate corresponding or identical elements throughout the various drawings.
- One embodiment will be described below of an image-taking apparatus (single-lens reflex digital still camera) equipped with an image tracking device having an autofocusing (AF) function of detecting defocus amounts of a photographic lens in a plurality of focus detection areas set within a shooting screen and driving the photographic lens to an in-focus position based on the defocus amount in any one of the focus detection areas, and an image tracking function of storing an image of an object (subject), which is included in a captured image and is to be tracked, as a template image (reference image) and tracking the object, i.e., the tracking target, while performing a search to find a position of an image which is the same as or similar to the template image in repeatedly captured images (i.e., while performing template matching). The focus detection area at the in-screen position obtained by the image tracking result is selected, and the photographic lens is driven to an in-focus position based on the defocus amount in the selected focus detection area.
- FIG. 1 shows the construction of an image-taking apparatus (single-lens reflex digital still camera) 100 equipped with an object tracking device according to the embodiment. An interchangeable lens 11 is mounted to a camera body 10 in an interchangeable manner. The camera body 10 includes a first image-sensing device 12 for capturing an object image and recording the image. The first image-sensing device 12 can be constituted by, e.g., a CCD or a CMOS. At the time of shooting, a quick return mirror 13 is retracted to a position, indicated by solid lines, which is located outside a photographic optical path, and a shutter 14 is released so that the object image is focused by the photographic lens 15 on a light receiving surface of the first image-sensing device 12.
- A focus detection optical system 16 and a ranging device 17 for cooperatively detecting a focus adjusted state of the photographic lens 15 are disposed at the bottom of the camera body 10. The illustrated embodiment represents the case using a focus detection method with a pupil-division phase difference detection technique. Before the shooting, the quick return mirror 13 is set to a position, indicated by broken lines, which is located within the photographic optical path. Paired beams of focus detection light from the photographic lens 15 pass through a half-mirror portion of the quick return mirror 13, and after being reflected by a sub-mirror 18, they are introduced to the focus detection optical system 16 and the ranging device 17. The focus detection optical system 16 introduces the paired beams of focus detection light having entered through the photographic lens 15 to a light receiving surface of the ranging device 17, thereby focusing a pair of optical images thereon. The ranging device 17 includes a pair of CCD line sensors, for example, and outputs focus detection signals corresponding to the pair of optical images.
- A finder optical system is disposed at the top of the camera body 10. Before the shooting, since the quick return mirror 13 is set in the position, indicated by broken lines, which is located within the photographic optical path, light from the object, having entered through the photographic lens 15, is introduced to a focus plate 20, and an image of the object is focused on the focus plate 20. A liquid crystal display device 21 displays not only information, such as focus detection area marks, in a superimposing relation to the object image focused on the focus plate 20, but also various items of shooting information, such as an exposure value, at a position outside the object image. The object image on the focus plate 20 is introduced to an ocular window 24 through a pentagonal roof prism 22 and an ocular lens 23 such that the photographer can visually confirm the object image.
- Further, the finder optical system disposed at the top of the camera body 10 includes a second image-sensing device 25 for capturing the object image for the purposes of object tracking and photometry. The second image-sensing device 25 will be described in detail later. The object image focused on the focus plate 20 is refocused on a light receiving surface of the second image-sensing device 25 through the pentagonal roof prism 22, a prism 26, and an imaging lens 27. The second image-sensing device 25 outputs an image signal corresponding to the object image.
- In addition, the camera body 10 includes an operating member 28, a control device 29, and a lens driving device 30. The operating member 28 includes switches and selectors for operating the camera 100, such as a shutter button and a focus detection area select switch. The control device 29 is constituted by a CPU and peripheral components, and it executes various kinds of control and processing necessary for the camera 100. The lens driving device 30 is constituted by a motor and a drive circuit, and it performs focus adjustment of the photographic lens 15 through a lens driving mechanism (not shown). The interchangeable lens 11 includes a lens ROM 31 for storing lens information, such as the focal length, the full F-number and the aperture value (f-stop number) of the photographic lens 15, along with a conversion coefficient between an amount of image deviation and a defocus amount. The control device 29 can read the lens information from the lens ROM 31 through a contact (not shown) provided in a lens mount.
- FIG. 2 shows the detailed configuration of the control device 29. It is to be noted that control functions not directly related to the embodiment of the present invention are not shown, and a description of those control functions is omitted here. The control device 29 includes various control sections which are constituted by software installed in a microcomputer. A CCD control section 31 controls accumulation and read-out of charges in and from the second image-sensing device 25. An A/D conversion section 32 converts an analog image signal output from the second image-sensing device 25 to a digital image signal that represents image information. An exposure processing section 33 calculates an exposure value based on the image signal captured by the second image-sensing device 25.
- A focus detection processing section 34 detects a focus adjusted state, i.e., a defocus amount in this embodiment, of the photographic lens 15 based on the focus detection signals corresponding to the pair of optical images, which are output from the ranging device 17. Though described in detail later, the focus detection areas are set, as focus detection positions, at a plurality of positions within a shooting screen formed by the photographic lens 15. The ranging device 17 outputs the focus detection signals corresponding to the pair of optical images for each of the focus detection areas, and the focus detection processing section 34 detects the defocus amount based on the focus detection signals corresponding to the pair of optical images for each of the focus detection areas. A lens drive amount processing section 35 converts the detected defocus amount to a lens drive amount. In accordance with the lens drive amount, the lens driving device 30 drives a focusing lens (not shown) in the photographic lens 15 to perform the focus adjustment.
- A tracking control section 36 stores, as the template image, one of the object images captured by the second image-sensing device 25, which corresponds to a tracking target position manually designated by the photographer or a tracking target position automatically set by the camera 100, in a storage section 37 (described later). Also, the tracking control section 36 commands the focus detection processing section 34 to detect the defocus amount of the photographic lens 15 at the tracking target position. Further, the tracking control section 36 performs a search to find an image area, which is matched with or similar to the template image, in images repeatedly captured thereafter, thus recognizing the target position, and detects the defocus amount of the photographic lens 15 at the focus detection position corresponding to the image area which is matched with or similar to the template image. When the detected result is compared with the defocus amount previously detected in the tracking target area and the compared result does not indicate a significant change, focusing control is performed at the focus detection position corresponding to the found image area. By repeating the above-described procedures, a particular object is tracked. The storage section 37 stores not only information such as the template image under tracking by the tracking control section 36 and the defocus amounts, but also the lens information, such as the focal length, the full F-number and the aperture value (f-stop number) of the photographic lens 15, along with the conversion coefficient between an amount of image deviation and a defocus amount, which are read from the lens ROM 31 in the interchangeable lens 11.
- FIG. 3 is a front view showing the detailed construction of the second image-sensing device 25. The second image-sensing device 25 has a plurality of pixels (photoelectric transducers) 40 arrayed in a matrix pattern (16 in the horizontal direction × 12 in the vertical direction = 192 in the illustrated embodiment). As shown in FIG. 4, each pixel 40 is divided into three portions 40a, 40b and 40c. Primary color filters in red R, green G and blue B are disposed respectively in the three portions 40a, 40b and 40c. Therefore, each pixel 40 can output RGB signals of the object image.
- Next, an object tracking operation according to one embodiment will be described. FIGS. 5-7 are explanatory views for explaining the object tracking operation according to the embodiment, and FIGS. 8-10 are flowcharts showing an object tracking process according to the embodiment. FIGS. 11A and 11B are timing charts showing an automatic focusing (AF) operation and an image tracking operation. The control device 29 starts the object tracking operation when the photographer half-presses the shutter button of the operating member 28 after the photographer has manually designated the tracking target position within the object image captured by the second image-sensing device 25, or after the camera 100 has automatically set the tracking target position.
quick return mirror 13 is set in the position, indicated by broken lines inFIG. 1 , which is located within the photographic optical path, the light from the object, having entered through thephotographic lens 15, is focused on thefocus plate 20. The object image on thefocus plate 20 is introduced to the second image-sensingdevice 25 through thepentagonal roof prism 22, theprism 26, and theimaging lens 27. In such a manner, the object image is repeatedly output from the second image-sensingdevice 25. - The focus detection areas are set at a plurality of positions within the shooting screen formed by the
photographic lens 15, and the liquidcrystal display device 21 displays area marks in a superimposing relation to the object image on thefocus plate 20. In the illustrated embodiment, as shown inFIG. 5 , focusdetection areas 45 a-45 k are set at eleven positions within the shooting screen. Further, when an arbitrary area is selected by the focus detection area select switch of the operatingmember 28, the mark corresponding to the selected area is displayed in an illuminated state. - Assuming, for example, that the
focus detection area 45 g corresponding to the image of the object as the tracking target is selected instep 1 ofFIG. 8A by the focus detection area select switch of the operatingmember 28, and the mark corresponding to thefocus detection area 45 g is illuminated (indicated by the solid black mark in the illustrated embodiment). When the shutter button of the operatingmember 28 is half-pressed in such a state, the object to be tracked can be designated. Also, thecontrol device 29 starts the object tracking process in response to the half-pressing operation. - Instead of manually designating the tracking target object by the photographer, the
camera 100 may automatically set the tracking target object. For example, the automatic setting can be performed by using an object face recognition technique through the steps of detecting a face portion of the object from the captured image and setting the detected face portion as the tracking target, or by using a method of setting, as the tracking target, the object in the focus detection area which indicates the smallest one of the defocus amounts detected in the plurality of focus detection areas within the shooting screen. - In
step 2, the pair of optical images corresponding to the selected focus detection area (45 g in this embodiment) are obtained by the rangingdevice 17, and the focus adjusted state, i.e., the defocus amount, of thephotographic lens 15 in the selected focus detection area (45 g) is detected based on the pair of optical images. Instep 3, the defocus amount detected in the selected focus detection area (45 g) is converted to a lens drive amount, and thephotographic lens 15 is driven by thelens driving device 30 for focus adjustment. Innext step 4, the pair of optical images corresponding to the selected focus detection area (45 g) are obtained by the rangingdevice 17, and the defocus amount of thephotographic lens 15 in the selected focus detection area (45 g) is detected based on the pair of optical images. - In
step 5, it is determined whether thephotographic lens 15 is in focus with the tracking target object in the selected focus detection area (45 g). Stated another way, if the defocus amount detected in the selected focus detection area (45 g) after the focus adjustment is within a preset threshold for making the in-focus determination, it is determined that thephotographic lens 15 is in focus with the tracking target object in the selected focus detection area (45 g). - The threshold for making the in-focus determination may be set to the same value as that of a threshold used in the automatic focusing (AF) function. Alternatively, if the template image (reference image) used in the image tracking function is obtained as an image which is sharp at such a level as ensuring the predetermined tracking performance, the threshold for making the in-focus determination may be set to have an absolute value larger than that of the threshold used in the AF function so as to provide a wider range in which the in-focus is determined.
- In the case of the
photographic lens 15 having a longer focal length, if there is just a slight difference between a setting distance set by a range ring for the lens and an actual shooting distance, the shooting target is photographed in a significantly blurred state. When the tracking is started in such a state, a blurred image of the shooting target is employed as the template image. To avoid such a problem, corresponding to the focal length of thephotographic lens 15, obtained from thelens ROM 31, which has a larger value, the threshold may be set to a smaller value so that a sharper image of the tracking target is obtained. - Further, an optimum threshold may be set depending on a selected shooting mode. For example, when a sport shooting mode is selected to take a sport photograph of a quickly moving object, a photographed image is more apt to blur due to shake of the camera body held by the hand and motion of the object itself. In such a case, therefore, the threshold may be set to a smaller value than that set when a normal shooting mode is selected, to thereby provide an image being in focus with the tracking object.
- If it is determined in
step 5 that the defocus amount in the selected focus detection area (45 g) is larger than the preset threshold for making the in-focus determination and thephotographic lens 15 is not in focus with the tracking target object, the control flow returns to step 3 to repeat the above-described focus detection and focus adjustment. If it is determined that thephotographic lens 15 is in focus with the tracking target object, the control flow proceeds to step 6 in which the liquidcrystal display device 21 displays the fact that the designated tracking target object is in focus, and in which voice guide is provided through a speaker (not shown). The process ofstep 6 for displaying the in-focus of the tracking target object and providing the voice guide may be dispensed with. - In
step 7, a tracking initial image is obtained from the second image-sensingdevice 25. Innext step 8, an initial process for the tracking control, shown inFIG. 9 , is executed. In step 101 ofFIG. 9 , an image at a position corresponding to the position of the selectedfocus detection area 45 g in the tracking initial image, which is obtained from the second image-sensingdevice 25, is stored as object color information. In step 102, in peripheral part around the selected focus detection area (45 g), a same-color information region having similar color information to the object color information is detected as shown inFIG. 6( a). Innext step 103, the same-color information region is set as an initialtracking object region 47. In step 104, an image of thetracking object region 47 within the initial tracking image is stored in thestorage section 37 as a template image 48 (seeFIG. 6( b)) used in the image tracking process. Instep 105, a region obtained by enlarging thetracking object region 47 in size corresponding to the number of predetermined pixels (two in the illustrated embodiment) in the up-and-down directions and the right-and-left directions with theregion 47 being at a center is set as asearch region 49. Thereafter, the control flow returns to step 9 ofFIG. 8A . - While the illustrated embodiment represents an example in which the
tracking target region 47 is decided based on the object color information, thetracking object region 47 may be decided based on brightness information. Also, the region size may be decided as a constant size of 4×4 pixels for simplification of the process, or may be decided depending on the focal length information of thephotographic lens 15. - In step 9 of
FIG. 8A after the initial process for the tracking control, it is confirmed whether the shutter button of the operatingmember 28 has been fully pressed, i.e., whether the shutter release operation has been performed. If the shutter release operation has been performed, the control flow proceeds to step 16 in which themirror 13 is lifted up and theshutter 14 is released, thereby shooting an image by the first image-sensingdevice 12. On the other hand, if the shutter release operation is not performed, the control flow proceeds to step 10 in which next image information is obtained from the second image-sensingdevice 25 and the pair of optical images for the focus detection are obtained by the rangingdevice 17 for each of thefocus detection areas 45 a-45 k. - In
step 11, a tracking computation process shown inFIG. 10 is executed. In step 201 ofFIG. 10 , a region having the same size as thetemplate image 48 is successively cut out from thesearch region 49 in a tracking next image. A difference in image information between the cut-out image and thetemplate image 48 per corresponding pixel is calculated and the total sum of the differences for all the pixels is obtained. At that time, for example, when the difference in image information is provided by differences between the hues B and G and between the hues R and G for each pixel, the differences can be calculated by using the raw image signals output from the second image-sensingdevice 25 and pre-processing, such as white balancing and filtering, for the raw image signals is not required, whereby the tracking process can be simplified. Instead of the hue difference, a color difference per pixel may be calculated. When thetracking object region 47 is decided based on brightness information, a brightness difference per pixel may be calculated. - When the total sum of the differentials between each of all the images cut out from the
search region 49 and the template image 48 has been calculated, one of the cut-out images is selected which has a minimum total sum value of the differentials among the calculation results, and the region of the selected cut-out image is decided as a new tracking object region 47, thus recognizing the target position. In step 203, a region obtained by enlarging the new tracking object region 47 in size corresponding to the number of predetermined pixels (two in the illustrated embodiment) in the up-and-down directions and the right-and-left directions with the region 47 being at a center is set as a new search region 49. Thereafter, the control flow returns to step 12 of FIG. 8B. A region set as the search region 49 may be an arbitrary region within the screen of the second image-sensing device 25. Such a region is preferably larger than the tracking object region 47 and, more preferably, it includes the focus detection area 45g in the tracking object region 47. - The control flow may additionally include a process of updating the image information of the
template image 48 by using the image information of the newly decided tracking object region 47. In such a case, by adding 20% of the image information of the new tracking object region 47 to 80% of the image information of the template image 48, for example, the image information of the template image 48 is updated little by little with the latest image information, such that the tracking operation is able to more easily follow changes of the tracking object. - The foregoing embodiment has been described in connection with the case where the template image in the designated tracking object region is stored, the differentials in the image information per pixel with respect to the template image are calculated for regions in each of the repeatedly captured images, and the region having the minimum total sum value of the differentials is set as the tracking object region. However, the tracking computation method may be modified as follows. Image information in the surroundings of the tracking object region is stored along with the image information of the designated tracking object region. From each of the repeatedly captured images, a plurality of candidate regions for a new tracking object region are detected which have relatively small total sum values of the differentials with respect to the template image. Then, image information in the surroundings of each of the candidate regions is successively compared with the image information in the surroundings of the previous tracking object region, and the candidate region having a maximum coincidence is set as the new tracking object region. This modified method can prevent erroneous detection of the tracking target object and can improve the accuracy in tracking the object.
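As a concrete illustration, the matching computation of steps 201-203 and the optional 80/20 template update can be sketched as below. This is a minimal NumPy sketch, not the patent's implementation: the function names are hypothetical, and the sum of absolute differences is used as one way to realize the "total sum of the differences" described above.

```python
import numpy as np

def track_step(template, search_region):
    """Steps 201-202 sketch: slide a template-sized window over the search
    region, sum the per-pixel absolute differences against the template, and
    return the offset of the window with the minimum total sum."""
    th, tw = template.shape[:2]
    sh, sw = search_region.shape[:2]
    best_sum, best_offset = None, None
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            window = search_region[r:r + th, c:c + tw].astype(int)
            total = np.abs(window - template.astype(int)).sum()
            if best_sum is None or total < best_sum:
                best_sum, best_offset = total, (r, c)
    return best_offset, best_sum

def update_template(template, new_region, alpha=0.2):
    """Optional update: blend 20% of the new tracking object region into 80%
    of the stored template so it slowly follows appearance changes."""
    return (1.0 - alpha) * template.astype(float) + alpha * new_region.astype(float)
```

With a 4×4 template and a search region grown by two pixels on each side, as in the embodiment, the loops visit a 5×5 grid of candidate offsets.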
- In
step 12 of FIG. 8B after the return, the focus adjusted state, i.e., the defocus amount, of the photographic lens 15 in each of the set focus detection areas 45a-45k is detected based on the pair of optical images which have been obtained in step 10. In next step 13, based on the defocus amounts of all the focus detection areas, a search is performed to find focus detection areas having defocus amounts close to the defocus amount of the focus detection area which has been employed in the previous focus adjustment. Herein, as shown in FIG. 7, it is assumed, by way of example, that the focus detection areas - In
step 14, an area where the focus adjustment is to be performed is decided based on both the new tracking object region obtained as the image tracking result in step 11 and the focus detection areas obtained as the area search result in steps 12-13. Comparing the focus detection area 45f corresponding to the new tracking object region 47 obtained as the image tracking result with the focus detection areas obtained as the area search result shown in FIG. 7, the focus detection area 45f, for example, is considered to be suitable as a position of the tracking object in either process, and hence the area 45f is decided as an area where the focus adjustment is to be performed (i.e., a focus adjustment area). In step 15, the defocus amount detected in the focus detection area 45f is converted to a lens drive amount, and the photographic lens 15 is driven by the lens driving device 30 for the focus adjustment. - While the focus adjustment area is decided in
step 14 based on the new tracking object region obtained as the image tracking result and the focus detection area found by the search made depending on the AF result, the processing in step 14 may be modified as follows. Whether the tracking target object is moving closer to or away from the camera in the direction of an optical axis of the camera is determined based on the difference between the defocus amount in the previous focus adjustment and the currently detected defocus amount in the focus detection area which has been found by the search. If it is determined that the tracking target object is moving closer to the camera, the template image is enlarged. To the contrary, if it is determined that the tracking target object is moving away from the camera, the template image is reduced. Such a modification can provide an appropriate template image better adapted to motions of the tracking target object. - During a period in which the shutter button is half-pressed, the processing of steps 10-15 is repeatedly executed. If the shutter button is fully pressed, the control flow proceeds to step 16 in which the above-described shooting process is executed.
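The area search of step 13, the reconciliation of step 14, and the optional template resizing can be sketched as follows. The tolerance value, the fallback policy when the tracking area is not among the candidates, and the defocus sign convention are all assumptions for illustration; the patent does not specify them.

```python
import numpy as np

def find_candidate_areas(defocus_by_area, prev_defocus, tolerance=0.1):
    """Step 13 sketch: keep the focus detection areas whose current defocus
    amount is close to the defocus used in the previous focus adjustment."""
    return {a for a, d in defocus_by_area.items() if abs(d - prev_defocus) <= tolerance}

def decide_focus_area(tracking_area, candidate_areas):
    """Step 14 sketch: when the area indicated by the image tracking result is
    also among the defocus search candidates, use it; otherwise fall back to
    an arbitrary candidate (fallback policy is an assumption)."""
    if tracking_area in candidate_areas:
        return tracking_area
    return min(candidate_areas) if candidate_areas else tracking_area

def rescale_template(template, prev_defocus, curr_defocus, step=2):
    """Optional modification: enlarge the template when the target is judged
    to be approaching, shrink it when receding (sign convention assumed)."""
    h, w = template.shape[:2]
    if curr_defocus < prev_defocus:        # assumed: target moving closer
        h, w = h + step, w + step
    elif curr_defocus > prev_defocus:      # assumed: target moving away
        h, w = max(2, h - step), max(2, w - step)
    rows = np.arange(h) * template.shape[0] // h   # nearest-neighbour resize
    cols = np.arange(w) * template.shape[1] // w
    return template[np.ix_(rows, cols)]
```

In the FIG. 7 example of the text, the tracking result and the defocus search both point at area 45f, so `decide_focus_area` would return it directly.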
- Next, the automatic focusing (AF) operation and the image tracking operation according to the embodiment will be described with reference to
FIGS. 11A and 11B. As described above, when the shutter button is half-pressed, the object tracking process shown in FIGS. 8A and 8B is started and the AF operation is first executed. In the AF operation, as shown in FIGS. 11A and 11B, charge accumulation in the ranging device 17 (i.e., AF accumulation) is performed and the focus detecting computation (i.e., AF computation) is executed based on a focus detection signal obtained as the AF accumulation result. The focus adjustment of the photographic lens 15 (i.e., lens driving) is performed based on the AF computation result, i.e., the defocus amount, in the selected focus detection area. In the example shown in FIGS. 11A and 11B, the AF processing of steps 2-4 in FIG. 8A is repeated three times. If in-focus with the tracking target object is determined in step 5 in the third cycle of the AF processing, the image tracking operation is started. Incidentally, as shown in FIGS. 11A and 11B, when the threshold for making the in-focus determination in step 5 is larger than the threshold used in the AF function, an in-focus determination signal and a lens driving signal are both output. - In the image tracking operation, as shown in
FIGS. 11A and 11B, a tracking initial image is first accumulated in the second image-sensing device 25 (i.e., image accumulation; step 7 of FIG. 8A). After obtaining the tracking initial image, the tracking initial process is executed (i.e., tracking initial process; step 8 of FIG. 8A). Then, a tracking next image is accumulated in the second image-sensing device 25 (i.e., image accumulation; step 10 of FIG. 8B). After obtaining the tracking next image, the tracking computation process is executed (i.e., tracking computation; step 11 of FIG. 8B). - After the image tracking operation described above, the AF operation is executed again with the image tracking result reflected thereon. More specifically, the defocus amount is detected for each of the focus detection areas, and the area having the defocus amount close to that used in the previous focus adjustment is searched for. The focus adjustment area is decided based on the area search result and the image tracking result. Then, the focus adjustment is performed based on the defocus amount of the decided focus adjustment area. Thereafter, during a period in which the shutter button is held half-pressed, the image tracking operation and the AF operation based on the image tracking result are executed repeatedly.
- According to the embodiment, as described above, in the image tracking device having the image tracking function of comparing the object images, which are formed by the
photographic lens 15 and are repeatedly captured by the second image-sensing device 25, with the template image and detecting the position of the tracking target in the screen of the photographic lens 15, and the autofocusing (AF) function of detecting the focus of the photographic lens 15 in the focus detection areas set within the screen and performing the focus adjustment of the photographic lens 15 based on the focus detection result, it is determined whether or not the focus detection result of the AF function is in focus, and the focus detection and the focus adjustment based on the AF function are performed in the focus detection area corresponding to the position of the tracking target in the screen. The image at the position of the tracking target in the object image, which is captured when the in-focus state is determined, is obtained as the template image. Therefore, a sharp target image can be obtained as the reference image and the tracking performance can be improved. - Also, according to the embodiment, since the template image is obtained from the image after the in-focus determination, the tracking process can be efficiently executed while keeping the image tracking operation synchronized with the autofocusing (AF) function.
- While the above-described embodiment represents the example in which the charge accumulation in the ranging
device 17 for the AF operation and the charge accumulation in the second image-sensing device 25 for the image tracking operation are successively executed in a synchronous relation as shown in FIGS. 11A and 11B, the charge accumulation in the ranging device 17 for the AF operation and the charge accumulation in the second image-sensing device 25 for the image tracking operation may be executed in an asynchronous relation.
-
FIGS. 12A, 12B, 12C and 12D are timing charts showing the autofocusing (AF) operation and the image tracking operation according to a modification of the embodiment, in which the charge accumulation in the ranging device 17 for the AF operation and the charge accumulation in the second image-sensing device 25 for the image tracking operation are executed in an asynchronous relation. Note that the following description focuses primarily on the different points, while the description of the points similar to those of the synchronous operation shown in FIGS. 11A and 11B is omitted. In the case of executing the AF operation and the image tracking operation in an asynchronous relation, during a period in which a power source of the camera is turned on, the charge accumulation for each of the AF operation and the image tracking operation is always performed at a predetermined time interval even when the computation process based on the charge accumulation result is not executed. - Referring to
FIGS. 12A to 12D, in an asynchronous operation (1), the tracking control section 36 executes the tracking initial process after the in-focus determination (after a lapse of time T1) by employing, as the tracking initial image, the tracking object image accumulated in the second image-sensing device 25. On the other hand, in an asynchronous operation (2), the tracking control section 36 executes the tracking initial process immediately after the in-focus determination by employing, as the tracking initial image, the tracking object image accumulated in the second image-sensing device 25 immediately before the in-focus determination (time T2 earlier). In the asynchronous operation (2), the time (T2) from the end of the charge accumulation immediately before the in-focus determination to the arrival of the signal sent upon the in-focus determination is shorter than the time (T1) from the in-focus determination to the start of the next charge accumulation. Stated another way, in the examples shown in FIGS. 12A to 12D, an image obtained as a result of the charge accumulation completed earlier than the in-focus determination by the time T2 is accumulated at a timing closer to the timing of the in-focus determination than an image obtained as a result of the charge accumulation started after the time T1 from the in-focus determination. Therefore, the image of the tracking object closer to the timing of in-focus with the tracking target can be obtained as the template image, and the tracking accuracy can be improved by using a sharp template image that is in focus with the tracking target. Thus, the tracking control section 36 sets T1 as a period from the time at which the signal is sent upon the in-focus determination to the time at which the image accumulation performed after the in-focus determination is started.
Also, the tracking control section 36 sets T2 as a period from the time at which the signal is sent upon the in-focus determination to the time at which the image accumulation performed before the in-focus determination is ended. Additionally, the tracking control section 36 may decide the timing closer to the in-focus determination based on the respective periods from the middle of the period during which the AF accumulation is performed to the start time, the end time and the middle time of the image accumulation. - According to the modification of the embodiment, as described above, since the template image is obtained from the object image captured at the timing closest to the timing of the in-focus determination, the tracking performance can be improved by using a sharper template image that is more in focus with the tracking target.
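The choice described above between the accumulation just before and just after the in-focus determination reduces to comparing T2 against T1. A minimal sketch, with hypothetical timestamps in seconds and an assumed function name:

```python
def pick_template_frame(t_in_focus, t_end_before, t_start_after):
    """Compare T2 (time from the end of the accumulation finished just before
    the in-focus determination) with T1 (time until the start of the
    accumulation begun just after it), and keep the image whose accumulation
    lies closer to the in-focus determination."""
    t2 = t_in_focus - t_end_before
    t1 = t_start_after - t_in_focus
    return "before" if t2 < t1 else "after"
```

The same comparison could instead use the midpoints of the accumulation periods, as the modification also suggests.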
- While a single-lens reflex digital still camera has been described in the foregoing embodiment, the present invention can be realized with any type of image-taking apparatus so long as it is able to capture images on a time-serial basis. Thus, besides the single-lens reflex digital still camera, the present invention is also applicable to a consumer digital camera, a video camera used as an image-taking apparatus that captures moving images, and so on.
- While the foregoing embodiment has been described as obtaining the template image after confirming the in-focus state with respect to the tracking target object when the template image of the tracking target object is obtained after the start of the image tracking, the present invention can also be applied to the case where the tracking target object is changed from one to another during the image tracking operation for some reason. When the tracking target object is changed from one to another during the image tracking operation, a new template image is obtained after confirming the in-focus state of a new tracking target object.
- It is obvious that the present invention is not limited to the above-described embodiment, and various modifications can be made without departing from the gist of the invention.
Claims (27)
1. An image apparatus comprising:
a screen configured to display an image formed by an optical system and at least one detection position superimposed onto the image;
a detecting unit configured to detect information regarding focus of the optical system at the at least one detection position;
a determining unit connected to the detecting unit and configured to determine whether the information regarding the focus is within a predetermined range;
an acquiring unit connected to the determining unit and configured to acquire first image information at the at least one detection position when the determining unit determines that the information regarding the focus is within the predetermined range; and
a control unit connected to the acquiring unit and configured to evaluate arbitrary second image information in the screen using the first image information.
2. The image apparatus according to claim 1, wherein the acquiring unit is configured to acquire the first image information from image information captured after the determining unit has determined that the information regarding the focus is within the predetermined range.
3. The image apparatus according to claim 1, wherein the acquiring unit is configured to acquire the first image information from image information captured at timing closest to timing at which the determining unit has determined that the information regarding the focus is within the predetermined range.
4. The image apparatus according to claim 1, further comprising:
an operating member configured to set the at least one detection position arbitrarily.
5. The image apparatus according to claim 1, further comprising:
a setting unit configured to set the at least one detection position included in the first image information based on a detection result of the detecting unit.
6. The image apparatus according to claim 1,
wherein the information regarding the focus comprises a defocus amount of the optical system, and
wherein the determining unit is configured to determine that the information regarding the focus is within the predetermined range when the defocus amount is equal to or smaller than a preset threshold value.
7. The image apparatus according to claim 6, wherein the threshold value is set to a smaller value as the optical system has a longer focal length.
8. The image apparatus according to claim 6, wherein the threshold value is set to be a reduced value when an image capturing mode is set at a sports capturing mode.
9. The image apparatus according to claim 1, wherein the second image information relates to a second area which is larger than a first area which relates to the first image information.
10. The image apparatus according to claim 1, wherein the second image information relates to an area including the at least one detection position.
11. The image apparatus according to claim 1, wherein the control unit is configured to evaluate the second image information by detecting in the second image information a region identical to or similar to the first image information.
12. The image apparatus according to claim 1, wherein the first image information includes image information of at least part of a main subject in the screen.
13. An image-taking apparatus comprising:
an image apparatus comprising:
a screen configured to display an image formed by an optical system and at least one detection position superimposed onto the image;
a detecting unit configured to detect information regarding focus of the optical system at the at least one detection position;
a determining unit connected to the detecting unit and configured to determine whether the information regarding the focus is within a predetermined range;
an acquiring unit connected to the determining unit and configured to acquire first image information at the at least one detection position when
the determining unit determines that the information regarding the focus is within the predetermined range; and
a control unit connected to the acquiring unit and configured to evaluate arbitrary second image information in the screen using the first image information.
14. An evaluation method comprising:
displaying in a screen an image formed by an optical system;
displaying at least one detection position superimposed onto the image;
detecting information regarding focus of the optical system at the at least one detection position;
acquiring first image information at the at least one detection position when the detected information is within a predetermined range;
acquiring arbitrary image information in the screen as second image information; and
evaluating the second image information using the first image information.
15. The evaluation method according to claim 14, wherein the first image information is acquired from image information captured after the detected information has been within the predetermined range.
16. The evaluation method according to claim 14, wherein the first image information is acquired from image information captured at timing closest to timing at which the detected information has been within the predetermined range.
17. The evaluation method according to claim 14, further comprising:
setting the at least one detection position arbitrarily.
18. The evaluation method according to claim 14, further comprising:
setting the at least one detection position included in the first image information based on the detected information.
19. The evaluation method according to claim 14,
wherein the information regarding the focus comprises a defocus amount of the optical system, and
wherein the detected information is within the predetermined range when the defocus amount is equal to or smaller than a preset threshold value.
20. The evaluation method according to claim 19, wherein the threshold value is set to a smaller value as the optical system has a longer focal length.
21. The evaluation method according to claim 19, wherein the threshold value is set to be a reduced value when an image capturing mode is set at a sports capturing mode.
22. The evaluation method according to claim 14, wherein the second image information is acquired after acquiring the first image information.
23. The evaluation method according to claim 14, wherein the second image information relates to a second area which is larger than a first area which relates to the first image information.
24. The evaluation method according to claim 14, wherein the second image information relates to an area including the at least one detection position.
25. The evaluation method according to claim 14, wherein the second image information is evaluated by detecting in the second image information a region identical to or similar to the first image information.
26. The evaluation method according to claim 14, wherein the first image information includes image information of at least part of a main subject in the screen.
27. An image apparatus comprising:
screen means for displaying an image formed by an optical system and at least one detection position superimposed onto the image;
detecting means for detecting information regarding focus of the optical system at the at least one detection position;
determining means for determining whether the information regarding the focus is within a predetermined range;
acquiring means for acquiring first image information at the at least one detection position when the determining means determines that the information regarding the focus is within the predetermined range; and
control means for evaluating arbitrary second image information in the screen using the first image information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/915,121 US8254773B2 (en) | 2007-05-28 | 2010-10-29 | Image tracking apparatus and tracking evaluation method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2007-140114 | 2007-05-28 | ||
JP2007140114A JP5157256B2 (en) | 2007-05-28 | 2007-05-28 | Image tracking device and imaging device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/915,121 Continuation US8254773B2 (en) | 2007-05-28 | 2010-10-29 | Image tracking apparatus and tracking evaluation method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090022486A1 true US20090022486A1 (en) | 2009-01-22 |
Family
ID=40167641
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/060,961 Abandoned US20090022486A1 (en) | 2007-05-28 | 2008-04-02 | Image apparatus and evaluation method |
US12/915,121 Active US8254773B2 (en) | 2007-05-28 | 2010-10-29 | Image tracking apparatus and tracking evaluation method |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/915,121 Active US8254773B2 (en) | 2007-05-28 | 2010-10-29 | Image tracking apparatus and tracking evaluation method |
Country Status (2)
Country | Link |
---|---|
US (2) | US20090022486A1 (en) |
JP (1) | JP5157256B2 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090185798A1 * | 2008-01-23 | 2009-07-23 | Nikon Corporation | Focus detection device, focus detection method, and camera |
US20110262123A1 (en) * | 2010-04-27 | 2011-10-27 | Canon Kabushiki Kaisha | Focus detection apparatus |
US20130120571A1 (en) * | 2011-11-14 | 2013-05-16 | Itx Security Co., Ltd. | Security camera and method for controlling auto-focusing of the same |
CN103581538A (en) * | 2012-07-25 | 2014-02-12 | 三星电子株式会社 | Digital photographing apparatus and method of controlling the digital photographing apparatus |
US20140307124A1 (en) * | 2011-12-28 | 2014-10-16 | Fujifilm Corporation | Imaging apparatus, control method of imaging apparatus, interchangeable lens and lens-interchangeable type imaging apparatus body |
US20150222806A1 (en) * | 2012-09-11 | 2015-08-06 | Sony Corporation | Imaging control device, imaging apparatus, and control method performed by imaging control device |
US20150350524A1 (en) * | 2013-01-09 | 2015-12-03 | Sony Corporation | Image processing device, image processing method, and program |
US20190020826A1 (en) * | 2017-07-13 | 2019-01-17 | Canon Kabushiki Kaisha | Control apparatus, image capturing apparatus, and non-transitory computer-readable storage medium |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5171468B2 (en) * | 2008-08-06 | 2013-03-27 | キヤノン株式会社 | IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD |
KR20100099008A (en) * | 2009-03-02 | 2010-09-10 | 삼성전자주식회사 | Auto focusing method and apparatus, and digital photographing apparatus using thereof |
JP5499531B2 (en) * | 2009-07-02 | 2014-05-21 | 株式会社ニコン | Electronic camera |
JP6148431B2 (en) * | 2010-12-28 | 2017-06-14 | キヤノン株式会社 | Imaging apparatus and control method thereof |
JP2012155044A (en) * | 2011-01-25 | 2012-08-16 | Sanyo Electric Co Ltd | Electronic camera |
JP5966249B2 (en) * | 2011-03-10 | 2016-08-10 | 株式会社ニコン | Image tracking device |
JP6742173B2 (en) * | 2016-06-30 | 2020-08-19 | キヤノン株式会社 | Focus adjusting device and method, and imaging device |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4903134A (en) * | 1986-11-17 | 1990-02-20 | Sanyo Electric Co., Ltd. | Automatic focusing circuit for automatically matching focus in response to video signal and zoom position |
US5061951A (en) * | 1988-05-13 | 1991-10-29 | Canon Kabushiki Kaisha | Auto focus device |
US5192966A (en) * | 1986-03-31 | 1993-03-09 | Minolta Camera Kabushiki Kaisha | Automatic focus control device |
US6308015B1 (en) * | 1999-06-18 | 2001-10-23 | Olympus Optical Co., Ltd. | Camera having automatic focusing device |
US20050162540A1 (en) * | 2004-01-27 | 2005-07-28 | Fujinon Corporation | Autofocus system |
US7003223B2 (en) * | 2003-02-24 | 2006-02-21 | Fujinon Corporation | Lens control system and focus information display apparatus |
US20070196091A1 (en) * | 2006-02-22 | 2007-08-23 | Pentax Corporation | Auto focus unit and digital camera |
US7515200B2 (en) * | 2001-08-10 | 2009-04-07 | Canon Kabushiki Kaisha | Image sensing apparatus, focus adjustment method, and focus adjustment computer control program |
US7519285B2 (en) * | 2005-01-24 | 2009-04-14 | Canon Kabushiki Kaisha | Imaging apparatus, imaging method, imaging program, and storage medium |
US7630623B2 * | 2006-07-28 | 2009-12-08 | Canon Kabushiki Kaisha | Optical apparatus, imaging apparatus, and control method |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0862486A (en) * | 1994-08-26 | 1996-03-08 | Nikon Corp | Automatic focusing device and automatic focusing method |
US5913079A (en) * | 1995-07-31 | 1999-06-15 | Canon Kabushiki Kaisha | Optical apparatus having a line of sight detection device |
JPH09297348A (en) * | 1996-05-09 | 1997-11-18 | Minolta Co Ltd | Camera |
JP2004289383A (en) * | 2003-03-20 | 2004-10-14 | Seiko Epson Corp | Image pickup device, image data generating device, image data processor, and image data processing program |
JP2005338352A (en) * | 2004-05-26 | 2005-12-08 | Fujinon Corp | Autofocus system |
JP4984491B2 (en) * | 2005-10-31 | 2012-07-25 | 株式会社ニコン | Focus detection apparatus and optical system |
- 2007-05-28: Priority application JP2007140114A filed in Japan; granted as patent JP5157256B2 (status: Expired - Fee Related)
- 2008-04-02: US application 12/060,961 filed (status: Abandoned)
- 2010-10-29: US application 12/915,121 filed (status: Active)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5192966A (en) * | 1986-03-31 | 1993-03-09 | Minolta Camera Kabushiki Kaisha | Automatic focus control device |
US4903134A (en) * | 1986-11-17 | 1990-02-20 | Sanyo Electric Co., Ltd. | Automatic focusing circuit for automatically matching focus in response to video signal and zoom position |
US5061951A (en) * | 1988-05-13 | 1991-10-29 | Canon Kabushiki Kaisha | Auto focus device |
US6308015B1 (en) * | 1999-06-18 | 2001-10-23 | Olympus Optical Co., Ltd. | Camera having automatic focusing device |
US7515200B2 (en) * | 2001-08-10 | 2009-04-07 | Canon Kabushiki Kaisha | Image sensing apparatus, focus adjustment method, and focus adjustment computer control program |
US7003223B2 (en) * | 2003-02-24 | 2006-02-21 | Fujinon Corporation | Lens control system and focus information display apparatus |
US20050162540A1 (en) * | 2004-01-27 | 2005-07-28 | Fujinon Corporation | Autofocus system |
US7519285B2 (en) * | 2005-01-24 | 2009-04-14 | Canon Kabushiki Kaisha | Imaging apparatus, imaging method, imaging program, and storage medium |
US20070196091A1 (en) * | 2006-02-22 | 2007-08-23 | Pentax Corporation | Auto focus unit and digital camera |
US7630623B2 * | 2006-07-28 | 2009-12-08 | Canon Kabushiki Kaisha | Optical apparatus, imaging apparatus, and control method |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8676051B2 (en) * | 2008-01-23 | 2014-03-18 | Nikon Corporation | Focus detection device, focus detection method, and camera |
US20090185798A1 * | 2008-01-23 | 2009-07-23 | Nikon Corporation | Focus detection device, focus detection method, and camera |
US20110262123A1 (en) * | 2010-04-27 | 2011-10-27 | Canon Kabushiki Kaisha | Focus detection apparatus |
US8369699B2 (en) * | 2010-04-27 | 2013-02-05 | Canon Kabushiki Kaisha | Focus detection apparatus |
US20130120571A1 (en) * | 2011-11-14 | 2013-05-16 | Itx Security Co., Ltd. | Security camera and method for controlling auto-focusing of the same |
US20140307124A1 (en) * | 2011-12-28 | 2014-10-16 | Fujifilm Corporation | Imaging apparatus, control method of imaging apparatus, interchangeable lens and lens-interchangeable type imaging apparatus body |
US9172887B2 (en) * | 2011-12-28 | 2015-10-27 | Fujifilm Corporation | Imaging apparatus, control method of imaging apparatus, interchangeable lens and lens-interchangeable type imaging apparatus body |
CN103581538A (en) * | 2012-07-25 | 2014-02-12 | 三星电子株式会社 | Digital photographing apparatus and method of controlling the digital photographing apparatus |
EP2690859A3 (en) * | 2012-07-25 | 2015-05-20 | Samsung Electronics Co., Ltd. | Digital photographing apparatus and method of controlling same |
US20150222806A1 (en) * | 2012-09-11 | 2015-08-06 | Sony Corporation | Imaging control device, imaging apparatus, and control method performed by imaging control device |
US9219856B2 (en) * | 2012-09-11 | 2015-12-22 | Sony Corporation | Imaging control device, imaging apparatus, and control method performed by imaging control device |
US20150350524A1 (en) * | 2013-01-09 | 2015-12-03 | Sony Corporation | Image processing device, image processing method, and program |
US9942460B2 (en) * | 2013-01-09 | 2018-04-10 | Sony Corporation | Image processing device, image processing method, and program |
US20190020826A1 (en) * | 2017-07-13 | 2019-01-17 | Canon Kabushiki Kaisha | Control apparatus, image capturing apparatus, and non-transitory computer-readable storage medium |
US10863079B2 (en) * | 2017-07-13 | 2020-12-08 | Canon Kabushiki Kaisha | Control apparatus, image capturing apparatus, and non-transitory computer-readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP5157256B2 (en) | 2013-03-06 |
JP2008292894A (en) | 2008-12-04 |
US8254773B2 (en) | 2012-08-28 |
US20110038624A1 (en) | 2011-02-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8254773B2 (en) | Image tracking apparatus and tracking evaluation method | |
US9639754B2 (en) | Subject tracking apparatus, camera having the subject tracking apparatus, and method for tracking subject | |
JP5176483B2 (en) | Image recognition device, image tracking device, and imaging device | |
JP5167750B2 (en) | Tracking device, imaging device, and tracking method |
JP5247076B2 (en) | Image tracking device, focus adjustment device, and imaging device | |
JP5288015B2 (en) | Image tracking device, image tracking method, and camera | |
JP4872834B2 (en) | Image recognition device, focus adjustment device, and imaging device | |
JP4893334B2 (en) | Image tracking device and imaging device | |
JP2009109839A (en) | Image tracking device and imaging device | |
JP2008046354A (en) | Object tracking device and camera | |
JP5056136B2 (en) | Image tracking device | |
JP5403111B2 (en) | Image tracking device | |
JP4888249B2 (en) | Focus detection apparatus and imaging apparatus | |
US8325977B2 (en) | Image recognition and focus adjustment using weighted image difference information | |
JP2010200138A (en) | Photographic subject tracking device | |
JP5233646B2 (en) | Image tracking device, imaging device, and image tracking method | |
JP5447579B2 (en) | Tracking device, focus adjustment device, and photographing device | |
JP2010054586A (en) | Focusing device and imaging apparatus | |
JP2012093775A (en) | Focus detector and imaging apparatus | |
JP5772933B2 (en) | Imaging device | |
JP4973369B2 (en) | Image processing apparatus and imaging apparatus | |
JP2010093320A (en) | Image tracking device and image tracking method | |
JP2018063439A (en) | Focus adjustment device and imaging device | |
JP2016028298A (en) | Focus adjustment device and imaging device | |
JP2014095915A (en) | Focus adjustment device and imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NIKON CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MURAMATSU, KEIKO;REEL/FRAME:021624/0642 Effective date: 20080507 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |