US20070263904A1 - Subject tracking device, subject tracking method, subject tracking program product and optical device - Google Patents
- Publication number
- US20070263904A1
- Authority
- US
- United States
- Prior art keywords
- tracking
- image
- subject
- area
- zone
- Prior art date
- Legal status
- Abandoned
Classifications
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/16—Special procedures for taking photographs; Apparatus therefor for photographing the track of moving objects
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B39/00—High-speed photography
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
Definitions
- the present invention relates to a subject tracking device, a subject tracking method, a subject tracking program product and an optical device, to be adopted when automatically tracking a subject that moves within an imaging field.
- a subject tracking device comprises: a tracking zone setting unit that sets an area where a main subject is present within a captured image as a tracking zone; a tracking unit that tracks the main subject based upon an image output corresponding to the tracking zone; and an arithmetic operation unit that determines through arithmetic operation image-capturing conditions based upon an image output corresponding to a central area within the tracking zone.
- in the subject tracking device according to the 1st embodiment, it is preferred that: there is further provided a converting unit that converts captured image data into reduced image data and outputs the converted image data; and the tracking unit tracks movement of the main subject by using the converted image data.
- the tracking zone setting unit sets an area corresponding to a subject present at a center of the captured image as the tracking zone.
- in the subject tracking device according to the 1st embodiment, it is preferred that there is further provided a display unit that displays within the captured image a mark indicating an area corresponding to an area constituting part of the tracking zone.
- an optical device comprises a subject tracking device according to the 1st embodiment.
- the subject tracking device further comprises a display unit that displays within the captured image a mark indicating an area corresponding to an area constituting part of the tracking zone; a focus detection area indicating a position at which focus detection is executed for the subject is set in the captured image; and the mark indicates the focus detection area present within the tracking zone.
- the tracking zone setting unit sets an area corresponding to the focus detection area having been selected as the tracking zone.
- a subject tracking method comprises: setting an area where a main subject is present within a captured image as a tracking zone; tracking the main subject based upon an image output corresponding to the tracking zone; and determining through arithmetic operation image-capturing conditions based upon an image output corresponding to a central area within the tracking zone.
- a computer-readable computer program product having contained therein a subject tracking program, and the subject tracking program comprises: processing for tracking a main subject based upon an image output corresponding to a tracking zone; and processing for determining through arithmetic operation image-capturing conditions based upon an image output corresponding to a central area within the tracking zone.
- a subject tracking device comprises: a tracking unit that designates image data in an area where a subject is present within a first image as a first tracking zone and tracks the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and an arithmetic operation unit that determines through arithmetic operation image-capturing conditions by using information corresponding to a central area within the second tracking zone.
- the arithmetic operation unit executes exposure calculation by using image data over a central area of the second tracking zone.
- the arithmetic operation unit includes a focus detection calculation unit that executes focus detection calculation for a focus detection area closest to a center of the second tracking zone among a plurality of focus detection areas.
- the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using an image output corresponding to an area near a central area of the tracking zone.
- in the subject tracking device according to the 13th embodiment, it is preferred that there is further provided a display unit that displays a mark indicating a central area of the tracking zone.
- the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using information corresponding to an area near a central area of the second tracking zone.
- in the subject tracking device according to the 15th embodiment, it is preferred that there is further provided a display unit that displays a mark indicating a central area within the second tracking zone.
- a subject tracking device comprises: a tracking unit that designates image data in an area where a subject is present within a first image as a first tracking zone and tracks the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and an arithmetic operation unit that determines through arithmetic operation image-capturing conditions by using at least either information corresponding to a central area within the second tracking zone or information corresponding to an area near a central area of the second tracking zone.
- the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using information indicating a higher level of contrast, selected from the information corresponding to a central area of the second tracking zone and the information corresponding to the area near a central area of the second tracking zone.
- the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by weighting a central area of the second tracking zone.
- a subject tracking device comprises: a tracking unit that designates image data in an area where a subject is present within a first image as a first tracking zone and tracks the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and an arithmetic operation unit that determines through arithmetic operation image-capturing conditions by using information corresponding to a central area within the second tracking zone and information corresponding to a direction along which the subject moves during a period elapsing between a time point at which the first image is captured and a time point at which the second image is captured.
- the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using information corresponding to an area near a central area of the second tracking zone.
- a display unit that displays a mark indicating a central area within the second tracking zone.
- a subject tracking method comprises: designating image data in an area where a subject is present within a first image as a first tracking zone; tracking the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and determining through arithmetic operation image-capturing conditions by using information corresponding to a central area within the second tracking zone.
- exposure calculation is executed by using image data over a central area of the second tracking zone.
- focus detection calculation is executed in correspondence to a focus detection area closest to a center of the second tracking zone among a plurality of focus detection areas.
- the image-capturing conditions are determined through arithmetic operation executed by using information corresponding to an area near a central area of the second tracking zone.
- the image-capturing conditions are determined through arithmetic operation executed by using information corresponding to a direction along which the subject moves during a period elapsing between a time point at which the first image is captured and a time point at which the second image is captured.
- an optical apparatus comprises a subject tracking device according to the 1st embodiment.
- FIG. 1 illustrates the essential structure adopted in the digital camera achieved in an embodiment of the present invention
- FIG. 2 shows the positions at which the focus detection areas are indicated on the viewfinder of the digital camera achieved in the embodiment
- FIG. 3 is a conceptual diagram of the various areas used in the photographic subject tracking control in the embodiment.
- FIGS. 4A-4C illustrate the subject tracking method adopted in the embodiment
- FIG. 5 presents a flowchart of the photographic subject tracking processing operation executed in the embodiment
- FIG. 6 shows the focus detection area set at the center of the tracking zone
- FIG. 7 shows the image screen with a great number of focus detection areas set therein.
- FIG. 8 shows how the program may be provided to the digital camera.
- FIG. 1 shows the essential structure adopted in the digital camera according to the present invention.
- a lens barrel 20 , which includes a photographic lens 21 , is exchangeably mounted at a camera body 10 .
- a first image sensor 12 used to capture an image of a photographic subject is installed in the camera body 10 .
- the first image sensor 12 may be a CCD or a CMOS.
- a quick-return mirror 11 that reflects subject light having passed through the photographic lens 21 toward a viewfinder optical system is disposed between the photographic lens 21 and the first image sensor 12 . Some of the subject light is transmitted through a semi-transmissive area of the quick return mirror 11 , is reflected downward at a sub mirror 111 and enters an AF sensor module 112 adopting the phase detection system.
- the phase detection type AF sensor module 112 may include, for instance, a focus detection optical system that splits a focus detection light flux into a pair of focus detection optical images and a plurality of pairs of CCD line sensors that each output focus detection signals corresponding to the pair of split optical images having entered therein.
- the focus detection signals output from the plurality of pairs of CCD line sensors are input to a control unit 30 , which, in turn, outputs a lens drive signal in response to which a focus adjustment lens is driven to the focus match position, as explained later, by using the focus detection signals output from a specific pair of CCD line sensors among the plurality of pairs of CCD line sensors.
- the subject light reflected at the quick return mirror 11 forms an image on a focusing screen 13 disposed at a position optically equivalent to the position of the first image sensor 12 .
- the subject image formed on the focusing screen 13 can be observed by the photographer via a pentaprism 14 and an eyepiece lens 15 , and it also passes through a prism 17 and an image forming lens 18 from the pentaprism 14 to form an image on a light-receiving surface of a second image sensor 19 .
- the second image sensor 19 is constituted with pixels, the number of which is smaller than the number of pixels constituting the first image sensor 12 , and it may be a CCD with 640 × 480 pixels disposed in an RGB Bayer array.
- the quick return mirror 11 rotates to the position indicated by the dotted line in the figure to allow the subject light to form an image on the first image sensor 12 .
- Image data obtained at the second image sensor 19 are input to the control unit 30 .
- the control unit 30 comprises a CPU, a ROM, a RAM and various peripheral circuits. Its functional blocks include: an image converting unit 31 that converts and reduces the image data obtained at the second image sensor 19 into image data of a predetermined image size, e.g., 64 × 48 pixels; a photographic subject tracking control unit 32 that tracks and follows a main photographic subject; an exposure calculation unit 33 that calculates the optimal exposure for the main photographic subject; a focus detection calculation unit 34 that executes focus detection calculation; a lens drive quantity calculation unit 35 that calculates a drive quantity indicating the extent to which the photographic lens 21 needs to be driven; and an ON/OFF control unit 36 that controls the ON/OFF state of an AF area mark indicating a focus detection area 40 at a display unit 361 , as explained later.
- the AF area mark can be displayed via a liquid crystal display panel 22 disposed near the focusing screen 13 .
- the focus detection calculation unit 34 determines through an arithmetic operation the focus adjustment state indicated by the defocused quantity and the like based upon the focus detection signals output from the pair of CCD line sensors corresponding to the focus detection area 40 selected by the photographer by operating a focusing area selecting operation unit 341 .
- seven focus detection areas 40 a - 40 g may be set within the photographic field (the imaging field or image-capturing field) in the camera achieved in the embodiment.
- the focus detection calculation unit 34 determines through arithmetic operation the focus adjustment state (focus adjustment quantity) in correspondence to the selected focus detection area 40 .
- the area selecting operation unit 341 outputs a selected focus detection area signal, indicating the specific focus detection area having been selected among the seven focus detection areas 40 a - 40 g , to the photographic subject tracking control unit 32 .
- the focus adjustment quantity calculated in the focus detection calculation unit 34 is output to the lens drive quantity calculation unit 35 .
- the lens drive quantity calculation unit 35 calculates the lens drive quantity based upon the focus adjustment quantity input thereto and outputs the resulting lens drive signal to a lens drive motor 351 .
- the lens drive motor 351 drives the photographic lens 21 along the optical axis in response to the lens drive signal, thereby adjusting the focus adjustment state.
- based upon the selected focus detection area signal input thereto, the photographic subject tracking control unit 32 extracts the image data corresponding to a tracking zone from the image data having been converted by the image converting unit 31 and executes subject tracking calculation based upon the image data in the tracking zone.
- a tracking zone 41 and a tracking calculation zone 43 ranging over a greater area than the tracking zone 41 are set on an initial image 44 obtained via the second image sensor 19 and the tracking zone 41 is designated as a template image 42 , as shown in FIG. 3 .
- the tracking zone 41 should range over a predetermined area, e.g., a 4 × 4 pixel area, around the focus detection area 40 recognized based upon the selected focus detection area signal. It is to be noted that a range with a hue identical to that of the focus detection area 40 may be designated as the tracking zone 41 .
- the photographic subject tracking control is executed by preparing a cutout area, which is equal to the template image 42 in size, within the tracking calculation zone 43 in an image obtained successively via the second image sensor 19 and by calculating the differences between the R (red), G (green) and B (blue) colors in the cutout area and the R (red), G (green) and B (blue) colors in the template image 42 .
- it is assumed that the size of the tracking zone 41 is selected as a default setting and that the photographer does not set the size of the tracking zone 41 .
- the control unit 30 utilizes in calculation the reduced image 44 (64 × 48 pixels), for example, into which the image (640 × 480 pixels) obtained by the second image sensor 19 has been converted and reduced.
- FIG. 3 shows the tracking calculation zone 43 (8 × 8 pixels, for example) over the reduced image 44 (64 × 48 pixels) and the tracking zone 41 (4 × 4 pixels, for example) over the reduced image 44 (64 × 48 pixels). It is to be noted that FIG. 3 presents a conceptual diagram and that the proportional relationships among the sizes of the various areas of the actual product may not match those shown in FIG. 3 .
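The reduction performed by the image converting unit 31 can be sketched as simple block averaging. The function name, the row-major list-of-(r, g, b)-tuples representation and the integer averaging are assumptions for illustration, not details taken from the patent:

```python
# Hypothetical sketch of the image converting unit 31: reduce a
# src_w x src_h RGB image to dst_w x dst_h by averaging each block
# of (src_w // dst_w) x (src_h // dst_h) source pixels.
def reduce_image(pixels, src_w, src_h, dst_w, dst_h):
    """pixels: row-major list of (r, g, b) tuples, len == src_w * src_h."""
    bx, by = src_w // dst_w, src_h // dst_h  # block size, e.g. 10 x 10
    out = []
    for y in range(dst_h):
        for x in range(dst_w):
            r = g = b = 0
            for dy in range(by):
                for dx in range(bx):
                    pr, pg, pb = pixels[(y * by + dy) * src_w + (x * bx + dx)]
                    r += pr; g += pg; b += pb
            n = bx * by
            out.append((r // n, g // n, b // n))
    return out
```

With the sizes described above (640 × 480 down to 64 × 48), each output pixel averages a 10 × 10 block of source pixels.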
- the photographic subject tracking control method is explained in detail with reference to FIGS. 4A-4C. It is assumed that the photographer has selected the focus detection area 40 d by operating the focusing area selecting operation unit 341 . Under such circumstances, the photographic subject tracking control unit 32 regards a subject present at the focus detection area 40 d as the main photographic subject and designates a relatively wide range containing the focus detection area 40 d as a tracking zone 41 a , as shown in FIG. 4A .
- the image data in the tracking zone 41 a are stored into a predetermined storage area within the photographic subject tracking control unit 32 as the template image 42 .
- the ON/OFF control unit 36 issues an instruction for the display unit 361 to turn on the focus detection area 40 d based upon the selected focus detection area signal provided from the area selecting operation unit 341 .
- FIG. 4B shows an image 44 b obtained in time sequence at the second image sensor 19 to follow the image 44 a .
- the photographic subject tracking control unit 32 selects a range greater than the initial tracking zone 41 a as a tracking calculation zone 43 b .
- An area inside the tracking calculation zone 43 b , which is equal in size to the tracking zone 41 a , is extracted as a cut-out area and the cut-out position is sequentially displaced.
- the differences between the R (red), G (green) and B (blue) colors in the individual cut-out areas and the R (red), G (green) and B (blue) colors in the template image 42 are calculated and the cut-out area indicating the smallest value representing the sum of the differences is judged to be the area most similar to the template image 42 .
- This cut-out area is then designated as a new tracking zone 41 b in the image 44 b.
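The matching step described above, i.e. displacing a cutout equal in size to the template across the tracking calculation zone and keeping the position whose summed RGB difference is smallest, might be sketched as follows; the flat list representation and all names are illustrative assumptions, not the patent's implementation:

```python
# Slide a cutout window over the search region and return the top-left
# position whose sum of absolute RGB differences against the template
# is smallest (the area judged most similar to the template image 42).
def find_tracking_zone(image, img_w, template, tmpl_w, search):
    """image: row-major list of (r, g, b); search: (x0, y0, x1, y1)
    range of candidate top-left corners. Returns the best (x, y)."""
    tmpl_h = len(template) // tmpl_w
    x0, y0, x1, y1 = search
    best, best_pos = None, None
    for y in range(y0, y1 + 1):
        for x in range(x0, x1 + 1):
            diff = 0
            for ty in range(tmpl_h):
                for tx in range(tmpl_w):
                    pr, pg, pb = image[(y + ty) * img_w + (x + tx)]
                    tr, tg, tb = template[ty * tmpl_w + tx]
                    diff += abs(pr - tr) + abs(pg - tg) + abs(pb - tb)
            if best is None or diff < best:
                best, best_pos = diff, (x, y)
    return best_pos
```

The winning position then becomes the new tracking zone for the current image, exactly as the cut-out area with the smallest difference sum is designated as the new tracking zone 41 b above.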
- the photographic subject tracking control unit 32 outputs information indicating the range of the tracking zone 41 b and the coordinates of the center of the tracking zone 41 b to the focus detection calculation unit 34 and the exposure calculation unit 33 .
- the focus detection calculation unit 34 determines that the main photographic subject is present at the focus detection area 40 d closest to the center of the tracking zone 41 b , and calculates the focus adjustment quantity by using the focus detection signals output from the pair of CCD line sensors corresponding to the focus detection area 40 d .
- the focus detection calculation unit 34 outputs the selected focus detection area signal indicating the focus detection area 40 d to the ON/OFF control unit 36 via the photographic subject tracking control unit 32 .
- the display unit 361 turns on the AF area mark for the focus detection area 40 d based upon the selected focus detection area signal.
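Determining that the main subject is present at the focus detection area closest to the tracking zone center, as described above, is a simple nearest-point search. The coordinates below are hypothetical and do not reproduce the layout of FIG. 2:

```python
# Pick the focus detection area whose position is closest to the
# tracking zone center (squared Euclidean distance; no sqrt needed
# for comparison purposes).
def closest_focus_area(center, areas):
    """center: (x, y); areas: dict of area name -> (x, y) position."""
    cx, cy = center
    return min(areas,
               key=lambda k: (areas[k][0] - cx) ** 2 + (areas[k][1] - cy) ** 2)
```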
- the exposure calculation unit 33 extracts the image data corresponding to the central area of the tracking zone 41 b from the image data provided via the image converting unit 31 based upon the coordinates of the tracking zone 41 b , calculates the optimal exposure based upon the extracted image data, and outputs the calculation results to an aperture drive unit 331 and a shutter drive unit 332 . Namely, the exposure calculation unit 33 executes the exposure calculation for the area corresponding to the center of the tracking zone.
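A minimal sketch of an exposure calculation restricted to the central area of the tracking zone: average the luminance of a small window around the zone center. The window size and the Rec. 601 luma weighting are choices made for illustration, not taken from the patent:

```python
# Mean luminance over a small window centred on the tracking zone
# centre; this stands in for the exposure calculation unit 33 working
# on the central area only.
def center_luminance(image, img_w, center, half=1):
    """image: row-major list of (r, g, b); returns mean luma of the
    (2*half) x (2*half) window whose centre is `center`."""
    cx, cy = center
    total, n = 0.0, 0
    for y in range(cy - half, cy + half):
        for x in range(cx - half, cx + half):
            r, g, b = image[y * img_w + x]
            # Rec. 601 luma weighting
            total += 0.299 * r + 0.587 * g + 0.114 * b
            n += 1
    return total / n
```

An actual camera would map this luminance to an aperture and shutter-speed pair; that mapping is omitted here.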
- FIG. 4C shows an image 44 c obtained in time sequence at the second image sensor 19 to follow the image 44 b .
- the photographic subject tracking control unit 32 selects a range greater than the tracking zone 41 b as a tracking calculation zone 43 c .
- the differences between the R (red), G (green) and B (blue) colors in the individual cut-out areas and the R (red), G (green) and B (blue) colors in the template image 42 are calculated and the cut-out area indicating the smallest value representing the sum of the differences is designated as a new tracking zone 41 c.
- the photographic subject tracking control unit 32 outputs information indicating the range of the tracking zone 41 c and the coordinates of the center of the tracking zone 41 c to the focus detection calculation unit 34 and the exposure calculation unit 33 .
- the focus detection calculation unit 34 determines that the main photographic subject is present at the focus detection area 40 c closest to the center of the tracking zone 41 c and calculates the focus adjustment quantity by using the focus detection signals output from the pair of CCD line sensors corresponding to the focus detection area 40 c .
- the focus detection calculation unit 34 outputs the selected focus detection area signal indicating the new focus detection area 40 c to the ON/OFF control unit 36 via the photographic subject tracking control unit 32 .
- the display unit 361 then turns on the AF area mark for the focus detection area 40 c based upon the selected focus detection area signal.
- the exposure calculation unit 33 extracts the image data corresponding to the central area of the tracking zone 41 c from the image data provided via the image converting unit 31 based upon the coordinates of the tracking zone 41 c , calculates the optimal exposure based upon the extracted image data, and outputs the calculation results to the aperture drive unit 331 and the shutter drive unit 332 .
- if the tracking zone 41 ( 41 a , 41 b or 41 c ) contains a plurality of focus detection areas 40 present over equal distances from the center of the tracking zone 41 , the focus detection area 40 present along the direction from the center of the tracking zone 41 in which the subject is expected to move is selected based upon the direction in which the subject moved over the previous images obtained in sequence, i.e., based upon the movement history.
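The tie-break described above, when equidistant focus detection areas exist, the one lying along the subject's recent movement direction wins, can be sketched as choosing the candidate with the largest dot product against the motion vector from the movement history. All names are illustrative assumptions:

```python
# Among equidistant candidate focus detection areas, pick the one
# lying furthest along the subject's movement direction, i.e. the
# candidate offset with the largest dot product with the motion vector.
def pick_by_motion(center, candidates, motion):
    """candidates: dict of area name -> (x, y); motion: (dx, dy)
    derived from the previous images (the movement history)."""
    cx, cy = center
    mx, my = motion
    return max(candidates,
               key=lambda k: (candidates[k][0] - cx) * mx
                           + (candidates[k][1] - cy) * my)
```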
- in step S 101 , focus adjustment is executed by driving the photographic lens 21 based upon the focus detection calculation results provided from the focus detection calculation unit 34 in correspondence to the focus detection area 40 having been set by the photographer, and then the operation proceeds to step S 102 .
- in step S 102 , the image captured at the second image sensor 19 is obtained before the operation proceeds to step S 103 .
- in step S 103 , the image data having been obtained in step S 102 are converted into image data of an image having a predetermined number of pixels by the image converting unit 31 and then the operation proceeds to step S 104 .
- in step S 104 , the exposure calculation is executed by using image data selected from the image data having been converted in step S 103 in correspondence to the focus detection area for which the focus detection calculation has been executed, and then the operation proceeds to step S 105 .
- in step S 105 , the tracking zone 41 is set as explained earlier based upon the focus detection area 40 having been set in step S 101 and the image corresponding to the tracking zone 41 is stored as the template image 42 , before the operation proceeds to step S 106 .
- in step S 106 , a decision is made as to whether or not the photographer has pressed the shutter release switch (not shown) all the way down. If an affirmative decision is made, i.e., if it is decided that the shutter release switch has been pressed all the way down, the operation proceeds to step S 112 . If, on the other hand, a negative decision is made, the operation proceeds to step S 107 .
- in step S 107 , the next image (new image) is obtained from the second image sensor 19 , and then the operation proceeds to step S 108 .
- in step S 108 , the obtained image is converted into an image having the predetermined number of pixels as in step S 103 explained earlier, and then the operation proceeds to step S 109 .
- in step S 109 , based upon the image resulting from the conversion having been executed in step S 108 , the differences between the R (red), G (green) and B (blue) colors in the cut-out areas and the R (red), G (green) and B (blue) colors in the template image 42 having been stored in step S 105 are calculated. Based upon the calculation results, the cut-out area with the highest level of similarity is designated as the new photographic subject tracking zone 41 for the next image, before the operation proceeds to step S 110 .
- in step S 110 , the selected focus detection area signal indicating the focus detection area 40 closest to the center of the tracking zone 41 having been set in step S 109 is output to the focus detection calculation unit 34 to enable it to calculate the focus adjustment quantity for this focus detection area. Then the operation proceeds to step S 111 to execute the exposure calculation based upon image data selected from the image data having been obtained in step S 108 , in correspondence to the new focus detection area 40 having been set in step S 110 . Subsequently, the operation returns to step S 106 .
- if it is decided in step S 106 that the shutter release switch has been pressed all the way down, the operation proceeds to step S 112 as described earlier to execute a photographing operation. As the photographing operation is executed, the processing in the flowchart ends.
- a photographic subject present at the selected focus detection area 40 is designated as the tracking target subject, the tracking zone 41 is set accordingly, and the photographic subject is tracked based upon the image outputs corresponding to the tracking zone 41 .
- the focus detection calculation is executed by using the focus detection signals corresponding to the focus detection area 40 near the center of the tracking zone 41 , which is updated sequentially, and the exposure calculation is executed based upon the image output corresponding to the area around the center of the tracking zone 41 which is sequentially updated. Since the tracking zone 41 does not need to be a small area, the photographic subject can be tracked with high accuracy.
- since the photographing conditions (the image-capturing conditions) are determined in correspondence to the area at the center of the tracking zone 41 , optimal focus adjustment for the main subject is achieved and the arithmetic processing load involved in determining the photographing conditions, such as the exposure control, can be minimized.
- the subject can be tracked with a high level of accuracy. Since the focus detection and the exposure calculation are executed by using the focus detection signals and the image data corresponding to the central area, which is smaller than the tracking zone 41 , the position of the subject in the image plane can be ascertained with higher accuracy so as to enable photographing condition calculation such as focus adjustment and exposure control for the specific subject.
- the tracking zone 41 is adjusted as the main subject moves, and if the new tracking zone 41 contains a plurality of focus detection areas 40 present over equal distances from the center of the tracking zone 41 , the focus detection area 40 present along the direction from the center of the tracking zone 41 in which the subject is expected to move is reset as the new focus detection area 40 based upon the direction in which the subject moved over the previous images obtained in sequence.
- the focus detection and the exposure calculation can be executed in correspondence to the new focus detection area 40 even when the tracking zone contains a plurality of focus detection areas 40 .
- the template image 42 may instead be updated for each subsequent image by using image data corresponding to the newly detected tracking zone 41 , i.e., the newly set tracking zone.
- a new template image 42 may be created by combining the color information corresponding to the R (red), G (green) and B (blue) colors in the template image 42 set in the initial setting screen and the color information corresponding to the R (red), G (green) and B (blue) colors in the newly set tracking zone 41 .
- the sets of color information should be combined by setting a higher ratio for the color information corresponding to the template image 42 set in the initial screen in the latter case.
- the different sets of color information may be combined at a ratio of 4:1 for the color information corresponding to the template image 42 set in the initial screen and the color information corresponding to the newly set tracking zone 41 .
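The 4:1 combination of the initial template and the newly set tracking zone described above can be written as a per-channel weighted average. The list-of-tuples representation and the integer rounding are assumptions for illustration:

```python
# Blend the template image 42 from the initial setting screen with the
# newly set tracking zone at a weighted ratio (default 4:1, favouring
# the initial template as the text suggests).
def blend_templates(initial, latest, w_initial=4, w_latest=1):
    """initial, latest: equal-length lists of (r, g, b) tuples."""
    total = w_initial + w_latest
    return [((ir * w_initial + lr * w_latest) // total,
             (ig * w_initial + lg * w_latest) // total,
             (ib * w_initial + lb * w_latest) // total)
            for (ir, ig, ib), (lr, lg, lb) in zip(initial, latest)]
```

Favouring the initial template keeps the tracking anchored to the subject the photographer originally selected, while the small contribution from the new zone lets the template adapt to gradual changes in appearance.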
- the focus may instead be detected through the contrast detection method by using image data over an arbitrary area in the captured image data output from the first image sensor 12 .
- a specific area at the center of the tracking zone 41 , smaller than the tracking zone 41 , can be used as the focus detection area.
- a rectangular or a circular frame around a small area 45 around the center of the tracking zone 41 may be displayed as shown in FIG. 6 .
- An advantage similar to that achieved with the display unit that displays the AF area mark in the embodiment can be achieved by displaying the frame, as well.
- the photographic subject tracking processing may instead be executed based upon image data output from the first image sensor 12 .
- the image may be checked on the electronic viewfinder in a mirror-raised state, or a half-mirror (pellicle mirror) may be used to allow the optical image to be observed and captured at the same time.
- the present invention is not limited to this example and may be adopted in a camera with an integrated lens, a camera mounted at a portable telephone or a video camera. In other words, the present invention may be adopted in all types of optical devices with photographic subject tracking functions.
- a focus match may be achieved in correspondence to any of the focus detection areas 40 f , 40 h , 40 m , 40 n , 40 r , 40 s and the like present in close proximity to the center of the tracking zone 41 and the focus detection area 40 g closest to the tracking zone center, by using the information obtained in the nearby focus detection areas 40 f , 40 h , 40 m , 40 n , 40 r , 40 s and the like as well as the information obtained over the focus detection area 40 g .
- a focus match may be achieved by using the focus detection area information corresponding to the focus detection area achieving the highest level of contrast among the focus detection area 40 g closest to the center and the nearby focus detection areas 40 f , 40 h , 40 m , 40 n , 40 r and 40 s .
- a focus match can be achieved in the area where the focus is detected with the greatest ease and the tracking target subject is likely to be present (e.g., 40 f , 40 h , 40 r or 40 s ) among the nearby areas 40 even when the contrast in the focus detection area 40 g is low and the focus cannot be detected with ease for the subject over the focus detection area 40 g.
- the optimal focus detection area 40 among the focus areas including the nearby areas, where the focus match control should be executed may be determined through arithmetic operation by weighting the focus detection area 40 g closest to the center of the tracking zone 41 .
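The weighting idea described above can be sketched as follows; the contrast values, area labels and weight factor are hypothetical, purely for illustration:

```python
def select_focus_area(contrasts, center_area, center_weight=1.5):
    """Pick the focus detection area for the focus match control.

    `contrasts` maps focus detection area labels to a contrast score;
    the area closest to the tracking-zone center is weighted so it is
    preferred unless a nearby area shows clearly higher contrast."""
    def score(name):
        value = contrasts[name]
        return value * center_weight if name == center_area else value

    return max(contrasts, key=score)

# Low contrast at the center area 40g: a sharper nearby area wins.
print(select_focus_area({'40g': 0.2, '40f': 0.25, '40h': 0.8}, '40g'))  # 40h
```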
- the focus detection area 40 g closest to the center of the tracking zone 41 may still be indicated in the viewfinder image. In this case, even when the focus match control has been executed over a nearby area where an edge of the tracking target subject has been detected, the center of the tracking target subject is marked to allow the user to recognize the tracking target subject with ease.
- the present invention provides a program that enables the computer in the camera to execute processing for tracking the main photographic subject based upon the image output corresponding to the tracking zone and processing for calculating the photographing conditions based upon the image output corresponding to the central area of the tracking zone.
- the tracking processing and the calculation processing executed in conformance to this program correspond respectively to step S 109 and to steps S 110 and S 111 in the flowchart presented in FIG. 5 .
- FIG. 8 shows how a personal computer, having obtained the program (update program) via the Internet or via a portable recording medium, may provide the program to a digital camera.
- a personal computer 100 obtains the program via a recording medium 104 which may be a CD-ROM or a DVD-ROM.
- the personal computer 100 has a function of connecting with a communication line 101 .
- a computer 102 is a server computer that provides the program stored in a recording medium such as a hard disk 103 .
- the communication line 101 is a communication network such as the Internet.
- the computer 102 reads out the program from the hard disk 103 and transmits the program to the personal computer 100 via the communication line 101 .
- the program, embodied as a data signal on a carrier wave, is transmitted via the communication line 101 .
- the personal computer 100 downloads the program thus obtained to a digital camera 105 which is connected via a cable or wirelessly connected thereto.
- the program to be installed in the digital camera 105 can be distributed as a computer-readable computer program product adopting any of various modes including a recording medium and a carrier wave, to allow an easy update.
- the program may be downloaded directly to the digital camera.
- the program may be directly obtained via a recording medium such as a memory card that can be loaded into the digital camera 105 .
Abstract
A subject tracking device includes: a tracking zone setting unit that sets an area where a main subject is present within a captured image as a tracking zone; a tracking unit that tracks the main subject based upon an image output corresponding to the tracking zone; and an arithmetic operation unit that determines through arithmetic operation image-capturing conditions based upon an image output corresponding to a central area within the tracking zone.
Description
- The disclosures of the following priority applications are herein incorporated by reference:
- Japanese Patent Application No. 2006-070654 filed Mar. 15, 2006.
- Japanese Patent Application No. 2007-020336 filed Jan. 31, 2007.
- 1. Field of the Invention
- The present invention relates to a subject tracking device, a subject tracking method, a subject tracking program product and an optical device, to be adopted when automatically tracking a subject that moves within an imaging field.
- 2. Description of the Related Art
- There are cameras known in the related art that execute focus detection and exposure calculation by automatically tracking a moving subject being photographed. In such a camera disclosed in the related art, a wide area around the point at which the photographer's eyes are set is selected as a moving object detection zone while tracking the photographic subject, the moving object detection zone is then divided into a plurality of areas and the area where the photographic subject is present or the area with the highest contrast among the divided areas is detected as a tracking zone (see, for instance, Japanese Laid Open Patent Publication No. H05-288982).
- While the tracking reliability achieved with the camera in the related art is low if the tracking zone is small, the camera effectively tracks the photographic subject over a large tracking zone with better reliability. However, since the volume of information sampled over the large tracking zone is bound to be large, the load of processing executed for the focus detection and the exposure calculation for the photographic subject, too, increases.
- According to the 1st embodiment of the present invention, a subject tracking device comprises: a tracking zone setting unit that sets an area where a main subject is present within a captured image as a tracking zone; a tracking unit that tracks the main subject based upon an image output corresponding to the tracking zone; and an arithmetic operation unit that determines through arithmetic operation image-capturing conditions based upon an image output corresponding to a central area within the tracking zone.
- According to the 2nd embodiment of the present invention, in the subject tracking device according to the 1st embodiment, it is preferred that: there is further provided a converting unit that converts captured image data into reduced image data and outputs the converted image data; and the tracking unit tracks movement of the main subject by using the converted image data.
- According to the 3rd embodiment of the present invention, in the subject tracking device according to the 1st embodiment, it is preferred that the tracking zone setting unit sets an area corresponding to a subject present at a center of the captured image as the tracking zone.
- According to the 4th embodiment of the present invention, in the subject tracking device according to the 1st embodiment, it is preferred that there is further provided a display unit that displays within the captured image a mark indicating an area corresponding to an area constituting part of the tracking zone.
- According to the 5th embodiment of the present invention, an optical device comprises a subject tracking device according to the 1st embodiment.
- According to the 6th embodiment of the present invention, in the optical device according to the 5th embodiment, it is preferred that: the subject tracking device further comprises a display unit that displays within the captured image a mark indicating an area corresponding to an area constituting part of the tracking zone; a focus detection area indicating a position at which focus detection is executed for the subject is set in the captured image; and the mark indicates the focus detection area present within the tracking zone.
- According to the 7th embodiment of the present invention, in the optical device according to the 6th embodiment, it is preferred that the tracking zone setting unit sets an area corresponding to the focus detection area having been selected as the tracking zone.
- According to the 8th embodiment of the present invention, a subject tracking method comprises: setting an area where a main subject is present within a captured image as a tracking zone; tracking the main subject based upon an image output corresponding to the tracking zone; and determining through arithmetic operation image-capturing conditions based upon an image output corresponding to a central area within the tracking zone.
- According to the 9th embodiment of the present invention, a computer-readable computer program product having contained therein a subject tracking program, and the subject tracking program comprises: processing for tracking a main subject based upon an image output corresponding to a tracking zone; and processing for determining through arithmetic operation image-capturing conditions based upon an image output corresponding to a central area within the tracking zone.
- According to the 10th embodiment of the present invention, a subject tracking device comprises: a tracking unit that designates image data in an area where a subject is present within a first image as a first tracking zone and tracks the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and an arithmetic operation unit that determines through arithmetic operation image-capturing conditions by using information corresponding to a central area within the second tracking zone.
- According to the 11th embodiment of the present invention, in the subject tracking device according to the 10th embodiment, it is preferred that the arithmetic operation unit executes exposure calculation by using image data over a central area of the second tracking zone.
- According to the 12th embodiment of the present invention, in the subject tracking device according to the 10th embodiment, it is preferred that the arithmetic operation unit includes a focus detection calculation unit that executes focus detection calculation for a focus detection area closest to a center of the second tracking zone among a plurality of focus detection areas.
- According to the 13th embodiment of the present invention, in the subject tracking device according to the 1st embodiment, it is preferred that the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using an image output corresponding to an area near a central area of the tracking zone.
- According to the 14th embodiment of the present invention, in the subject tracking device according to the 13th embodiment, it is preferred that there is further provided a display unit that displays a mark indicating a central area of the tracking zone.
- According to the 15th embodiment of the present invention, in the subject tracking device according to the 10th embodiment, it is preferred that the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using information corresponding to an area near a central area of the second tracking zone.
- According to the 16th embodiment of the present invention, in the subject tracking device according to the 15th embodiment, it is preferred that there is further provided a display unit that displays a mark indicating a central area within the second tracking zone.
- According to the 17th embodiment of the present invention, a subject tracking device comprises: a tracking unit that designates image data in an area where a subject is present within a first image as a first tracking zone and tracks the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and an arithmetic operation unit that determines through arithmetic operation image-capturing conditions by using at least either information corresponding to a central area within the second tracking zone or information corresponding to an area near a central area of the second tracking zone.
- According to the 18th embodiment of the present invention, in the subject tracking device according to the 17th embodiment, it is preferred that the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using information indicating a higher level of contrast, selected from the information corresponding to a central area of the second tracking zone and the information corresponding to the area near a central area of the second tracking zone.
- According to the 19th embodiment of the present invention, in the subject tracking device according to the 17th embodiment, it is preferred that the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by weighting a central area of the second tracking zone.
- According to the 20th embodiment of the present invention, a subject tracking device comprises: a tracking unit that designates image data in an area where a subject is present within a first image as a first tracking zone and tracks the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and an arithmetic operation unit that determines through arithmetic operation image-capturing conditions by using information corresponding to a central area within the second tracking zone and information corresponding to a direction along which the subject moves during a period elapsing between a time point at which the first image is captured and a time point at which the second image is captured.
- According to the 21st embodiment of the present invention, in the subject tracking device according to the 20th embodiment, it is preferred that the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using information corresponding to an area near a central area of the second tracking zone.
- According to the 22nd embodiment of the present invention, in the subject tracking device according to the 20th embodiment, it is preferred that there is further provided a display unit that displays a mark indicating a central area within the second tracking zone.
- According to the 23rd embodiment of the present invention, a subject tracking method comprises: designating image data in an area where a subject is present within a first image as a first tracking zone; tracking the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and determining through arithmetic operation image-capturing conditions by using information corresponding to a central area within the second tracking zone.
- According to the 24th embodiment of the present invention, in the subject tracking method according to the 23rd embodiment, it is preferred that exposure calculation is executed by using image data over a central area of the second tracking zone.
- According to the 25th embodiment of the present invention, in the subject tracking method according to the 23rd embodiment, it is preferred that focus detection calculation is executed in correspondence to a focus detection area closest to a center of the second tracking zone among a plurality of focus detection areas.
- According to the 26th embodiment of the present invention, in the subject tracking method according to the 23rd embodiment, it is preferred that the image-capturing conditions are determined through arithmetic operation executed by using information corresponding to an area near a central area of the second tracking zone.
- According to the 27th embodiment of the present invention, in the subject tracking method according to the 23rd embodiment, it is preferred that the image-capturing conditions are determined through arithmetic operation executed by using information corresponding to a direction along which the subject moves during a period elapsing between a time point at which the first image is captured and a time point at which the second image is captured.
- According to the 28th embodiment of the present invention, an optical apparatus comprises a subject tracking device according to the 1st embodiment.
- FIG. 1 illustrates the essential structure adopted in the digital camera achieved in an embodiment of the present invention;
- FIG. 2 shows the positions at which the focus detection areas are indicated on the viewfinder of the digital camera achieved in the embodiment;
- FIG. 3 is a conceptual diagram of the various areas used in the photographic subject tracking control in the embodiment;
- FIGS. 4A˜4C illustrate the subject tracking method adopted in the embodiment;
- FIG. 5 presents a flowchart of the photographic subject tracking processing operation executed in the embodiment;
- FIG. 6 shows the focus detection area set at the center of the tracking zone;
- FIG. 7 shows the image screen with a great number of focus detection areas set therein; and
- FIG. 8 shows how the program may be provided to the digital camera.
- The following is an explanation of a single lens reflex digital camera achieved in an embodiment, which is equipped with a photographic subject tracking device (or a photographic subject following device), given in reference to the drawings.
FIG. 1 shows the essential structure adopted in the digital camera according to the present invention. A lens barrel 20, which includes a photographic lens 21, is exchangeably mounted at a camera body 10. - A
first image sensor 12 used to capture an image of a photographic subject is installed in the camera body 10. The first image sensor 12 may be a CCD or a CMOS. A quick-return mirror 11 that reflects subject light having passed through the photographic lens 21 toward a viewfinder optical system is disposed between the photographic lens 21 and the first image sensor 12. Some of the subject light is transmitted through a semi-transmissive area of the quick return mirror 11, is reflected downward at a sub mirror 111 and enters an AF sensor module 112 adopting the phase detection system. - The phase detection type
AF sensor module 112 may include, for instance, a focus detection optical system that splits a focus detection light flux into a pair of focus detection optical images and a plurality of pairs of CCD line sensors that each output focus detection signals corresponding to the pair of split optical images having entered therein. The focus detection signals output from the plurality of pairs of CCD line sensors are input to a control unit 30, which, in turn, outputs a lens drive signal in response to which a focus adjustment lens is driven to the focus match position, as explained later, by using the focus detection signals output from a specific pair of CCD line sensors among the plurality of pairs of CCD line sensors. - The subject light reflected at the
quick return mirror 11 forms an image on a focusing screen 13 disposed at a position optically equivalent to the position of the first image sensor 12. The subject image formed on the focusing screen 13 can be observed by the photographer via a pentaprism 14 and an eyepiece lens 15, and it also passes through a prism 17 and an image forming lens 18 from the pentaprism 14 to form an image on a light-receiving surface of a second image sensor 19. The second image sensor 19 is constituted with pixels, the number of which is smaller than the number of pixels constituting the first image sensor 12, and it may be a CCD with 640×480 pixels disposed in an RGB Bayer array. It is to be noted that when a photographic operation is executed in response to a full-press operation of a shutter release button (not shown), the quick return mirror 11 rotates to the position indicated by the dotted line in the figure to allow the subject light to form an image on the first image sensor 12. - Image data obtained at the
second image sensor 19 are input to the control unit 30. The control unit 30 comprises a CPU, a ROM, a RAM and various peripheral circuits. Its functional blocks include an image converting unit 31 that converts and reduces the image data obtained at the second image sensor 19 into image data of a predetermined image size, e.g., 64×48 pixels, a photographic subject tracking control unit 32 that tracks and follows a main photographic subject, an exposure calculation unit 33 that calculates the optimal exposure for the main photographic subject, a focus detection calculation unit 34 that executes focus detection calculation, a lens drive quantity calculation unit 35 that calculates a drive quantity indicating the extent to which the photographic lens 21 needs to be driven, and an ON/OFF control unit 36 that controls the ON/OFF state of an AF area mark indicating a focus detection area 40 at a display unit 361, as explained later. At the display unit 361, the AF area mark can be displayed via a liquid crystal display panel 22 disposed near the focusing screen 13. - The focus
detection calculation unit 34 determines through an arithmetic operation the focus adjustment state indicated by the defocus quantity and the like based upon the focus detection signals output from the pair of CCD line sensors corresponding to the focus detection area 40 selected by the photographer by operating a focusing area selecting operation unit 341. - As shown in
FIG. 2, seven focus detection areas 40 a˜40 g, for instance, may be set within the photographic field (the imaging field or the image-capturing field) in the camera achieved in the embodiment. As one of the seven focus detection areas 40 a˜40 g is selected via the area selecting operation unit 341, the focus detection calculation unit 34 determines through arithmetic operation the focus adjustment state (focus adjustment quantity) in correspondence to the selected focus detection area 40. In addition, the area selecting operation unit 341 outputs a selected focus detection area signal, indicating the specific focus detection area having been selected among the seven focus detection areas 40 a˜40 g, to the photographic subject tracking control unit 32. The focus adjustment quantity calculated in the focus detection calculation unit 34 is output to the lens drive quantity calculation unit 35. The lens drive quantity calculation unit 35 calculates the lens drive quantity based upon the focus adjustment quantity input thereto and outputs the resulting lens drive signal to a lens drive motor 351. The lens drive motor 351 drives the photographic lens 21 along the optical axis in response to the lens drive signal, thereby adjusting the focus adjustment state. - Based upon the selected focus detection area signal input thereto, the photographic subject tracking
control unit 32 extracts the image data corresponding to a tracking zone from the image data having been converted by the image converting unit 31 and executes subject tracking calculation based upon the image data in the tracking zone. In this embodiment, a tracking zone 41 and a tracking calculation zone 43 ranging over a greater area than the tracking zone 41 are set on an initial image 44 obtained via the second image sensor 19 and the tracking zone 41 is designated as a template image 42, as shown in FIG. 3. The tracking zone 41 should range over a predetermined area, e.g., a 4×4 pixel area, around the focus detection area 40 recognized based upon the selected focus detection area signal. It is to be noted that a range with a hue identical to that of the focus detection area 40 may be designated as the tracking zone 41. - Then, the photographic subject tracking control is executed by preparing a cutout area, which is equal to the template image 42 in size, within the tracking
calculation zone 43 in an image obtained successively via the second image sensor 19 and by calculating the differences between the R (red), G (green) and B (blue) colors in the cutout area and the R (red), G (green) and B (blue) colors in the template image 42. It is to be noted that the size of the tracking zone 41 is selected as a default setting and that the photographer does not set the size of the tracking zone 41. - Actually, the
control unit 30 utilizes in calculation the reduced image 44 (64×48 pixels), for example, into which the image (640×480 pixels) obtained by the second image sensor 19 has been converted and reduced. FIG. 3 shows the tracking calculation zone 43 (8×8 pixels, for example) over the reduced image 44 (64×48 pixels) and the tracking zone 41 (4×4 pixels, for example) over the reduced image 44 (64×48 pixels). It is to be noted that FIG. 3 presents a conceptual diagram and that the proportional relationships among the sizes of the various areas of the actual product may not match those shown in FIG. 3. - Now, the photographic subject tracking control method according to the present invention is explained in detail in reference to FIGS. 4A˜4C. It is assumed that the photographer has selected the
focus detection area 40 d by operating the focusing area selecting operation unit 341. Under such circumstances, the photographic subject tracking control unit 32 regards a subject present at the focus detection area 40 d as the main photographic subject and designates a relatively wide range containing the focus detection area 40 d as a tracking zone 41 a, as shown in FIG. 4A. The image data in the tracking zone 41 a are stored into a predetermined storage area within the photographic subject tracking control unit 32 as the template image 42. In addition, the ON/OFF control unit 36 issues an instruction for the display unit 361 to turn on the focus detection area 40 d based upon the selected focus detection area signal provided from the area selecting operation unit 341. -
FIG. 4B shows an image 44 b obtained in time sequence at the second image sensor 19 to follow the image 44 a. The photographic subject tracking control unit 32 selects a range greater than the initial tracking zone 41 a as a tracking calculation zone 43 b. An area inside the tracking calculation zone 43 b, which is equal in size to the tracking zone 41 a, is extracted as a cutout area and the cut-out position is sequentially displaced. The differences between the R (red), G (green) and B (blue) colors in the individual cut-out areas and the R (red), G (green) and B (blue) colors in the template image 42 are calculated and the cut-out area indicating the smallest value representing the sum of the differences is judged to be the area most similar to the template image 42. This cut-out area is then designated as a new tracking zone 41 b in the image 44 b. - The photographic subject tracking
control unit 32 outputs information indicating the range of the tracking zone 41 b and the coordinates of the center of the tracking zone 41 b to the focus detection calculation unit 34 and the exposure calculation unit 33. Based upon the coordinates of the tracking zone 41 b input thereto, the focus detection calculation unit 34 determines that the main photographic subject is present at the focus detection area 40 d closest to the center of the tracking zone 41 b, and calculates the focus adjustment quantity by using the focus detection signals output from the pair of CCD line sensors corresponding to the focus detection area 40 d. The focus detection calculation unit 34 outputs the selected focus detection area signal indicating the focus detection area 40 d to the ON/OFF control unit 36 via the photographic subject tracking control unit 32. The display unit 361 turns on the AF area mark for the focus detection area 40 d based upon the selected focus detection area signal. - The
exposure calculation unit 33 extracts the image data corresponding to the central area of the tracking zone 41 b from the image data provided via the image converting unit 31 based upon the coordinates of the tracking zone 41 b, calculates the optimal exposure based upon the extracted image data, and outputs the calculation results to an aperture drive unit 331 and a shutter drive unit 332. Namely, the exposure calculation unit 33 executes the exposure calculation for the area corresponding to the center of the tracking zone. -
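The cut-out comparison described in reference to FIG. 4B (displacing a template-sized cut-out area within the tracking calculation zone and keeping the position with the smallest sum of R, G and B differences) can be sketched as follows; the data layout (2-D lists of RGB tuples) and the function name are assumptions for illustration, not the device's actual implementation:

```python
def find_tracking_zone(search_zone, template):
    """Slide a template-sized cutout over the tracking calculation
    zone and return the top-left offset of the cutout whose summed
    absolute RGB difference from the template is smallest."""
    th, tw = len(template), len(template[0])
    sh, sw = len(search_zone), len(search_zone[0])
    best_diff, best_pos = None, (0, 0)
    for oy in range(sh - th + 1):
        for ox in range(sw - tw + 1):
            diff = sum(
                abs(search_zone[oy + y][ox + x][c] - template[y][x][c])
                for y in range(th) for x in range(tw) for c in range(3))
            if best_diff is None or diff < best_diff:
                best_diff, best_pos = diff, (oy, ox)
    return best_pos

template = [[(9, 9, 9)]]
zone = [[(0, 0, 0), (9, 9, 9)],
        [(0, 0, 0), (0, 0, 0)]]
print(find_tracking_zone(zone, template))  # (0, 1)
```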
FIG. 4C shows an image 44 c obtained in time sequence at the second image sensor 19 to follow the image 44 b. As in the case of the image 44 b, the photographic subject tracking control unit 32 selects a range greater than the tracking zone 41 b as a tracking calculation zone 43 c. Then, through the procedure having been explained in reference to FIG. 4B, the differences between the R (red), G (green) and B (blue) colors in the individual cut-out areas and the R (red), G (green) and B (blue) colors in the template image 42 are calculated and the cut-out area indicating the smallest value representing the sum of the differences is designated as a new tracking zone 41 c. - The photographic subject tracking
control unit 32 outputs information indicating the range of the tracking zone 41 c and the coordinates of the center of the tracking zone 41 c to the focus detection calculation unit 34 and the exposure calculation unit 33. Based upon the coordinates of the tracking zone 41 c input thereto, the focus detection calculation unit 34 determines that the main photographic subject is present at the focus detection area 40 c closest to the center of the tracking zone 41 c and calculates the focus adjustment quantity by using the focus detection signals output from the pair of CCD line sensors corresponding to the focus detection area 40 c. The focus detection calculation unit 34 outputs the selected focus detection area signal indicating the new focus detection area 40 c to the ON/OFF control unit 36 via the photographic subject tracking control unit 32. The display unit 361 then turns on the AF area mark for the focus detection area 40 c based upon the selected focus detection area signal. - The
exposure calculation unit 33 extracts the image data corresponding to the central area of the tracking zone 41 c from the image data provided via the image converting unit 31 based upon the coordinates of the tracking zone 41 c, calculates the optimal exposure based upon the extracted image data, and outputs the calculation results to the aperture drive unit 331 and the shutter drive unit 332. - It is to be noted that if the tracking zone 41 (41 a, 41 b or 41 c) contains a plurality of
focus detection areas 40 present over equal distances from the center of the tracking zone 41, the focus detection area 40 present along the direction from the center of the tracking zone 41 in which the subject is expected to move is selected based upon the direction in which the subject moved over the previous images obtained in sequence, i.e., based upon the movement history. - The following is an explanation of various phases of processing executed during the photographic subject tracking control operation, given in reference to the flowchart presented in
FIG. 5. It is to be noted that the processing in the flowchart is executed by the control unit 30 in conformance to a program. In addition, the processing in the flowchart is started in response to a halfway press operation at the shutter release switch (not shown). - In step S101, focus adjustment is executed by driving the
photographic lens 21 based upon the focus detection calculation results provided from the focus detection calculation unit 34 in correspondence to the focus detection area 40 having been set by the photographer, and then the operation proceeds to step S102. In step S102, the image captured at the second image sensor 19 is obtained before the operation proceeds to step S103. - In step S103, the image data having been obtained in step S102 are converted into image data of an image having a predetermined number of pixels by the
image converting unit 31 and then the operation proceeds to step S104. In step S104, the exposure calculation is executed by using image data selected from the image data having been converted in step S103 in correspondence to the focus detection area for which the focus detection calculation has been executed, and then the operation proceeds to step S105. - In step S105, the tracking
zone 41 is set as explained earlier based upon the focus detection area 40 having been set in step S101, and the image corresponding to the tracking zone 41 is stored as the template image 42, before the operation proceeds to step S106. In step S106, a decision is made as to whether or not the photographer has pressed the shutter release switch (not shown) all the way down. If an affirmative decision is made in step S106, i.e., if it is decided that the shutter release switch has been pressed all the way down, the operation proceeds to step S112. If, on the other hand, a negative decision is made in step S106, i.e., if it is decided that the shutter release switch has not been pressed all the way down, the operation proceeds to step S107. - In step S107, the next image (new image) is obtained from the
second image sensor 19, and then the operation proceeds to step S108. In step S108, the obtained image is converted into an image having the predetermined number of pixels as in step S103 explained earlier, and then the operation proceeds to step S109. - In step S109, based upon the image resulting from the conversion having been executed in step S108, the differences between the R (red), G (green) and B (blue) colors in the cut-out areas and the R (red), G (green) and B (blue) colors in the template image 42 having been stored in step S105 are calculated. Based upon the calculation results, the cut-out area with the highest level of similarity is designated as the new photographic
subject tracking zone 41 for the next image, before the operation proceeds to step S110. - In step S110, the selected focus detection area signal indicating the
focus detection area 40 closest to the center of the tracking zone 41 having been set in step S109 is output to the focus detection calculation unit 34 to enable the focus detection calculation unit 34 to calculate the focus adjustment quantity for this focus detection area. Then the operation proceeds to step S111 to execute the exposure calculation based upon image data selected from the image data having been obtained in step S108, in correspondence to the new focus detection area 40 having been set in step S110. Subsequently, the operation returns to step S106. - If it is decided in step S106 that the shutter release switch has been pressed all the way down, the operation proceeds to step S112 as described earlier to execute a photographing operation. As the photographing operation is executed, the processing in the flowchart ends.
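The core of this loop is the template match executed in step S109. A minimal Python sketch of that step is shown below; the array shapes, the exhaustive pixel-by-pixel search, and the function name are illustrative assumptions, with the "differences between the R, G and B colors" interpreted here as a sum of absolute differences (smallest sum means highest similarity), which is not necessarily the implementation the patent intends:

```python
import numpy as np

def find_tracking_zone(image, template):
    """Slide the stored template over the new (reduced) image and return
    the top-left (x, y) corner of the cut-out area whose summed absolute
    R, G, B differences from the template are smallest, i.e. the cut-out
    area with the highest level of similarity."""
    ih, iw, _ = image.shape
    th, tw, _ = template.shape
    best_diff, best_pos = None, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            cut = image[y:y + th, x:x + tw].astype(np.int32)
            diff = np.abs(cut - template.astype(np.int32)).sum()
            if best_diff is None or diff < best_diff:
                best_diff, best_pos = diff, (x, y)
    return best_pos
```

In practice the search would be restricted to a neighborhood of the previous tracking zone rather than the whole frame, but the exhaustive form above keeps the sketch short.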
- The following advantages are achieved with the digital camera in the embodiment explained above.
- (1) A photographic subject present at the selected
focus detection area 40 is designated as the tracking target subject, the tracking zone 41 is set accordingly, and the photographic subject is tracked based upon the image outputs corresponding to the tracking zone 41. In addition, the focus detection calculation is executed by using the focus detection signals corresponding to the focus detection area 40 near the center of the tracking zone 41, which is updated sequentially, and the exposure calculation is executed based upon the image output corresponding to the area around the center of the tracking zone 41, which is sequentially updated. Since the tracking zone 41 does not need to be a small area, the photographic subject can be tracked with high accuracy. In addition, since the photographing conditions (the image-capturing conditions) are determined in correspondence to the area at the center of the tracking zone 41, optimal focus adjustment for the main subject is achieved and the computational load of determining the photographing conditions, such as the exposure control, can be minimized. - Namely, since a greater volume of information sampled from the relatively
large tracking zone 41 can be used in the tracking processing, the subject can be tracked with a high level of accuracy. Since the focus detection and the exposure calculation are executed by using the focus detection signals and the image data corresponding to the central area, which is smaller than the tracking zone 41, the position of the subject in the image plane can be ascertained with higher accuracy so as to enable photographing condition calculation, such as focus adjustment and exposure control, for the specific subject. - (2) The
tracking zone 41 is adjusted as the main subject moves, and if the new tracking zone 41 contains a plurality of focus detection areas 40 located at equal distances from the center of the tracking zone 41, the focus detection area 40 lying along the direction from the center of the tracking zone 41 in which the subject is expected to move is reset as the new focus detection area 40 based upon the direction in which the subject moved over the previous images obtained in sequence. Thus, the focus detection and the exposure calculation can be executed in correspondence to the new focus detection area 40 even when the tracking zone contains a plurality of focus detection areas 40.
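The tie-break described in (2) can be sketched as follows. This is an illustrative Python fragment; the coordinate representation and the dot-product scoring are assumptions, since the patent specifies only that the area along the movement direction is chosen:

```python
def pick_area_by_motion(candidates, zone_center, motion):
    """Among focus detection areas equidistant from the tracking-zone
    center, pick the one lying along the subject's movement direction.

    candidates  -- (x, y) centers of the equidistant focus detection areas
    zone_center -- (x, y) center of the new tracking zone
    motion      -- (dx, dy) displacement of the subject over the
                   previous images (the movement history)
    """
    def alignment(area):
        # Dot product of the center-to-area vector with the motion
        # vector: largest for the area lying "ahead" of the subject.
        dx, dy = area[0] - zone_center[0], area[1] - zone_center[1]
        return dx * motion[0] + dy * motion[1]
    return max(candidates, key=alignment)
```

For a subject moving rightward, the rightmost of the equidistant areas scores highest and is selected.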
- (1) While the image data corresponding to the
tracking zone 41 set in the initial screen are designated as the template image 42 to be continuously compared with the subsequent images, the template image 42 may instead be updated for each subsequent image by using image data corresponding to the newly detected tracking zone 41, i.e., the newly set tracking zone. Alternatively, a new template image 42 may be created by combining the color information corresponding to the R (red), G (green) and B (blue) colors in the template image 42 set in the initial setting screen and the color information corresponding to the R (red), G (green) and B (blue) colors in the newly set tracking zone 41. In the latter case, however, the sets of color information should be combined by setting a higher ratio for the color information corresponding to the template image 42 set in the initial screen. For instance, the different sets of color information may be combined at a ratio of 4:1 for the color information corresponding to the template image 42 set in the initial screen and the color information corresponding to the newly set tracking zone 41. - (2) While the focus detection is executed by adopting the phase detection method, the focus may instead be detected through the contrast detection method by using image data over an arbitrary area in the captured image data output from the
first image sensor 12. By adopting the contrast detection method, a specific area at the center of the tracking zone 41, smaller than the tracking zone 41, can be used as the focus detection area. Under such circumstances, a rectangular or a circular frame around a small area 45 around the center of the tracking zone 41 may be displayed, as shown in FIG. 6. An advantage similar to that achieved with the display unit that displays the AF area mark in the embodiment can be achieved by displaying the frame, as well. - (3) While an explanation is given above on an example in which the photographic subject tracking processing is executed based upon the image data output from the
second image sensor 19, the photographic subject tracking processing may instead be executed based upon image data output from the first image sensor 12. In such a case, the image may be checked on the electronic viewfinder in a mirror-raised state, or a half-mirror (pellicle mirror) may be used to allow the optical image to be observed and captured at the same time. - (4) While an explanation is given above on an example in which the present invention is adopted in a single lens reflex digital camera that allows the use of exchangeable photographic lenses, the present invention is not limited to this example and may be adopted in a camera with an integrated lens, a camera mounted in a portable telephone or a video camera. In other words, the present invention may be adopted in all types of optical devices with photographic subject tracking functions.
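The 4:1 template blend described in variation (1) could be sketched like this. The sketch is illustrative only; the per-pixel floating-point averaging is an assumption about how the R, G, B color information of the two images would be combined:

```python
import numpy as np

def blend_template(initial_template, new_zone, ratio=4):
    """Combine the R, G, B color information of the template image set in
    the initial screen with that of the newly set tracking zone, weighting
    the initial template more heavily (a 4:1 ratio by default, as in the
    example in the text)."""
    init = initial_template.astype(np.float32)
    new = new_zone.astype(np.float32)
    return ((ratio * init + new) / (ratio + 1)).astype(np.uint8)
```

Weighting the initial template this heavily lets the template adapt slowly to appearance changes while resisting drift onto the background.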
- (5) While an explanation is given above in reference to the embodiment on an example in which seven
focus detection areas 40 are set, a greater number of focus detection areas 40 may be set at a higher level of density, as shown in FIG. 7. In such a case, it becomes more likely that the tracking zone 41 contains a plurality of focus detection areas. In the example presented in FIG. 7, nineteen focus detection areas 40a˜40g are set. In this instance, a plurality of focus detection areas 40 are present over a region corresponding to the tracking target subject position. - Accordingly, a focus match may be achieved in correspondence to any of the
focus detection areas within the tracking zone 41, including the focus detection area 40g closest to the tracking zone center, by using the information obtained in the nearby focus detection areas in addition to that obtained in the focus detection area 40g. For instance, a focus match may be achieved by using the focus detection area information corresponding to the focus detection area achieving the highest level of contrast among the focus detection area 40g closest to the center and the nearby focus detection areas. This makes it possible to achieve a focus match over one of the nearby areas 40 even when the contrast in the focus detection area 40g is low and the focus cannot be detected with ease for the subject over the focus detection area 40g. - When using the nearby focus detection areas, the optimal
focus detection area 40 among the focus areas, including the nearby areas, where the focus match control should be executed may be determined through arithmetic operation by weighting the focus detection area 40g closest to the center of the tracking zone 41. - (6) Even when the focus match has been achieved in one of the nearby
focus detection areas 40 and the focus detection area 40g closest to the center of the tracking zone 41, as explained in (5), the focus detection area 40g closest to the center of the tracking zone 41 may still be indicated in the viewfinder image. In this case, even when the focus match control has been executed over a nearby area where an edge of the tracking target subject has been detected, the center of the tracking target subject is marked so that the user can recognize the tracking target subject with ease. - An explanation is given in reference to the embodiment on an example in which the present invention is adopted in a digital camera with the program shown in
FIG. 5 stored in the control unit 30. However, user support may need to be extended to existing users via the Internet or portable recording media in the event of a program upgrade or the like. The present invention may be adopted in software used for upgrade support as well. Namely, the present invention provides a program that enables the computer in the camera to execute processing for tracking the main photographic subject based upon the image output corresponding to the tracking zone and processing for calculating the photographing conditions based upon the image output corresponding to the central area of the tracking zone. The tracking processing and the calculation processing executed in conformance to this program respectively correspond to step S109 and steps S110 and S111 in the flowchart presented in FIG. 5. -
FIG. 8 shows how a personal computer, having obtained the program (update program) via the Internet or via a portable recording medium, may provide the program to a digital camera. A personal computer 100 obtains the program via a recording medium 104, which may be a CD-ROM or a DVD-ROM. In addition, the personal computer 100 has a function of connecting with a communication line 101. A computer 102 is a server computer that provides the program stored in a recording medium such as a hard disk 103. The communication line 101 is a communication network such as the Internet. The computer 102 reads out the program from the hard disk 103 and transmits the program to the personal computer 100 via the communication line 101. In other words, the program, embodied as a data signal on a carrier wave, is transmitted via the communication line 101. - The
personal computer 100 downloads the program thus obtained to a digital camera 105, which is connected thereto via a cable or wirelessly. Thus, the program to be installed in the digital camera 105 can be distributed as a computer-readable computer program product adopting any of various modes, including a recording medium and a carrier wave, to allow an easy update. It is to be noted that if the digital camera 105 itself has an Internet communication function, the program may be downloaded directly to the digital camera. Alternatively, the program may be obtained directly via a recording medium such as a memory card that can be loaded into the digital camera 105. - The above described embodiments are examples, and various modifications can be made without departing from the spirit and scope of the invention.
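The contrast-based selection with center weighting described in variations (5) above might be sketched as follows. The weight value and the dictionary interface are hypothetical; the patent specifies only that the focus detection area closest to the tracking-zone center is weighted in the arithmetic operation:

```python
def pick_focus_area(contrasts, center_key="center", center_weight=1.5):
    """Choose the focus detection area whose weighted contrast is highest
    among the area closest to the tracking-zone center and the nearby
    areas. `contrasts` maps an area identifier to its measured contrast;
    the center area's contrast is boosted by `center_weight` so that it
    wins unless a nearby area is clearly sharper."""
    def score(name):
        weight = center_weight if name == center_key else 1.0
        return weight * contrasts[name]
    return max(contrasts, key=score)
```

With this weighting, focus control falls back to a nearby area only when the center area's contrast is too low for reliable focus detection, which matches the behavior described in variation (5).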
Claims (28)
1. A subject tracking device, comprising:
a tracking zone setting unit that sets an area where a main subject is present within a captured image as a tracking zone;
a tracking unit that tracks the main subject based upon an image output corresponding to the tracking zone; and
an arithmetic operation unit that determines through arithmetic operation image-capturing conditions based upon an image output corresponding to a central area within the tracking zone.
2. A subject tracking device according to claim 1 , further comprising:
a converting unit that converts captured image data into reduced image data and outputs the converted image data, wherein:
the tracking unit tracks movement of the main subject by using the converted image data.
3. A subject tracking device according to claim 1 , wherein:
the tracking zone setting unit sets an area corresponding to a subject present at a center of the captured image as the tracking zone.
4. A subject tracking device according to claim 1 , further comprising:
a display unit that displays within the captured image a mark indicating an area corresponding to an area constituting part of the tracking zone.
5. An optical device, comprising:
a subject tracking device according to claim 1 .
6. An optical device according to claim 5 , wherein:
the subject tracking device further comprises a display unit that displays within the captured image a mark indicating an area corresponding to an area constituting part of the tracking zone;
a focus detection area indicating a position at which focus detection is executed for the subject is set in the captured image; and
the mark indicates the focus detection area present within the tracking zone.
7. An optical device according to claim 6 , wherein:
the tracking zone setting unit sets an area corresponding to the focus detection area having been selected as the tracking zone.
8. A subject tracking method, comprising:
setting an area where a main subject is present within a captured image as a tracking zone;
tracking the main subject based upon an image output corresponding to the tracking zone; and
determining through arithmetic operation image-capturing conditions based upon an image output corresponding to a central area within the tracking zone.
9. A computer-readable computer program product having contained therein a subject tracking program, the subject tracking program comprising:
processing for tracking a main subject based upon an image output corresponding to a tracking zone; and
processing for determining through arithmetic operation image-capturing conditions based upon an image output corresponding to a central area within the tracking zone.
10. A subject tracking device, comprising:
a tracking unit that designates image data in an area where a subject is present within a first image as a first tracking zone and tracks the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and
an arithmetic operation unit that determines through arithmetic operation image-capturing conditions by using information corresponding to a central area within the second tracking zone.
11. A subject tracking device according to claim 10 , wherein:
the arithmetic operation unit executes exposure calculation by using image data over a central area of the second tracking zone.
12. A subject tracking device according to claim 10 , wherein:
the arithmetic operation unit includes a focus detection calculation unit that executes focus detection calculation for a focus detection area closest to a center of the second tracking zone among a plurality of focus detection areas.
13. A subject tracking device according to claim 1 , wherein:
the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using an image output corresponding to an area near a central area of the tracking zone.
14. A subject tracking device according to claim 13 , further comprising:
a display unit that displays a mark indicating a central area of the tracking zone.
15. A subject tracking device according to claim 10 , wherein:
the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using information corresponding to an area near a central area of the second tracking zone.
16. A subject tracking device according to claim 15 , further comprising:
a display unit that displays a mark indicating a central area within the second tracking zone.
17. A subject tracking device, comprising:
a tracking unit that designates image data in an area where a subject is present within a first image as a first tracking zone and tracks the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and
an arithmetic operation unit that determines through arithmetic operation image-capturing conditions by using at least either information corresponding to a central area within the second tracking zone or information corresponding to an area near a central area of the second tracking zone.
18. A subject tracking device according to claim 17 , wherein:
the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using information indicating a higher level of contrast, selected from the information corresponding to a central area of the second tracking zone and the information corresponding to the area near a central area of the second tracking zone.
19. A subject tracking device according to claim 17 , wherein:
the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by weighting a central area of the second tracking zone.
20. A subject tracking device, comprising:
a tracking unit that designates image data in an area where a subject is present within a first image as a first tracking zone and tracks the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and
an arithmetic operation unit that determines through arithmetic operation image-capturing conditions by using information corresponding to a central area within the second tracking zone and information corresponding to a direction along which the subject moves during a period elapsing between a time point at which the first image is captured and a time point at which the second image is captured.
21. A subject tracking device according to claim 20 , wherein:
the arithmetic operation unit determines through arithmetic operation the image-capturing conditions by using information corresponding to an area near a central area of the second tracking zone.
22. A subject tracking device according to claim 20 , further comprising:
a display unit that displays a mark indicating a central area within the second tracking zone.
23. A subject tracking method, comprising:
designating image data in an area where a subject is present within a first image as a first tracking zone;
tracking the subject by designating an area where image data corresponding to the image data in the first tracking zone are present in a second image obtained after the first image as a second tracking zone; and
determining through arithmetic operation image-capturing conditions by using information corresponding to a central area within the second tracking zone.
24. A subject tracking method according to claim 23 , wherein:
exposure calculation is executed by using image data over a central area of the second tracking zone.
25. A subject tracking method according to claim 23 , wherein:
focus detection calculation is executed in correspondence to a focus detection area closest to a center of the second tracking zone among a plurality of focus detection areas.
26. A subject tracking method according to claim 23 , wherein:
the image-capturing conditions are determined through arithmetic operation executed by using information corresponding to an area near a central area of the second tracking zone.
27. A subject tracking method according to claim 23 , wherein:
the image-capturing conditions are determined through arithmetic operation executed by using information corresponding to a direction along which the subject moves during a period elapsing between a time point at which the first image is captured and a time point at which the second image is captured.
28. An optical apparatus comprising:
a subject tracking device according to claim 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/316,686 US8433097B2 (en) | 2006-03-15 | 2011-12-12 | Subject tracking device, subject tracking method, subject tracking program product and optical device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006-070654 | 2006-03-15 | ||
JP2006070654 | 2006-03-15 | ||
JP2007020336A JP5045125B2 (en) | 2006-03-15 | 2007-01-31 | Subject tracking device and optical device |
JP2007-020336 | 2007-01-31 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/316,686 Continuation US8433097B2 (en) | 2006-03-15 | 2011-12-12 | Subject tracking device, subject tracking method, subject tracking program product and optical device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20070263904A1 true US20070263904A1 (en) | 2007-11-15 |
Family
ID=38683136
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/701,360 Abandoned US20070263904A1 (en) | 2006-03-15 | 2007-02-02 | Subject tracking device, subject tracking method, subject tracking program product and optical device |
US13/316,686 Active US8433097B2 (en) | 2006-03-15 | 2011-12-12 | Subject tracking device, subject tracking method, subject tracking program product and optical device |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/316,686 Active US8433097B2 (en) | 2006-03-15 | 2011-12-12 | Subject tracking device, subject tracking method, subject tracking program product and optical device |
Country Status (2)
Country | Link |
---|---|
US (2) | US20070263904A1 (en) |
JP (1) | JP5045125B2 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080002028A1 (en) * | 2006-06-30 | 2008-01-03 | Casio Computer Co., Ltd. | Imaging apparatus and computer readable recording medium |
US20080285809A1 (en) * | 2007-05-02 | 2008-11-20 | Nikon Corporation | Photographic subject tracking method, computer program product and photographic subject tracking device |
US20090009606A1 (en) * | 2007-07-06 | 2009-01-08 | Nikon Corporation | Tracking device, focus adjustment device, image-capturing device, and tracking method |
US20110043680A1 (en) * | 2009-08-18 | 2011-02-24 | Canon Kabushiki Kaisha | Auto focusing apparatus and control method |
US20110261225A1 (en) * | 2010-04-23 | 2011-10-27 | Niinami Norikatsu | Image capturing apparatus, method of detecting tracking object, and computer program product |
WO2012116347A1 (en) * | 2011-02-24 | 2012-08-30 | Qualcomm Incorporated | Auto-focus tracking |
US20130236057A1 (en) * | 2007-10-10 | 2013-09-12 | Samsung Electronics Co., Ltd. | Detecting apparatus of human component and method thereof |
US8970723B2 (en) | 2010-05-10 | 2015-03-03 | Fujitsu Limited | Device and method for image processing capable of tracking target object |
DE202015105376U1 (en) | 2015-10-12 | 2015-10-19 | Sick Ag | 3D camera for taking three-dimensional images |
US20160044229A1 (en) * | 2014-08-05 | 2016-02-11 | Samsung Electronics Co., Ltd. | Imaging sensor capable of detecting phase difference of focus |
EP3021072A1 (en) | 2014-11-14 | 2016-05-18 | Sick Ag | Lighting device and method for projecting an illumination pattern |
US10812774B2 (en) * | 2018-06-06 | 2020-10-20 | At&T Intellectual Property I, L.P. | Methods and devices for adapting the rate of video content streaming |
US11019361B2 (en) | 2018-08-13 | 2021-05-25 | At&T Intellectual Property I, L.P. | Methods, systems and devices for adjusting panoramic view of a camera for capturing video content |
US11190820B2 (en) | 2018-06-01 | 2021-11-30 | At&T Intellectual Property I, L.P. | Field of view prediction in live panoramic video streaming |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5044321B2 (en) * | 2006-09-13 | 2012-10-10 | 株式会社リコー | Imaging apparatus and subject detection method |
JP2009109839A (en) | 2007-10-31 | 2009-05-21 | Nikon Corp | Image tracking device and imaging device |
JP5115210B2 (en) * | 2008-01-24 | 2013-01-09 | 株式会社ニコン | Imaging device |
JP5121505B2 (en) * | 2008-02-29 | 2013-01-16 | 三洋電機株式会社 | Video camera |
JP5458530B2 (en) * | 2008-08-29 | 2014-04-02 | 株式会社ニコン | camera |
JP5359715B2 (en) * | 2009-09-10 | 2013-12-04 | 株式会社ニコン | Subject tracking device, subject tracking method, and camera |
JP2011164543A (en) * | 2010-02-15 | 2011-08-25 | Hoya Corp | Ranging-point selecting system, auto-focus system, and camera |
JP5482447B2 (en) * | 2010-05-25 | 2014-05-07 | リコーイメージング株式会社 | Ranging point determination system, autofocus system, and camera |
JP2013015807A (en) * | 2011-06-09 | 2013-01-24 | Nikon Corp | Focus detector and imaging apparatus |
JP5965653B2 (en) * | 2012-01-27 | 2016-08-10 | オリンパス株式会社 | TRACKING DEVICE AND TRACKING METHOD |
JP5930792B2 (en) * | 2012-03-26 | 2016-06-08 | キヤノン株式会社 | Imaging apparatus and control method thereof |
JP5767755B2 (en) * | 2012-11-05 | 2015-08-19 | 富士フイルム株式会社 | Image processing apparatus, imaging apparatus, image processing method, and program |
WO2014196388A1 (en) * | 2013-06-05 | 2014-12-11 | 富士フイルム株式会社 | Lens device |
JP5772933B2 (en) * | 2013-11-29 | 2015-09-02 | 株式会社ニコン | Imaging device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5196929A (en) * | 1989-07-05 | 1993-03-23 | Olympus Optical Co., Ltd. | Display system of camera having tracking apparatus |
US5418595A (en) * | 1993-02-02 | 1995-05-23 | Nikon Corporation | Camera having a subject tracking function and method therefor |
US5627586A (en) * | 1992-04-09 | 1997-05-06 | Olympus Optical Co., Ltd. | Moving body detection device of camera |
US5627856A (en) * | 1994-09-09 | 1997-05-06 | Omnipoint Corporation | Method and apparatus for receiving and despreading a continuous phase-modulated spread spectrum signal using self-synchronizing correlators |
US5896174A (en) * | 1992-09-11 | 1999-04-20 | Asahi Kogaku Kogyo Kabushiki Kaisha | Control system for inhibiting a calculating system in an automatic focusing device |
US20050264679A1 (en) * | 2004-05-26 | 2005-12-01 | Fujinon Corporation | Autofocus system |
US7315631B1 (en) * | 2006-08-11 | 2008-01-01 | Fotonation Vision Limited | Real-time face tracking in a digital image acquisition device |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2511409B2 (en) * | 1986-03-31 | 1996-06-26 | ミノルタ株式会社 | Automatic focus adjustment device |
JP2631215B2 (en) * | 1987-11-04 | 1997-07-16 | キヤノン株式会社 | Exposure control device |
JPH05145822A (en) * | 1991-03-26 | 1993-06-11 | Olympus Optical Co Ltd | Moving body tracking device |
JPH0580248A (en) | 1991-09-19 | 1993-04-02 | Ricoh Co Ltd | Automatic focusing device |
JPH05288982A (en) | 1992-04-09 | 1993-11-05 | Olympus Optical Co Ltd | Gazimg point selection device |
JP3653739B2 (en) | 1993-10-08 | 2005-06-02 | 株式会社ニコン | Camera with subject tracking function |
JPH0829674A (en) * | 1994-07-20 | 1996-02-02 | Nikon Corp | Automatic area selection camera |
JP4054422B2 (en) * | 1997-11-13 | 2008-02-27 | キヤノン株式会社 | Camera and interchangeable lens device |
JP4185271B2 (en) * | 2001-09-25 | 2008-11-26 | 日本放送協会 | Position detection device and position detection program |
JP4218446B2 (en) * | 2003-07-03 | 2009-02-04 | 株式会社ニコン | Electronic camera |
KR101108634B1 (en) * | 2004-01-06 | 2012-01-31 | 소니 주식회사 | Image processing device and image processing method and recording medium |
WO2006082967A1 (en) * | 2005-02-07 | 2006-08-10 | Matsushita Electric Industrial Co., Ltd. | Imaging device |
2007
- 2007-01-31 JP JP2007020336A patent/JP5045125B2/en active Active
- 2007-02-02 US US11/701,360 patent/US20070263904A1/en not_active Abandoned

2011
- 2011-12-12 US US13/316,686 patent/US8433097B2/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5196929A (en) * | 1989-07-05 | 1993-03-23 | Olympus Optical Co., Ltd. | Display system of camera having tracking apparatus |
US5627586A (en) * | 1992-04-09 | 1997-05-06 | Olympus Optical Co., Ltd. | Moving body detection device of camera |
US5896174A (en) * | 1992-09-11 | 1999-04-20 | Asahi Kogaku Kogyo Kabushiki Kaisha | Control system for inhibiting a calculating system in an automatic focusing device |
US5418595A (en) * | 1993-02-02 | 1995-05-23 | Nikon Corporation | Camera having a subject tracking function and method therefor |
US5627856A (en) * | 1994-09-09 | 1997-05-06 | Omnipoint Corporation | Method and apparatus for receiving and despreading a continuous phase-modulated spread spectrum signal using self-synchronizing correlators |
US20050264679A1 (en) * | 2004-05-26 | 2005-12-01 | Fujinon Corporation | Autofocus system |
US7315631B1 (en) * | 2006-08-11 | 2008-01-01 | Fotonation Vision Limited | Real-time face tracking in a digital image acquisition device |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080002028A1 (en) * | 2006-06-30 | 2008-01-03 | Casio Computer Co., Ltd. | Imaging apparatus and computer readable recording medium |
US8284256B2 (en) * | 2006-06-30 | 2012-10-09 | Casio Computer Co., Ltd. | Imaging apparatus and computer readable recording medium |
US20080285809A1 (en) * | 2007-05-02 | 2008-11-20 | Nikon Corporation | Photographic subject tracking method, computer program product and photographic subject tracking device |
US8144934B2 (en) * | 2007-05-02 | 2012-03-27 | Nikon Corporation | Photographic subject tracking method, computer program product and photographic subject tracking device |
US8355047B2 (en) * | 2007-07-06 | 2013-01-15 | Nikon Corporation | Tracking device, focus adjustment device, image-capturing device, and tracking method |
US20090009606A1 (en) * | 2007-07-06 | 2009-01-08 | Nikon Corporation | Tracking device, focus adjustment device, image-capturing device, and tracking method |
US9400935B2 (en) * | 2007-10-10 | 2016-07-26 | Samsung Electronics Co., Ltd. | Detecting apparatus of human component and method thereof |
US20130236057A1 (en) * | 2007-10-10 | 2013-09-12 | Samsung Electronics Co., Ltd. | Detecting apparatus of human component and method thereof |
US20110043680A1 (en) * | 2009-08-18 | 2011-02-24 | Canon Kabushiki Kaisha | Auto focusing apparatus and control method |
US8395695B2 (en) * | 2009-08-18 | 2013-03-12 | Canon Kabushiki Kaisha | Auto focusing apparatus and control method capable of selecting a main area and/or an auxiliary area of a frame as a focus detection area |
US20110261225A1 (en) * | 2010-04-23 | 2011-10-27 | Niinami Norikatsu | Image capturing apparatus, method of detecting tracking object, and computer program product |
US8599268B2 (en) * | 2010-04-23 | 2013-12-03 | Ricoh Company, Limited | Image capturing apparatus, method of detecting tracking object, and computer program product |
US8970723B2 (en) | 2010-05-10 | 2015-03-03 | Fujitsu Limited | Device and method for image processing capable of tracking target object |
WO2012116347A1 (en) * | 2011-02-24 | 2012-08-30 | Qualcomm Incorporated | Auto-focus tracking |
US9077890B2 (en) | 2011-02-24 | 2015-07-07 | Qualcomm Incorporated | Auto-focus tracking |
US20160044229A1 (en) * | 2014-08-05 | 2016-02-11 | Samsung Electronics Co., Ltd. | Imaging sensor capable of detecting phase difference of focus |
US9538067B2 (en) * | 2014-08-05 | 2017-01-03 | Samsung Electronics Co., Ltd. | Imaging sensor capable of detecting phase difference of focus |
EP3021072A1 (en) | 2014-11-14 | 2016-05-18 | Sick Ag | Lighting device and method for projecting an illumination pattern |
DE202015105376U1 (en) | 2015-10-12 | 2015-10-19 | Sick Ag | 3D camera for taking three-dimensional images |
US11190820B2 (en) | 2018-06-01 | 2021-11-30 | At&T Intellectual Property I, L.P. | Field of view prediction in live panoramic video streaming |
US11641499B2 (en) | 2018-06-01 | 2023-05-02 | At&T Intellectual Property I, L.P. | Field of view prediction in live panoramic video streaming |
US10812774B2 (en) * | 2018-06-06 | 2020-10-20 | At&T Intellectual Property I, L.P. | Methods and devices for adapting the rate of video content streaming |
US11019361B2 (en) | 2018-08-13 | 2021-05-25 | At&T Intellectual Property I, L.P. | Methods, systems and devices for adjusting panoramic view of a camera for capturing video content |
US11671623B2 (en) | 2018-08-13 | 2023-06-06 | At&T Intellectual Property I, L.P. | Methods, systems and devices for adjusting panoramic view of a camera for capturing video content |
Also Published As
Publication number | Publication date |
---|---|
US20120087544A1 (en) | 2012-04-12 |
US8433097B2 (en) | 2013-04-30 |
JP5045125B2 (en) | 2012-10-10 |
JP2007282188A (en) | 2007-10-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8433097B2 (en) | Subject tracking device, subject tracking method, subject tracking program product and optical device | |
JP6512810B2 (en) | Image pickup apparatus, control method and program | |
US8649574B2 (en) | Imaging apparatus, control method of imaging apparatus, and computer program | |
US8670064B2 (en) | Image capturing apparatus and control method therefor | |
JP2003344891A (en) | Automatic photographing mode setting camera | |
US8411159B2 (en) | Method of detecting specific object region and digital camera | |
US20120242888A1 (en) | Image Recognition Device, Focus Adjustment Device, Image-Capturing Device, and Image Recognition Method | |
JP2002271654A (en) | Electronic camera | |
JP2013013050A (en) | Imaging apparatus and display method using imaging apparatus | |
US7978254B2 (en) | Image capturing apparatus, its controlling method, and program | |
JP2009109839A (en) | Image tracking device and imaging device | |
JPH09149311A (en) | Image pickup device | |
JP4807582B2 (en) | Image processing apparatus, imaging apparatus, and program thereof | |
JP4888249B2 (en) | Focus detection apparatus and imaging apparatus | |
JP2005223658A (en) | Digital camera | |
CN107800956B (en) | Image pickup apparatus, control method, and storage medium | |
JP2015167310A (en) | Imaging apparatus and imaging method | |
JP5070856B2 (en) | Imaging device | |
JP2007259004A (en) | Digital camera, image processor, and image processing program | |
JP4985155B2 (en) | Focus adjustment device and imaging device | |
JP3415383B2 (en) | Camera with monitor display | |
JP2004117195A (en) | Digital camera with speed measuring function | |
JP2017021177A (en) | Range-finding point upon lens vignetting, range-finding area transition method | |
JP6493746B2 (en) | Image tracking device and image tracking method | |
JPH07301742A (en) | Camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: NIKON CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: MURAMATSU, KEIKO; REEL/FRAME: 018900/0572. Effective date: 20070202 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |