WO2012124669A1 - Imaging apparatus and control method therefor - Google Patents
Imaging apparatus and control method therefor
- Publication number
- WO2012124669A1 (PCT/JP2012/056348)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- main
- imaging
- image
- region
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/34—Systems for automatic generation of focusing signals using different areas in a pupil plane
- G02B7/343—Systems for automatic generation of focusing signals using different areas in a pupil plane using light beam separating prisms
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/672—Focus control based on electronic image sensor signals based on the phase difference signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/703—SSIS architectures incorporating pixels for producing signals other than image signals
- H04N25/704—Pixels specially adapted for focusing, e.g. phase difference pixel sets
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
Definitions
- the present invention relates to an imaging apparatus having a continuous shooting function for continuously capturing a plurality of still images and a subject tracking function, and a control method therefor.
- Subject brightness and subject position change over time. Therefore, when continuously shooting still images with an imaging apparatus having a continuous shooting function, the exposure and focus must be adjusted each time a still image is captured. There are imaging apparatuses having a function of performing continuous shooting while automatically adjusting the exposure and focus, using an exposure adjustment value for the next still image obtained from the still image captured immediately before, or using information obtained from an AE (auto exposure) sensor and an AF (auto focus) sensor provided separately from the image sensor. There are also cameras equipped with functions such as tracking a subject using image data obtained during continuous shooting and detecting face information of a person.
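- The per-frame adjustment cycle described above can be pictured as a simple loop. The following is a purely illustrative sketch: the `Camera` class, its methods, and the adjustment formulas are hypothetical stand-ins, not an actual camera interface.

```python
# Illustrative sketch of a continuous shooting loop in which exposure and
# focus are re-adjusted before every frame. The Camera class and its
# methods are hypothetical, introduced only for this example.

class Camera:
    def __init__(self):
        self.exposure = 0.0   # EV correction applied to the next frame
        self.focus = 0.0      # lens position

    def measure_scene(self):
        # A real camera would read its AE/AF sensors here; fixed example
        # values are returned so the sketch is runnable.
        return {"brightness": 0.5, "defocus": -0.1}

    def capture_still(self):
        return {"exposure": self.exposure, "focus": self.focus}

def continuous_shoot(camera, frames):
    images = []
    for _ in range(frames):
        scene = camera.measure_scene()
        # Subject brightness and position change over time, so both
        # settings are updated between every pair of frames.
        camera.exposure += 0.5 - scene["brightness"]
        camera.focus += scene["defocus"]
        images.append(camera.capture_still())
    return images

shots = continuous_shoot(Camera(), 3)
print(len(shots))  # 3
```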
- Patent Document 1 describes a single-lens reflex camera in which, in the continuous shooting mode, a subject is detected from the still image obtained immediately before, and the still image of the next frame is captured with the focus following that subject.
- the present invention has been made in view of such problems, and an object of the present invention is to provide a mechanism for realizing an improvement in a subject tracking function during continuous shooting of an imaging apparatus.
- the invention according to claim 1 of the present invention is an imaging apparatus comprising: imaging means for capturing a plurality of main images in a continuous shooting mode and capturing a plurality of auxiliary images between the capture of one main image and the capture of the next main image; main subject determining means for determining a main subject; first subject tracking processing means for detecting a region where the same subject as the main subject exists from a first region that is a part of a first auxiliary image among the plurality of auxiliary images; and second subject tracking processing means for detecting a region where the same subject as the main subject exists from a second region, wider than the first region, of a second auxiliary image among the plurality of auxiliary images; wherein the detection result of the first subject tracking processing means is used for focus adjustment performed before the next main image is captured, and the detection result of the second subject tracking processing means is used for detecting the region where the same subject as the main subject exists in an auxiliary image captured after the next main image is captured.
- the accuracy of the subject tracking function during continuous shooting of the imaging apparatus can be improved.
- FIG. 1 is a schematic diagram illustrating an example of the mechanical configuration of the imaging apparatus according to the first embodiment.
- FIG. 3 is a schematic diagram illustrating an example of the electrical configuration of the imaging apparatus according to the first embodiment.
- FIG. 4 is a flowchart illustrating an example of the processing procedure in the continuous shooting mode of the imaging apparatus according to the first embodiment.
- FIG. 5 is a diagram for explaining the tracking range according to the first embodiment.
- FIG. 6 is a diagram for explaining the timing at which each process according to the first embodiment is performed.
- FIG. 7 is a schematic diagram showing an example of the mechanical configuration of the imaging apparatus according to the second embodiment.
- FIG. 10 is a flowchart illustrating an example of the processing procedure in the continuous shooting mode of the imaging apparatus according to the second embodiment.
- A diagram for explaining the tracking range according to the second embodiment is also included.
- FIG. 1 is a schematic diagram illustrating an example of a mechanical configuration of a digital single-lens reflex camera, which is an imaging apparatus according to the first embodiment.
- a photographing lens unit 202 is mounted on the front surface of the camera body 201.
- the camera body 201 can replace the photographic lens unit 202 to be mounted with another photographic lens unit.
- the camera body 201 and the photographic lens unit 202 communicate via a mount contact group (not shown).
- Inside the taking lens unit 202 are a lens group 213 and a diaphragm 214.
- the camera body 201 can adjust the amount of light taken into the camera by adjusting the aperture diameter of the diaphragm 214 through communication control via the mount contact group, and can adjust the focus position by adjusting the position of the lens group 213.
- the main mirror 203 is a half mirror.
- the main mirror 203 is obliquely disposed in the photographing optical path except during the main photographing in which a still image is captured. In this state, a part of the light flux from the photographing lens unit 202 is reflected by the main mirror 203 and guided to the finder optical system, while the remaining light flux that has passed through the main mirror 203 is reflected by the sub mirror 204 and guided to the AF unit 205.
- the AF unit 205 includes a phase difference detection type AF sensor.
- the AF unit 205 forms the secondary image plane of the taking lens unit 202 on the focus detection line sensor.
- the camera body 201 detects the focus adjustment state of the taking lens unit 202 from the output of the focus detection line sensor and outputs a control signal for driving the lens group 213 based on the detection result, thereby performing automatic focus adjustment. Since focus detection by the phase difference detection method is a known technique, a description of the specific control is omitted here.
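- Although the specific control is omitted above, phase-difference focus detection generally estimates the relative shift between two line-sensor signals formed by light passing through different pupil areas. The following is a minimal, self-contained sketch of that idea, not the actual algorithm of the AF unit 205; the signals and the SAD-based search are invented for illustration.

```python
# Minimal phase-difference sketch: the defocus-dependent shift between
# two line-sensor signals (the "A image" and "B image") is estimated by
# minimising the mean absolute difference over candidate shifts. The
# sign of the result indicates the direction of defocus.

def best_shift(a, b, max_shift):
    # Compare the overlapping portions of a and b at each candidate shift.
    best, best_err = 0, float("inf")
    n = len(a)
    for s in range(-max_shift, max_shift + 1):
        err, count = 0.0, 0
        for i in range(n):
            j = i + s
            if 0 <= j < n:
                err += abs(a[i] - b[j])
                count += 1
        err /= count
        if err < best_err:
            best, best_err = s, err
    return best

base = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0]   # synthetic "A image"
shifted = base[2:] + [0, 0]             # "B image" displaced by 2 pixels
print(best_shift(base, shifted, 4))     # -2
```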
- FIG. 2 is a schematic diagram showing the arrangement of distance measuring points of the imaging apparatus according to the first embodiment.
- the AF unit 205 has, for example, a distance measuring point layout shown in FIG.
- the pentaprism 207 changes the finder optical path.
- the photographer can visually recognize the imaging target by observing the focus plate 206 through the eyepiece 209.
- the AE unit (AE sensor) 208 observes the brightness of the subject; an output related to the subject brightness is obtained from the outputs of a large number of two-dimensionally arranged pixels.
- the AE sensor 208 observes the outer frame area (imaging range) shown in FIG. 2, and pixels corresponding to the respective color filters of R (red), G (green), and B (blue) are arranged in stripes for each color filter.
- the brightness of the subject is observed by the AE sensor 208, and at the same time, subject tracking processing is performed using image data obtained by the AE sensor 208.
- a focal plane shutter 210 and an image sensor 211 are provided inside the camera body 201.
- the main mirror 203 and the sub mirror 204 are retracted so as to be adjacent to the focus plate 206 so as not to block the light beam, and the focal plane shutter 210 is opened, so that the image sensor 211 is exposed.
- Hereinafter, imaging with the image sensor 211 for the purpose of storing image data is referred to as “main imaging”, and imaging with the AE sensor 208 is referred to as “auxiliary imaging”.
- Also, image data generated by the main imaging is referred to as “main image data”, and image data generated by the auxiliary imaging is referred to as “auxiliary image data”. That is, the image sensor 211 that generates the main image data functions as a first image sensor, and the AE sensor 208 that generates the auxiliary image data functions as a second image sensor.
- the display unit 212 displays shooting information and a shot image so that the user can check the contents.
- FIG. 3 is a schematic diagram illustrating an example of an electrical configuration of the imaging apparatus 200 according to the first embodiment.
- in FIG. 3, components identical to those in FIG. 1 are denoted by the same reference numerals.
- the operation detection unit 308 outputs the SW1 signal to the system control unit 303 at the moment the release button is pressed halfway, and outputs the SW2 signal to the system control unit 303 at the moment the release button is fully pressed.
- Hereinafter, the state in which the user keeps the release button half-pressed is referred to as the “SW1 holding state”, and the state in which the release button is kept fully pressed is referred to as the “SW2 holding state”.
- the operation detection unit 308 outputs the SW1 release signal to the system control unit 303 at the moment the user releases the release button from the SW1 holding state, and outputs the SW2 release signal to the system control unit 303 at the moment the release button is released from the SW2 holding state.
- the mirror control unit 309 controls the operations of the main mirror 203 and the sub mirror 204 based on the control signal sent from the system control unit 303.
- when the system control unit 303 receives the SW1 signal from the operation detection unit 308 in the mirror-down state in the continuous shooting mode, it reads the accumulated data from the line sensor corresponding to each distance measuring point of the AF unit 205, selects the distance measuring point to be focused, and performs the focus adjustment calculation. Then, the system control unit 303 sends a lens driving signal based on the calculation result to the lens driving unit 314.
- the lens driving unit 314 moves the lens group 213 based on the lens driving signal sent from the system control unit 303 to perform a focusing operation.
- the image sensor 211 converts the light incident through the lens group 213 into an electrical signal, generates image data, and outputs the image data to the system control unit 303.
- the system control unit 303 outputs the image data output from the image sensor 211 to the display control unit 312 and writes (stores) it in the image storage device 311.
- the display control unit 312 displays an image based on the image data on the display based on the output of the image data from the system control unit 303.
- the main memory 307 is a storage device for storing data necessary for calculations performed by the system control unit 303 and the AE image processing unit 304.
- the AE image processing unit 304 performs an exposure adjustment calculation based on the image data read from the AE sensor 208 and outputs the calculation result to the system control unit 303.
- the system control unit 303 sends an aperture control signal to the aperture control unit 313 based on the result of the exposure adjustment calculation output from the AE image processing unit 304.
- the system control unit 303 sends a shutter control signal to the shutter control unit 310 at the time of release.
- the aperture control unit 313 drives the aperture 214 based on the aperture control signal received from the system control unit 303.
- the shutter control unit 310 drives the focal plane shutter 210 based on the shutter control signal sent from the system control unit 303.
- the AE image processing unit 304 performs subject tracking processing in the continuous shooting mode, and detects the main subject position in the auxiliary image data read from the AE sensor 208.
- the subject tracking process described above has two types of algorithms, a first subject tracking process and a second subject tracking process. Among these, the result of the first subject tracking process is output to the system control unit 303, and the result of the second subject tracking process is recorded in the main memory 307. Two types of subject tracking processing algorithms will be described later.
- the system control unit 303 selects one ranging point from the ranging point group in FIG. 2 based on the result of the first subject tracking process output from the AE image processing unit 304. Then, a lens driving signal is sent to the lens driving unit 314 based on the result of the focus adjustment calculation of the selected distance measuring point.
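- The selection of one ranging point from the result of the first subject tracking process can be sketched as a nearest-point lookup. The grid of points below is invented for illustration and is not the layout of FIG. 2; the function name is hypothetical.

```python
# Sketch of ranging-point selection from a tracking result: the distance
# measuring point nearest the detected subject position is chosen, and
# the focus adjustment calculation then uses that point.

def nearest_ranging_point(points, subject_pos):
    sy, sx = subject_pos
    # Squared Euclidean distance is sufficient for choosing the minimum.
    return min(points, key=lambda p: (p[0] - sy) ** 2 + (p[1] - sx) ** 2)

# Hypothetical 3x3 grid of ranging points, given as (y, x) coordinates.
points = [(y, x) for y in (15, 30, 45) for x in (20, 45, 70)]
print(nearest_ranging_point(points, (33, 68)))  # (30, 70)
```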
- FIG. 4 is a flowchart showing an example of a processing procedure in the continuous shooting mode of the imaging apparatus according to the first embodiment.
- the imaging apparatus according to the present embodiment captures a plurality of frames of still images in the continuous shooting mode.
- Steps S401 to S405 are processes performed in the SW1 holding state, and correspond to a continuous shooting preparation operation.
- Steps S406 to S416 are processes performed when the release button is pressed and the SW2 is held after completion of the above-described continuous shooting preparation operation.
- step S401 the system control unit 303 selects a distance measuring point and performs a focus adjustment calculation based on the output of the AF unit 205. For example, the system control unit 303 selects a distance measuring point that is superimposed on a subject that is considered to be present at a position close to the imaging apparatus 200.
- the AE image processing unit 304 reads auxiliary image data from the AE sensor 208, and extracts a certain region centered on the distance measuring point selected in step S401. Then, the AE image processing unit 304 (or the system control unit 303) records the image data of the extracted area and the center coordinates of the extracted area in the auxiliary image data in the main memory 307. These are used for a tracking process to be described later.
- Hereinafter, the image data of the extracted area recorded in the main memory 307 is referred to as “template image data”, and the center coordinates of the extracted area are referred to as the “previous subject position”.
- the template image data relates to the main subject, and the AE image processing unit 304 that extracts the template image data constitutes a main subject determination unit that determines the main subject before the start of continuous shooting.
- step S403 the AE image processing unit 304 performs exposure calculation using the auxiliary image data read from the AE sensor 208 in step S402, and outputs the result to the system control unit 303.
- step S404 the system control unit 303 sends a control signal to the aperture control unit 313 and the lens driving unit 314 based on the focus adjustment calculation result in step S401 and the exposure calculation result in step S403. Accordingly, the aperture control unit 313 adjusts the aperture (exposure) based on the control signal, and the lens driving unit 314 adjusts the focus based on the control signal.
- the next step S405 is a step of waiting for the user's next operation. Specifically, in step S405, the system control unit 303 stands by until the user releases the release button (SW1 release) or pushes the release button (SW2 input). If the SW1 release signal or the SW2 signal is input from the operation detection unit 308, the system control unit 303 proceeds to step S406.
- in step S406, when the signal that cancels the standby state of step S405 is input, the system control unit 303 determines whether or not the input signal is the SW2 signal. If the input signal is not the SW2 signal (i.e., it is the SW1 release signal), the continuous shooting process in the flowchart of FIG. 4 is terminated; if the input signal is the SW2 signal, continuous shooting is started in step S407.
- steps S407 to S409 and steps S415 to S416 are performed in parallel. In this embodiment, the system control unit 303 processes steps S407 to S409 while the AE image processing unit 304 processes steps S415 to S416, thereby achieving parallel processing.
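- The division of labour between the two processing units can be pictured, in a deliberately simplified form, as two concurrent workers. This is a conceptual sketch only: the real apparatus uses dedicated hardware units, not threads, and the step labels below are just strings for illustration.

```python
# Simplified sketch of the parallelism described above: one worker plays
# the role of the system control unit (steps S407-S409, main imaging)
# while another plays the AE image processing unit (steps S415-S416,
# second subject tracking).

import threading

log = []
lock = threading.Lock()

def main_imaging():
    for step in ("S407 mirror up", "S408 expose/readout", "S409 mirror down"):
        with lock:
            log.append(step)

def second_tracking():
    for step in ("S415 frame check", "S416 second subject tracking"):
        with lock:
            log.append(step)

t1 = threading.Thread(target=main_imaging)
t2 = threading.Thread(target=second_tracking)
t1.start(); t2.start()
t1.join(); t2.join()
print(len(log))  # 5 entries; the interleaving order is not guaranteed
```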
- Steps S407 to S409 are processes in which the system control unit 303 performs main imaging.
- in step S407, the system control unit 303 sends a control signal to the mirror control unit 309 to raise the main mirror 203 and the sub mirror 204, retracting them from the photographing optical path for the main imaging.
- step S408 the system control unit 303 sends a control signal to the shutter control unit 310 to perform release, and exposes the image sensor 211 to perform the main imaging. Then, the system control unit 303 reads the image data generated by the image sensor 211 and records (stores) it in the image storage device 311.
- step S409 the system control unit 303 sends a control signal to the mirror control unit 309, lowers the main mirror 203 and the sub mirror 204, and enters the optical path at the time of actual photographing.
- steps S415 and S416 will be described later.
- in step S410, the system control unit 303 determines whether or not the SW2 holding state has been released. If the SW2 release signal has already been received and the SW2 holding state has been released, the continuous shooting process in the flowchart of FIG. 4 is terminated; if the SW2 release signal has not been received and the SW2 holding state has not been released, the process proceeds to step S411.
- step S411 the AE image processing unit 304 detects the main subject position by the first subject tracking process.
- the first subject tracking process is performed by a template matching method.
- step S411 first, the AE sensor 208 accumulates charges only for a period (first accumulation period) shorter than the accumulation period required for performing an appropriate exposure adjustment calculation.
- the AE image processing unit 304 reads the first auxiliary image data from the AE sensor 208 and the template image data from the main memory 307.
- the AE image processing unit 304 detects the position of the main subject in the first auxiliary image data by obtaining a correlation between the two pieces of image data. Then, the detection result is output to the system control unit 303.
- by making the first accumulation period for obtaining the first auxiliary image data shorter than the accumulation period required for performing an appropriate exposure adjustment calculation, the timing at which the process of detecting the position of the main subject starts can be advanced.
- the AE sensor 208 constitutes auxiliary image capturing means for capturing a plurality of pieces of auxiliary image data between one main imaging operation and the next in the continuous shooting mode. Further, the result of detecting the main subject position by the first subject tracking process is used, for example, for the focus adjustment calculation for the next main imaging.
- FIG. 5 is a diagram for explaining the tracking range according to the first embodiment.
- in the first subject tracking process, the area to be matched is limited, based on the immediately preceding subject position read from the main memory 307, to a range of one distance measuring point vertically and two distance measuring points to the left and right (the tracking range of the first subject tracking process). That is, in the first subject tracking process, the position of the main subject is detected within only a partial range of the auxiliary image data. Further, matching is performed with the resolution of the auxiliary image data and the template image data converted to 1/2. This allows the matching to be processed at high speed and secures time for the subsequent lens driving and the like.
- Hereinafter, the first auxiliary image data read from the AE sensor 208 in step S411 is referred to as “auxiliary image data 1”, and the range in which matching is performed is referred to as the “tracking range”.
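- The restricted, half-resolution matching of the first subject tracking process can be sketched as follows. The image sizes, window radius, synthetic frame, and function names are all invented for illustration; the real process matches against AE sensor data within the tracking range described above.

```python
# Sketch of the first subject tracking process: template matching by sum
# of absolute differences (SAD), restricted to a small window around the
# previous subject position and run at half resolution for speed.

def downsample(img):
    # Halve resolution by taking every second row and column.
    return [row[::2] for row in img[::2]]

def match(image, template, center, radius):
    # Search only within +/- radius of the previous subject position.
    th, tw = len(template), len(template[0])
    best, best_err = center, float("inf")
    cy, cx = center
    for y in range(max(0, cy - radius), min(len(image) - th, cy + radius) + 1):
        for x in range(max(0, cx - radius), min(len(image[0]) - tw, cx + radius) + 1):
            err = sum(abs(image[y + i][x + j] - template[i][j])
                      for i in range(th) for j in range(tw))
            if err < best_err:
                best, best_err = (y, x), err
    return best

# 16x16 synthetic frame with a bright 4x4 "subject" at row 6, column 8.
frame = [[0] * 16 for _ in range(16)]
for i in range(4):
    for j in range(4):
        frame[6 + i][8 + j] = 9
template = [[9, 9], [9, 9]]              # subject as seen at half resolution
half = downsample(frame)
print(match(half, template, (2, 2), 3))  # position in half-resolution pixels
```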
- in step S412, the AE sensor 208 accumulates charges for the second accumulation period, which is obtained by subtracting the first accumulation period from the accumulation period required for performing a proper exposure adjustment calculation.
- the AE image processing unit 304 reads the second auxiliary image data from the AE sensor 208 and generates new composite auxiliary image data by adding it to the auxiliary image data 1 read from the AE sensor 208 in step S411.
- the AE image processing unit 304 performs an exposure adjustment calculation using the generated composite auxiliary image data, and outputs the exposure adjustment calculation result to the system control unit 303.
- Hereinafter, the auxiliary image data read from the AE sensor 208 in step S412 is referred to as “auxiliary image data 2”, and the combined auxiliary image data of the auxiliary image data 1 and the auxiliary image data 2 is referred to as “auxiliary image data 12”.
- the AE image processing unit 304 can perform an accurate exposure adjustment calculation by using the auxiliary image data 12. Furthermore, by combining the auxiliary image data, a more stable exposure calculation can be expected owing to reduced noise and an increased amount of light, compared with using uncombined image data.
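- The exposure path can be sketched as a pixel-wise sum of the two short accumulations followed by a brightness calculation. The EV formula, the mid-grey target of 0.18, and the tiny frames below are illustrative assumptions, not values taken from the patent.

```python
# Sketch of combining two short-accumulation auxiliary frames into
# "auxiliary image data 12", which then has enough signal for an
# exposure calculation.

import math

def combine(frame1, frame2):
    # Pixel-wise addition of the two short accumulations.
    return [[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(frame1, frame2)]

def exposure_correction(frame, target=0.18):
    # Mean luminance versus an assumed mid-grey target, expressed in EV.
    flat = [p for row in frame for p in row]
    mean = sum(flat) / len(flat)
    return math.log2(target / mean)

aux1 = [[0.04, 0.05], [0.05, 0.04]]   # first short accumulation
aux2 = [[0.05, 0.04], [0.04, 0.05]]   # second short accumulation
aux12 = combine(aux1, aux2)
print(round(exposure_correction(aux12), 2))  # 1.0 (open up by one stop)
```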
- step S413 the system control unit 303 performs a focus adjustment calculation on the distance measuring point of the subject position detected in step S411.
- step S414 the system control unit 303 sends a control signal to the aperture control unit 313 and the lens driving unit 314 based on the calculation results of step S412 and step S413, and performs exposure adjustment and focus adjustment.
- step S414 When step S414 is completed, the release operation is started.
- the system control unit 303 performs steps S407 to S409, and the AE image processing unit 304 performs steps S415 to S416.
- step S415 the AE image processing unit 304 determines whether or not the imaging is the first frame. If the result of this determination is that imaging is not in the first frame (imaging is in the second and subsequent frames), processing proceeds to step S416.
- in step S416, the AE image processing unit 304 detects the main subject position by the second subject tracking process, and verifies and corrects the result of the first subject tracking process performed immediately before (in the immediately preceding frame).
- This second subject tracking process is performed by the template matching method in the same manner as the first subject tracking process.
- the AE image processing unit 304 detects the main subject position by correlating the auxiliary image data 2 read in step S412 with the template image data recorded in the main memory 307.
- in the second subject tracking process, the area to be matched is the entire auxiliary image data 2, as shown in FIG. 5, and the resolution is kept as it is. This makes it possible to obtain a tracking result that covers a wider range and has higher accuracy than the first subject tracking process.
- using this result, the first subject tracking result obtained in the immediately preceding step S411 is verified and corrected.
- Specifically, the following three points are verified.
- first, the entire auxiliary image data 2 is searched by the second subject tracking process to check whether the main subject exists outside the tracking range of the first subject tracking process.
- second, in the first subject tracking process, when a subject similar to the main subject exists within the tracking range, the reduced resolution of the image data may make the two indistinguishable, so the tracking target may be mistaken. Therefore, by performing the second subject tracking process using the auxiliary image data 2, whose resolution has not been lowered, the main subject and the similar subject are discriminated, and whether the result of the first subject tracking process is correct is verified.
- third, the second subject tracking process, which uses higher resolution, detects the position of the subject with higher accuracy.
- if the verification finds that the result of the first subject tracking process is incorrect, the immediately preceding subject position information recorded in the main memory 307 is corrected using the result of the second subject tracking process.
- when the subject detected by the first subject tracking process and the subject detected by the second subject tracking process are the same but their positions deviate by more than a predetermined threshold, it is also possible to predict the movement of the subject based on the change in the subject position.
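- The verification and correction step can be sketched as a distance check between the two tracking results. The threshold value, coordinates, and the `memory` dictionary standing in for the main memory 307 are all illustrative assumptions.

```python
# Sketch of the verification step: if the fast first tracking result and
# the slower, more accurate second tracking result disagree by more than
# a threshold, the stored "previous subject position" is overwritten with
# the second result so that the next tracking range is placed correctly.

def verify_and_correct(first_pos, second_pos, memory, threshold=4.0):
    dy = first_pos[0] - second_pos[0]
    dx = first_pos[1] - second_pos[1]
    if (dy * dy + dx * dx) ** 0.5 > threshold:
        # First tracking is judged wrong: adopt the second result.
        memory["previous_subject_position"] = second_pos
        return False
    return True

memory = {"previous_subject_position": (10, 12)}
ok = verify_and_correct((10, 12), (10, 20), memory)
print(ok, memory["previous_subject_position"])  # False (10, 20)
```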
- if it is determined in step S415 that the image captured is the first frame, or when step S416 is completed, the process proceeds to step S410 upon completion of steps S407 to S409, and the subsequent processing is performed.
- FIG. 6 is a diagram for explaining the timing at which each process is performed in the present embodiment.
- the AE image processing unit 304 causes the AE sensor 208 to accumulate charges for the first accumulation period and reads the auxiliary image data 1. Since the auxiliary image data 1 is set to have a short accumulation period, the luminance is insufficient to perform the exposure adjustment calculation.
- the AE image processing unit 304 performs development processing on the read auxiliary image data 1 and performs first subject tracking processing using the auxiliary image data 1 subjected to the development processing.
- the result of the first subject tracking process is output to the system control unit 303, and a new distance measuring point is selected based on this result.
- the system control unit 303 calculates the focus adjustment state of the photographing lens unit 202 at the newly selected distance measuring point from the output of the focus detection line sensor transmitted from the AF unit 205, and drives the lens group 213.
- immediately after the first accumulation period, the AE image processing unit 304 causes the AE sensor 208 to accumulate charges for the second accumulation period and reads out the auxiliary image data 2. Since the auxiliary image data 2 also has a short accumulation period, its luminance alone is insufficient for performing the exposure adjustment calculation.
- the AE image processing unit 304 performs a development process on the read auxiliary image data 2 and combines the auxiliary image data 1 and the auxiliary image data 2 subjected to the development process. By synthesizing the auxiliary image data 1 and the auxiliary image data 2, it is possible to obtain auxiliary image data 12 having a brightness necessary for performing an appropriate exposure adjustment calculation.
- the AE image processing unit 304 performs an exposure adjustment calculation using the auxiliary image data 12 and outputs the calculation result to the system control unit 303.
- the system control unit 303 sends a control signal to the aperture control unit 313 and the lens driving unit 314 based on the results of the focus adjustment calculation and the exposure adjustment calculation, and performs exposure adjustment and focus adjustment. Then, after completing the exposure adjustment and the focus adjustment, the system control unit 303 raises the main mirror 203 and the sub mirror 204 and performs the main imaging.
- in parallel, the AE image processing unit 304 uses the auxiliary image data 2 to perform the second subject tracking process, which has higher accuracy than the first subject tracking process. Because the second subject tracking process requires more time than the first and starts later, its result cannot be reflected in the main imaging performed immediately afterward. The next main imaging is therefore started while the second subject tracking process by the AE image processing unit 304 is still in progress.
- nevertheless, since this result allows the “tracking range” for the subsequent first subject tracking process to be set correctly, it contributes to improving the accuracy of the first subject tracking processes performed thereafter.
- as described above, by performing the subject tracking process using the auxiliary image data 1 obtained with a short accumulation period, the processing time for the subject tracking process can be secured. The auxiliary image data 1 and the subsequently obtained auxiliary image data 2 are then combined, and the combined auxiliary image data 12 is used for the exposure adjustment calculation, which has a relatively short processing time. This makes it possible to perform the exposure adjustment calculation using image data with an appropriate accumulation period while securing the subject tracking processing time.
- further, by verifying and correcting the result of the first subject tracking process with the second subject tracking process, the subject position can be corrected and the tracking range of the next subject tracking process can be set correctly.
- as a result, the accuracy of subject tracking can be improved compared with performing the subject tracking process only once per frame.
- in this embodiment, the second subject tracking process is started before the mirror-up operation, but it may be started after the mirror-up operation. As long as the second subject tracking process is completed before the subsequent first subject tracking process is performed, the timing at which the second subject tracking process starts can be set arbitrarily.
- the present invention is not limited to this embodiment; various modifications are possible.
- as the method for detecting the position of the subject, not only subject tracking by template matching but also tracking using color information or the result of face detection may be used.
- alternatively, moving object analysis using an optical flow or a scene detection technique based on edge detection may be used.
- the template used for template matching is not limited to the first image data stored initially; a new template may be created by cutting out the main subject area from image data newly captured during continuous shooting.
- any algorithm may be used for the process of tracking the same subject by analyzing the auxiliary images, as long as the first subject tracking process operates at high speed with relatively low accuracy and the second subject tracking process tracks slowly but with high accuracy.
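As a sketch of such a two-tier configuration, the following Python code implements a fast, half-resolution, windowed first pass and a slower exhaustive second pass. The function names and the SAD (sum of absolute differences) matching criterion are illustrative assumptions, used here only as one example of an algorithm satisfying the fast/accurate split; they are not specified by the embodiments.

```python
import numpy as np

def sad_match(image, template, rows, cols):
    """Return the (row, col) among the given search positions whose
    patch has the smallest sum of absolute differences (SAD)."""
    th, tw = template.shape
    best, best_pos = None, None
    for r in rows:
        for c in cols:
            patch = image[r:r + th, c:c + tw]
            if patch.shape != template.shape:
                continue  # skip positions where the patch runs off the edge
            sad = np.abs(patch.astype(int) - template.astype(int)).sum()
            if best is None or sad < best:
                best, best_pos = sad, (r, c)
    return best_pos

def first_tracking(image, template, prev_pos, radius=2):
    """Fast, lower-accuracy pass: search only a small window around
    the previous subject position, at half resolution."""
    img2, tpl2 = image[::2, ::2], template[::2, ::2]
    r0, c0 = prev_pos[0] // 2, prev_pos[1] // 2
    rows = range(max(0, r0 - radius), r0 + radius + 1)
    cols = range(max(0, c0 - radius), c0 + radius + 1)
    r, c = sad_match(img2, tpl2, rows, cols)
    return 2 * r, 2 * c  # back to full-resolution coordinates

def second_tracking(image, template):
    """Slow, higher-accuracy pass: exhaustive full-resolution search."""
    th, tw = template.shape
    return sad_match(image, template,
                     range(image.shape[0] - th + 1),
                     range(image.shape[1] - tw + 1))
```

The first pass trades accuracy for speed by shrinking both the search window and the resolution; the second pass searches the whole frame at full resolution and so takes longer but is less likely to lose the subject.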
- the auxiliary image data 2 generated in step S412 is used as the image data to be correlated with the template image data.
- any auxiliary image data out of a plurality of captured auxiliary image data can be applied to the present invention.
- the correlation may instead be taken between the template image data and one or more image data items captured after the second image.
- the second subject tracking process may be performed using auxiliary image data 12 obtained by combining auxiliary image data 1 and auxiliary image data 2 instead of auxiliary image data 2.
- the distance measuring point superimposed on the subject considered to be present at a position close to the imaging apparatus 200 is selected, but any distance measuring point may be selected according to the user's instruction. If the AE image processing unit 304 has a face detection function for detecting a human face from the image data read from the AE sensor 208, the face may be selected as the subject, and the image data of an area surrounding the detected face may be used as the template image data.
- FIG. 7 is a schematic diagram illustrating an example of a mechanical configuration of a mirrorless digital single-lens camera that is an imaging apparatus according to the second embodiment.
- a photographing lens unit 702 is mounted on the front surface of the camera body 701.
- the photographing lens unit 702 mounted on the camera body 701 can be replaced with another photographing lens unit.
- the camera body 701 and the taking lens unit 702 communicate with each other via a mount contact group (not shown).
- Inside the taking lens unit 702 are a lens group 706 and a diaphragm 707.
- the camera body 701 can adjust the amount of light taken into the camera by adjusting the aperture diameter of the diaphragm 707 through communication control via the mount contact group, and can adjust the focus position by adjusting the position of the lens group 706.
- a focal plane shutter 703 and an image sensor 704 are provided in the camera body 701.
- the image sensor 704 is exposed by opening the focal plane shutter 703.
- the image sensor 704 has pixels for image generation and pixels for phase difference detection.
- the pixel for image generation is a pixel for generating image data at the time of exposure
- the pixel for phase difference detection is a pixel for detecting the phase difference and performing focus adjustment.
- FIG. 8A shows a front view of a pixel of the image sensor 704, and FIG. 8B shows a cross-sectional view of the pixel.
- 801 is a microlens
- 802 and 803 are photodiodes.
- pupil division is performed on the left and right in the figure.
- by comparing the image formed by collecting the outputs of the left pixels with the image formed by collecting the outputs of the right pixels, the focus adjustment state of the photographing lens unit 702 can be detected by the phase difference detection method.
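The left/right comparison can be illustrated with a minimal Python sketch. The `phase_shift` helper and the SAD-minimization search are illustrative assumptions; a real AF unit performs this correlation in dedicated hardware over calibrated pixel lines.

```python
import numpy as np

def phase_shift(left, right, max_shift=4):
    """Estimate the lateral shift between the line images formed by
    the left- and right-pupil photodiodes.  Zero shift means in
    focus; the sign of the shift gives the defocus direction."""
    best, best_s = None, 0
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        # Compare the overlapping region of the two signals
        # with the left signal displaced by s samples.
        if s >= 0:
            a, b = left[s:], right[:n - s]
        else:
            a, b = left[:n + s], right[-s:]
        sad = np.abs(a.astype(int) - b.astype(int)).mean()
        if best is None or sad < best:
            best, best_s = sad, s
    return best_s
```

The shift minimizing the difference between the two pupil images corresponds to the defocus amount, which is then converted to a lens drive distance.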
- FIG. 9 is a schematic diagram showing a distance measuring point arrangement of the imaging apparatus (in the imaging element 704) according to the second embodiment. Focus adjustment is performed using the outputs of the phase difference detection pixels corresponding to the areas of the distance measuring points divided in a grid pattern.
- the display unit 705 in FIG. 7 displays shooting information and a shot image, and allows the user to check the contents.
- FIG. 10 is a schematic diagram illustrating an example of an electrical configuration of the imaging apparatus according to the second embodiment.
- the same members as those described above are denoted by the same reference numerals.
- the operation detection unit 1005 detects an operation performed by the user via a button, switch, dial, or the like attached to the camera body 701, and sends a signal according to the operation content to the system control unit 1003.
- the operation detection unit 1005 outputs the SW1 signal to the system control unit 1003 at the moment the release button is pressed halfway, and outputs the SW2 signal to the system control unit 1003 at the moment the release button is pressed all the way.
- the operation detection unit 1005 outputs a SW1 release signal to the system control unit 1003 at the moment the user releases the release button in the SW1 holding state, and outputs a SW2 release signal to the system control unit 1003 at the moment the user releases the release button in the SW2 holding state.
- the system control unit 1003 reads out the moving image generated by the image sensor 704 and outputs it to the display control unit 1008. This is so-called live view display, and the display control unit 1008 displays an image based on the received moving image on the display unit 705.
- when the system control unit 1003 receives the SW1 signal from the operation detection unit 1005, it reads the phase difference information from the phase difference detection pixels of the image sensor 704 and performs a focus adjustment calculation. After completing the calculation, the system control unit 1003 sends a lens drive signal based on the result to the lens driving unit 1010.
- the lens driving unit 1010 moves the lens group 706 based on the lens driving signal received from the system control unit 1003 and performs a focusing operation.
- the image sensor 704 converts the light incident through the photographing lens unit 702 into an electrical signal, generates image data, and outputs the image data to the system control unit 1003.
- the system control unit 1003 outputs the image data output from the image generation pixels of the image sensor 704 to the display control unit 1008 and writes (stores) the image data in the image storage device 1007 during the main imaging.
- the display control unit 1008 displays an image based on the image data on the display unit 705 based on the output of the image data from the system control unit 1003.
- imaging that is performed in response to the SW2 signal to obtain image data to be recorded in the image storage device 1007 for storage purposes is referred to as “main imaging”.
- An image for live view display that is not recorded is referred to as an “auxiliary image”.
- the main memory 1004 is a storage device for storing data necessary for calculations performed by the system control unit 1003.
- the aperture control unit 1009 drives the aperture 707 based on the aperture control signal received from the system control unit 1003.
- the shutter control unit 1006 drives the focal plane shutter 703 based on the shutter control signal sent from the system control unit 1003.
- the system control unit 1003 performs subject tracking processing in the continuous shooting mode, and detects the main subject position from the image data read from the image sensor 704.
- the subject tracking process described above has two types of algorithms: a first subject tracking process and a second subject tracking process. These two types of subject tracking processing are performed using the same algorithm as in the first embodiment.
- the system control unit 1003 selects one distance measurement point from the distance measurement point group shown in FIG. 9 based on the result of the first subject tracking process. Then, a lens driving signal is sent to the lens driving unit 1010 based on the focus adjustment calculation result of the selected distance measuring point.
- FIG. 11 is a flowchart illustrating an example of a processing procedure in the continuous shooting mode of the imaging apparatus according to the second embodiment.
- the imaging apparatus according to the present embodiment captures a plurality of frames of still images in the continuous shooting mode.
- Steps S1101 to S1105 are processes performed in the SW1 holding state, and correspond to a continuous shooting preparation operation.
- Steps S1106 to S1115 are processes that are performed when the release button is pressed and the SW2 is held after completion of the above-described continuous shooting preparation operation.
- step S1101 the system control unit 1003 performs a focus adjustment calculation based on the output of the phase difference detection pixel of the image sensor 704. For example, the system control unit 1003 selects a distance measuring point that is superimposed on a subject that is considered to be present at a position close to the imaging apparatus 700.
- step S1102 the system control unit 1003 reads the image data from the image generation pixels of the image sensor 704, and extracts a certain region centered on the distance measuring point selected in step S1101. Then, the system control unit 1003 records the image data of the extracted area and the center coordinates of the extracted area in the image data in the main memory 1004. These are used for a tracking process to be described later.
- the image data of the extracted area recorded in the main memory 1004 will be referred to as “template image data”, and the center coordinates of the extracted area will be referred to as “previous subject position”.
- the template image data relates to the main subject, and the system control unit 1003, which extracts the template image, constitutes a means for determining the main subject before continuous shooting starts.
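A minimal sketch of this template extraction step follows, assuming array-based image data and an 8 x 8 crop size (both the `extract_template` helper and the crop size are illustrative assumptions, not values specified by the embodiment):

```python
import numpy as np

def extract_template(image, center, size=8):
    """Cut out a size x size region centred on the selected distance
    measuring point.  The crop becomes the template image data and
    its centre coordinates become the 'previous subject position'."""
    r, c = center
    half = size // 2
    # Clamp the crop so it stays inside the image bounds.
    r0 = max(0, min(r - half, image.shape[0] - size))
    c0 = max(0, min(c - half, image.shape[1] - size))
    template = image[r0:r0 + size, c0:c0 + size].copy()
    return template, (r0 + half, c0 + half)
```

The returned pair corresponds to what the system control unit records in the main memory: the template image data and the centre coordinates used as the starting point of the tracking process.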
- step S1103 the system control unit 1003 performs exposure calculation using the image data read in step S1102.
- step S1104 the system control unit 1003 sends a control signal to the aperture control unit 1009 and the lens driving unit 1010 based on the focus adjustment calculation result in step S1101 and the exposure calculation result in step S1103. Accordingly, the aperture control unit 1009 adjusts the aperture (exposure) based on the control signal, and the lens driving unit 1010 adjusts the focus based on the control signal.
- step S1105 the system control unit 1003 waits until the user releases the release button (SW1 release) or presses the release button (SW2 input). The system control unit 1003 proceeds to step S1106 when the SW1 release signal or the SW2 signal is input from the operation detection unit 1005 (when the SW1 holding state is released).
- step S1106 the system control unit 1003 determines whether the signal that canceled the standby state in step S1105 is the SW2 signal. If the input signal is not the SW2 signal (i.e., it is the SW1 release signal), the continuous shooting process in the flowchart of FIG. 11 is terminated; if the input signal is the SW2 signal, the process proceeds to step S1107, where live view display is stopped and continuous shooting is started.
- step S1107 the system control unit 1003 stops live view display, and then sends a control signal to the shutter control unit 1006 to perform release and exposes the image sensor 704 to perform main imaging.
- the system control unit 1003 reads the image data generated by the image sensor 704 and records (stores) it in the image storage device 1007.
- step S1108 the system control unit 1003 determines whether or not the SW2 holding state has been released. As a result of this determination, if the SW2 release signal has already been received and the SW2 holding state has been released, the continuous shooting process in the flowchart of FIG. 11 is terminated, while the SW2 release signal has not been received and the SW2 holding state has been If not canceled, the process proceeds to step S1109.
- step S1109 first, the image sensor 704 accumulates charges only for a period (first accumulation period) shorter than the accumulation period required for performing proper exposure adjustment calculation.
- the system control unit 1003 reads the first auxiliary image data (auxiliary image data 1) from the image generation pixels of the image sensor 704, and also receives the template image data from the main memory 1004. read out.
- the system control unit 1003 detects the position of the main subject in the auxiliary image data 1 by obtaining a correlation between the two image data.
- the tracking algorithm is the same as that of the first embodiment.
- the image sensor 704 constitutes an auxiliary image imaging unit for capturing a plurality of auxiliary image data between main imaging operations in the continuous shooting mode.
- FIG. 12 is a diagram for explaining the tracking range in the second embodiment.
- the matching range is limited to a range of three distance measuring points to the left and right of the distance measuring point corresponding to the immediately preceding subject position, and one distance measuring point above and below it (the tracking range of the first subject tracking process).
- step S1110 the image sensor 704 accumulates charges only for the second accumulation period, which is obtained by subtracting the first accumulation period from the accumulation period required for a proper exposure adjustment calculation.
- the system control unit 1003 reads the second auxiliary image data (auxiliary image data 2) from the image sensor 704 and generates new composite auxiliary image data by adding it to the auxiliary image data 1 read from the image sensor 704 in step S1109.
- the system control unit 1003 performs an exposure adjustment calculation using the composite auxiliary image data.
- step S1111 the system control unit 1003 performs focus adjustment calculation using the output from the phase difference detection pixel corresponding to the subject position detected in step S1109.
- step S1112 the system control unit 1003 sends a control signal to the aperture control unit 1009 and the lens driving unit 1010 based on the calculation results of step S1110 and step S1111 to perform exposure adjustment and focus adjustment.
- the system control unit 1003 performs live view display and also performs a second subject tracking process in steps S1113 to S1115.
- step S1113 the system control unit 1003 resumes live view display.
- live view display uses the composite auxiliary image data only for the first frame; for the second and subsequent frames, it uses image data obtained by a single accumulation as usual. This live view continues until main imaging is performed again.
- step S1114 the system control unit 1003 sets the entire composite auxiliary image data obtained in step S1110 as a tracking range, and starts detection of the main subject position by the second subject tracking process. Specifically, the system control unit 1003 detects the main subject position by correlating the composite auxiliary image data generated in step S1110 with the template image data recorded in the main memory 1004. This second subject tracking process uses the same algorithm as in the first embodiment.
- step S1115 the system control unit 1003 determines whether the exposure adjustment and focus adjustment in step S1112 are completed. If the result of this determination is that exposure adjustment and focus adjustment in step S1112 have not been completed, the process waits. If exposure adjustment and focus adjustment have been completed, the live view display is stopped and control returns to step S1107. If it is determined in step S1115 that the exposure adjustment and the focus adjustment have been completed, the process returns to step S1107 even if the second subject tracking process is not completed. This second subject tracking process only needs to be completed before the subsequent first subject tracking process is started.
- since the first subject tracking process in step S1109 is performed between one main imaging and the next during continuous shooting, the faster it completes, the higher the continuous shooting speed.
- since the second subject tracking process in step S1114 is performed independently of the main imaging, a highly accurate tracking process can be performed over a longer time. If the second subject tracking process is completed during the main imaging process, its result is used to verify and correct the result of the first subject tracking process, and the immediately preceding subject position information recorded in the main memory 1004 is updated.
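This verify-and-correct step can be sketched as follows. The `update_subject_position` helper and the tolerance threshold are hypothetical; the embodiment does not specify a concrete correction policy, only that the stored position is updated when the accurate result contradicts the fast one.

```python
def update_subject_position(first_pos, second_pos, tolerance=2):
    """Compare the fast first-pass result with the slower, more
    accurate second-pass result; if they disagree by more than the
    tolerance, correct the stored 'previous subject position'."""
    dr = abs(first_pos[0] - second_pos[0])
    dc = abs(first_pos[1] - second_pos[1])
    if dr > tolerance or dc > tolerance:
        return second_pos  # correction: trust the accurate pass
    return first_pos       # agreement: keep the fast result
```

The returned position then seeds the tracking range of the next first subject tracking process.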
- the processing time for the subject tracking process can be secured. The first auxiliary image data and the auxiliary image data obtained thereafter are then combined, and the combined auxiliary image data is used for an exposure adjustment calculation, which has a relatively short processing time. This makes it possible to perform the exposure adjustment calculation using image data with an adequate accumulation period while still securing the subject tracking processing time.
- the present invention can also be realized by executing the following processing: software (a program) that realizes the functions of the above-described embodiments is supplied to a system or apparatus via a network or various storage media, and a computer (or CPU, MPU, or the like) of the system or apparatus reads and executes the program.
- This program and a computer-readable non-volatile recording medium storing the program are included in the present invention.
Description
(First Embodiment)
First, an imaging apparatus according to the first embodiment of the present invention will be described with reference to FIGS. 1 to 6.
FIG. 3 is a schematic diagram illustrating an example of the electrical configuration of the imaging apparatus 200 according to the first embodiment. The same members as those in FIG. 1 are denoted by the same reference numerals.
In the first subject tracking process, as shown in FIG. 5, the matching area is limited to a range of one distance measuring point above and below and two distance measuring points to the left and right of the immediately preceding subject position read from the main memory 307 (the tracking range of the first subject tracking process). That is, the first subject tracking process detects the position of the main subject within a partial range of the auxiliary image data. Furthermore, matching is performed with the resolution of the auxiliary image data and the template image data halved. This allows matching to be processed at high speed and secures time for the subsequent lens driving and the like.
Next, an imaging apparatus according to the second embodiment will be described with reference to FIGS. 7 to 12.
FIG. 7 is a schematic diagram illustrating an example of the mechanical configuration of a mirrorless digital single-lens camera, which is the imaging apparatus according to the second embodiment.
FIG. 8A is a front view of a pixel of the image sensor 704, and FIG. 8B is a cross-sectional view of the pixel. In FIG. 8, 801 is a microlens, and 802 and 803 are photodiodes. By reading out image data using two photodiodes per microlens, pupil division is performed on the left and right in the figure. By comparing the image formed by collecting the outputs of the left pixels with the image formed by collecting the outputs of the right pixels, the focus adjustment state of the photographing lens unit 702 can be detected by the phase difference detection method, as with the focus detection line sensor of the AF unit.
Steps S1101 to S1105 are processes performed in the SW1 holding state and correspond to the continuous shooting preparation operation. Steps S1106 to S1115 are processes performed when, after completion of the continuous shooting preparation operation, the release button is pressed all the way and the SW2 holding state is entered.
Subsequently, in step S1103, the system control unit 1003 performs an exposure calculation using the image data read in step S1102.
Proceeding to step S1107, the system control unit 1003 stops the live view display, then sends a control signal to the shutter control unit 1006 to perform release, exposing the image sensor 704 to perform main imaging. The system control unit 1003 then reads the image data generated by the image sensor 704 and records (stores) it in the image storage device 1007.
The present invention can also be realized by executing the following processing.
That is, software (a program) that realizes the functions of the above-described embodiments is supplied to a system or apparatus via a network or various storage media, and a computer (or CPU, MPU, or the like) of the system or apparatus reads and executes the program. This program and a computer-readable non-volatile recording medium storing the program are included in the present invention.
202 Photographing lens unit
303 System control unit
304 AE image processing unit
307 Main memory
308 Operation detection unit
309 Mirror control unit
310 Shutter control unit
311 Image storage device
312 Display control unit
313 Aperture control unit
314 Lens drive unit
Claims (12)
- An imaging apparatus that captures a plurality of main images in a continuous shooting mode, comprising:
imaging means for capturing a plurality of auxiliary images between the capture of one main image and the capture of the next main image;
main subject determination means for determining a main subject;
first subject tracking processing means for detecting a region in which the same subject as the main subject exists from a first region that is a part of a first auxiliary image among the plurality of auxiliary images; and
second subject tracking processing means for detecting a region in which the same subject as the main subject exists from a second region, wider than the first region, of a second auxiliary image among the plurality of auxiliary images,
wherein a detection result of the first subject tracking processing means is used for focus adjustment performed before the capture of the next main image, and a detection result of the second subject tracking processing means is used for detecting the region in which the same subject as the main subject exists after the capture of the next main image. - The imaging apparatus according to claim 1, wherein the detection of the region in which the same subject as the main subject exists performed after the capture of the next main image is performed by the first subject tracking processing means.
- The imaging apparatus according to claim 2, wherein the first subject tracking processing means sets the first region based on a detection result of the second subject tracking processing means.
- [Correction under Rule 91, 10.07.2012]
The imaging apparatus according to any one of claims 1 to 3, further comprising: combining means for combining the plurality of auxiliary images; and exposure adjustment means for performing an exposure adjustment calculation of the imaging apparatus using the auxiliary image combined by the combining means. - [Correction under Rule 91, 10.07.2012]
The imaging apparatus according to any one of claims 1 to 4, further comprising focus adjustment means for performing focus adjustment on a region highly correlated with the main subject detected by the first subject tracking processing means. - The imaging apparatus according to any one of claims 1 to 5, wherein the auxiliary image used by the second subject tracking processing means has a higher resolution than the auxiliary image used by the first subject tracking processing means.
- The imaging apparatus according to any one of claims 1 to 6, wherein the imaging means for capturing the plurality of auxiliary images has a second image sensor for generating the plurality of auxiliary images, separate from a first image sensor for generating the main images.
- [Correction under Rule 91, 10.07.2012]
The imaging apparatus according to any one of claims 1 to 7, further comprising a mirror for guiding a light flux that has passed through a lens group to the second image sensor,
wherein the first subject tracking processing means detects the position of the main subject during the period after the mirror retracts from the optical path used when capturing a main image and before the mirror re-enters that optical path prior to the capture of the next main image. - [Correction under Rule 91, 10.07.2012]
The imaging apparatus according to any one of claims 1 to 8, wherein the capture of the next main image is performed while the second subject tracking processing means continues detecting the region highly correlated with the main subject. - A control method of an imaging apparatus that captures a plurality of main images in a continuous shooting mode, comprising:
an imaging step of capturing a plurality of auxiliary images between the capture of one main image and the capture of the next main image;
a main subject determination step of determining a main subject;
a first subject tracking processing step of detecting a region in which the same subject as the main subject exists from a first region that is a part of a first auxiliary image among the plurality of auxiliary images; and
a second subject tracking processing step of detecting a region in which the same subject as the main subject exists from a second region, wider than the first region, of a second auxiliary image among the plurality of auxiliary images,
wherein a detection result of the first subject tracking processing step is used for focus adjustment performed before the capture of the next main image, and a detection result of the second subject tracking processing step is used for detecting the region in which the same subject as the main subject exists after the capture of the next main image. - A program for causing a computer to execute a control method of an imaging apparatus that captures a plurality of main images in a continuous shooting mode, the program causing the computer to execute:
an imaging step of capturing a plurality of auxiliary images between the capture of one main image and the capture of the next main image;
a main subject determination step of determining a main subject;
a first subject tracking processing step of detecting a region in which the same subject as the main subject exists from a first region that is a part of a first auxiliary image among the plurality of auxiliary images; and
a second subject tracking processing step of detecting a region in which the same subject as the main subject exists from a second region, wider than the first region, of a second auxiliary image among the plurality of auxiliary images,
wherein a detection result of the first subject tracking processing step is used for focus adjustment performed before the capture of the next main image, and a detection result of the second subject tracking processing step is used for detecting the region in which the same subject as the main subject exists after the capture of the next main image. - A non-volatile storage medium storing a program for causing a computer to execute a control method of an imaging apparatus that captures a plurality of main images in a continuous shooting mode,
the program causing the computer to execute:
an imaging step of capturing a plurality of auxiliary images between the capture of one main image and the capture of the next main image;
a main subject determination step of determining a main subject;
a first subject tracking processing step of detecting a region in which the same subject as the main subject exists from a first region that is a part of a first auxiliary image among the plurality of auxiliary images; and
a second subject tracking processing step of detecting a region in which the same subject as the main subject exists from a second region, wider than the first region, of a second auxiliary image among the plurality of auxiliary images,
wherein a detection result of the first subject tracking processing step is used for focus adjustment performed before the capture of the next main image, and a detection result of the second subject tracking processing step is used for detecting the region in which the same subject as the main subject exists after the capture of the next main image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013504728A JP5925186B2 (ja) | 2011-03-17 | 2012-03-13 | Imaging apparatus and control method thereof |
RU2013146332/08A RU2549143C1 (ru) | 2011-03-17 | 2012-03-13 | Устройство регистрации изображения и способ управления им |
US13/568,479 US8743209B2 (en) | 2011-03-17 | 2012-08-07 | Image pickup apparatus and method for controlling the same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-059425 | 2011-03-17 | ||
JP2011059425 | 2011-03-17 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/568,479 Continuation US8743209B2 (en) | 2011-03-17 | 2012-08-07 | Image pickup apparatus and method for controlling the same |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012124669A1 true WO2012124669A1 (ja) | 2012-09-20 |
Family
ID=46830737
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/056348 WO2012124669A1 (ja) | 2011-03-17 | 2012-03-13 | 撮像装置及びその制御方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8743209B2 (ja) |
JP (1) | JP5925186B2 (ja) |
RU (1) | RU2549143C1 (ja) |
WO (1) | WO2012124669A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016197150A (ja) * | 2015-04-02 | 2016-11-24 | Canon Inc. | Imaging apparatus, control method thereof, program, and storage medium |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP6172934B2 (ja) * | 2012-12-27 | 2017-08-02 | Canon Inc. | Imaging apparatus, control method thereof, program, and storage medium |
- JP5882925B2 (ja) | 2013-02-18 | 2016-03-09 | Canon Inc. | Imaging apparatus and control method thereof |
- JP6028928B2 (ja) * | 2013-04-16 | 2016-11-24 | Olympus Corp. | Imaging apparatus and imaging method |
- JP6378500B2 (ja) * | 2014-03-03 | 2018-08-22 | Canon Inc. | Imaging apparatus and control method thereof |
- EP3674973A1 (en) * | 2018-12-28 | 2020-07-01 | Samsung Electronics Co., Ltd. | Method and apparatus with liveness detection and object recognition |
- JP2021179527A (ja) * | 2020-05-13 | 2021-11-18 | Canon Inc. | Shooting control apparatus, imaging apparatus, shooting control method, and program |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2009139795A (ja) * | 2007-12-10 | 2009-06-25 | Sony Corp | Imaging apparatus |
- JP2009175821A (ja) * | 2008-01-22 | 2009-08-06 | Fujifilm Corp | Method for detecting specific image and photographing apparatus |
- JP2010072283A (ja) * | 2008-09-18 | 2010-04-02 | Nikon Corp | Imaging apparatus |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP4047943B2 (ja) * | 1995-01-31 | 2008-02-13 | Fujifilm Corp | Digital image data recording apparatus and method, digital image data reproducing apparatus and method, solid-state electronic imaging apparatus, and signal charge readout method thereof |
US7088865B2 (en) * | 1998-11-20 | 2006-08-08 | Nikon Corporation | Image processing apparatus having image selection function, and recording medium having image selection function program |
CA2451495C (en) * | 2001-06-15 | 2010-07-27 | Mitsubishi Heavy Industries, Ltd. | Thermal barrier coating material, method of production thereof, and gas turbine member and gas turbine applying said thermal barrier coating material |
RU2370817C2 (ru) * | 2004-07-29 | 2009-10-20 | Самсунг Электроникс Ко., Лтд. | Система и способ отслеживания объекта |
RU2006118146A (ru) * | 2006-05-26 | 2007-12-20 | Самсунг Электроникс Ко., Лтд. (KR) | Активная система и способ контроля качества изображения объектов |
- JP2009296029A (ja) * | 2008-06-02 | 2009-12-17 | Panasonic Corp | Imaging apparatus |
US9374533B2 (en) * | 2011-02-15 | 2016-06-21 | Canon Kabushiki Kaisha | Imaging apparatus and control method for tracking a subject based on an image signal |
- JP5825851B2 (ja) * | 2011-05-27 | 2015-12-02 | Canon Inc. | Imaging apparatus and control method thereof |
-
2012
- 2012-03-13 WO PCT/JP2012/056348 patent/WO2012124669A1/ja active Application Filing
- 2012-03-13 JP JP2013504728A patent/JP5925186B2/ja not_active Expired - Fee Related
- 2012-03-13 RU RU2013146332/08A patent/RU2549143C1/ru not_active IP Right Cessation
- 2012-08-07 US US13/568,479 patent/US8743209B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
RU2013146332A (ru) | 2015-04-27 |
JPWO2012124669A1 (ja) | 2014-07-24 |
JP5925186B2 (ja) | 2016-05-25 |
US8743209B2 (en) | 2014-06-03 |
RU2549143C1 (ru) | 2015-04-20 |
US20120300083A1 (en) | 2012-11-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- JP5388544B2 (ja) | Imaging apparatus and focus control method thereof | |
- JP5322783B2 (ja) | Imaging apparatus and control method of the imaging apparatus | |
- JP5925186B2 (ja) | Imaging apparatus and control method thereof | |
- JP6405243B2 (ja) | Focus detection apparatus and control method thereof | |
- JP6457776B2 (ja) | Imaging apparatus and control method of imaging apparatus | |
- US8542941B2 (en) | Imaging device, image detecting method and focus adjusting method | |
- JP5753371B2 (ja) | Imaging apparatus and control method thereof | |
- US8571402B2 (en) | Image tracking device, imaging device, image tracking method, and imaging method | |
- JP5843486B2 (ja) | Imaging apparatus and control method thereof | |
- JP5932210B2 (ja) | Imaging apparatus and focus adjustment method | |
- JP2018031877A (ja) | Imaging apparatus and focus adjustment method | |
- JP5056168B2 (ja) | Focus adjustment apparatus and imaging apparatus | |
- JP2005215373A (ja) | Imaging apparatus | |
- JP2014202875A (ja) | Subject tracking apparatus | |
- JP2013254166A (ja) | Imaging apparatus and control method thereof | |
- JP2009081636A (ja) | Image recording apparatus and photographing method | |
- JP2013113857A (ja) | Imaging apparatus and control method thereof | |
- JP6561437B2 (ja) | Focus adjustment apparatus and imaging apparatus | |
- JP2009010672A (ja) | Focus detection apparatus and imaging apparatus | |
- JP2011242652A (ja) | Focus detection apparatus and control method thereof | |
- JP2012128343A (ja) | Camera | |
- JP2010072559A (ja) | Imaging apparatus and object region extraction method | |
- JP5447579B2 (ja) | Tracking apparatus, focus adjustment apparatus, and photographing apparatus | |
- JP7066458B2 (ja) | Imaging apparatus, control method thereof, and program | |
- JP2021015162A (ja) | Image processing apparatus and control method thereof | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12757361 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013504728 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2013146332 Country of ref document: RU Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12757361 Country of ref document: EP Kind code of ref document: A1 |