EP2046018A1 - Imaging apparatus - Google Patents
- Publication number
- EP2046018A1 (application EP07791037A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- section
- microprocessor
- shutter speed
- blur
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B5/00—Adjustment of optical system relative to image or object surface other than for focusing
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/08—Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
- G03B7/091—Digital circuits
- G03B7/097—Digital circuits for control of both exposure time and aperture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/685—Vibration or motion blur correction performed by mechanical compensation
- H04N23/687—Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2101/00—Still video cameras
Definitions
- the present invention relates to an image capture device that can change the shutter speeds of a mechanical shutter and/or an electronic shutter, and more particularly relates to an image capture device with an image stabilizing function.
- the image can be stabilized either by optically compensating for the blur of the subject's image or by increasing the shutter speed with the sensitivity of the imager increased.
- An image capture device with such a function is disclosed in Patent Document No. 1.
- the device is designed to estimate the motion vector of an image being shot and determine the shutter speed based on that motion vector. In this manner, the image blur caused by a motion of the subject can be reduced and the SNR of the image information can be increased while an image of a still subject is being shot.
- Patent Document No. 1 Japanese Patent Application Laid-Open Publication No. 8-327917
- The magnitudes of the motion vectors estimated could be different even if the magnitude of shake of the image capture device was the same. It was discovered that if the exposure were adjusted in such a situation based only on the magnitude of the motion vector, then an inappropriate exposure value could be selected for the magnitude of actual camera shake. That is to say, even if the image shot were not actually blurred so much, the exposure value could be unnecessarily small.
- the present invention has an object of providing an image capture device that can shoot an image at a more appropriate exposure value with the image blur compensated for properly.
- an image capture device is designed to adjust an exposure value for shooting an image.
- the device includes: a motion detecting section for detecting a motion of an image shot; a blur compensating section for optically compensating for a blur of the image shot; a mode selecting section for selecting one of multiple control modes for the blur compensating section; and a setting determining section for determining a setting related to an exposure value for shooting an image based on the control mode selected by the mode selecting section and the motion of the image shot that has been detected.
- the exposure value can be adjusted according to the control mode to perform the image stabilizing function.
- the problem with the conventional device that may have an inappropriate exposure value according to the image stabilizing function control mode can be overcome.
- the image capture device of the present invention may be able to control an exposure value for shooting an image by adjusting a shutter speed of a mechanical shutter and/or an electronic shutter.
- the setting determining section determines the shutter speed based on the control mode selected and the motion of the image shot that has been detected.
- the shutter speed can be adjusted according to the control mode to perform the image stabilizing function.
- the problem with the conventional device that may have an inappropriate exposure value according to the image stabilizing function control mode can be overcome.
- the control modes of the blur compensating section may include a first control mode and a second control mode.
- In the first control mode, the blur compensating section continues to compensate for the blur of the image shot through the period from shooting one still picture to shooting the next still picture.
- In the second control mode, there is a session during that period in which the blur compensating section either suspends or attenuates the operation of compensating for the blur of the image shot.
- If the control mode selected is the first control mode, the setting determining section may select a first setting as the exposure value; if the control mode selected is the second control mode, the setting determining section may select a second setting. In that case, the exposure value used to shoot an image is greater when the second setting is selected than when the first setting is selected.
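The mode-dependent rule above can be sketched as follows. This is a minimal illustrative sketch: the function name and the numeric exposure values are assumptions, since the patent only fixes the ordering (second setting greater than first setting).

```python
MODE_1 = "MODE 1"  # stabilizer runs continuously between still pictures
MODE_2 = "MODE 2"  # stabilizer suspended or attenuated between pictures

def determine_exposure_setting(control_mode):
    """Return the exposure value for shooting; MODE 2 gets the greater one.

    The values 1.0 and 2.0 are illustrative placeholders for the first
    and second settings described in the text.
    """
    first_setting, second_setting = 1.0, 2.0
    return second_setting if control_mode == MODE_2 else first_setting
```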
- the present invention provides an image capture device that can shoot an image at a more appropriate exposure value with the image blur compensated for properly.
- FIG. 1 is a block diagram illustrating a configuration for a digital camera according to the present invention.
- the camera includes a lens barrel 14, an imager 2, an image processing section 3, and a microprocessor 8 for controlling the digital camera.
- a CCD image sensor, a CMOS image sensor or an NMOS image sensor may be used.
- the image processing section 3 and the microprocessor 8 may be implemented as pieces of hardware with or without a software program installed for a microcomputer.
- the lens barrel 14 includes an image stabilizer lens 11, a focus lens 12 and an iris 13.
- the image stabilizer lens 11 can move within a plane that intersects with the optical axis of the lens barrel 14 at right angles. Thus, by moving the image stabilizer lens 11 within that plane according to the magnitude of the camera shake, the blur of the subject's image that has been produced on the imager 2 can be compensated for.
- the imager 2 converts the image, which has entered the camera through the lens barrel, into an electrical signal (i.e., analog data).
- an A/D converter 105 converts the electrical signal that has been generated by the imager 2 into digital image data.
- the image processing section 3 includes a preprocessing section 31, a YC converting section 32, a compressing section 33, and a motion detecting section 34.
- the preprocessing section 31 performs various types of processing such as gain correction, gamma correction, white balance correction and flaw correction on the input digital image data that has come from the A/D converter 105.
- the YC converting section 32 separates the preprocessed image data into a color difference signal and a luminance signal.
- the compressing section 33 subjects the YC converted image data to compression processing.
- the compressing section 33 may also have the function of expanding the compressed image data that has been read out from a memory card 7.
- a buffer memory 4 is used as a temporary work area to get these types of processing done.
- the compressed image data is written on the memory card 7 by way of a memory card I/F 6.
- the images that are stored in the buffer memory 4 and the memory card 7 can be reproduced on an LCD monitor 5.
- the microprocessor 8 controls the overall system of this digital camera including the image processing section 3, a shutter control section 101, an iris driving section 102 and a focus driving section 103. Also, based on the magnitude of the camera shake that has been detected by a gyro sensor 10, the microprocessor 8 controls a compensation lens driving section 104 so as to minimize the blur of the subject's image that has been produced on the imager 2.
- FIG. 2 is a block diagram illustrating a configuration for the motion detecting section 34, which detects the motion of an image based on the image data that has been generated by the imager 2.
- the motion detecting section 34 includes a representative point storage section 341, a correlation calculating section 342 and a motion vector estimating section 343.
- the representative point storage section 341 divides the image signal representing a current frame, which has been supplied from the preprocessing section 31, into a plurality of areas and stores an image signal, associated with a particular representative point included in each of those areas, as a representative point signal. Also, the representative point storage section 341 reads the representative point of the previous frame that has already been stored and passes it to the correlation calculating section 342.
- the correlation calculating section 342 gets the representative point signal of the previous frame from the representative point storage section 341 and also gets the image data of the current frame from the preprocessing section 31, and then calculates the degree of correlation between the representative point signal of the previous frame and the image data of the current frame. This correlation calculation can be done by comparing the difference between the representative point signal of the previous frame and the image signal of the current frame. Thereafter, the output of the correlation calculating section 342 is given to the motion vector estimating section 343.
- the motion vector estimating section 343 estimates the motion vector between the previous and current frames of the image. If an image element that appeared in the previous frame has moved to a different location in the current frame, the motion vector represents the magnitude and direction of that motion.
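The representative-point method described above can be sketched as follows. The grid spacing, search range, and cost function (summed absolute difference) are assumptions for illustration; the patent does not specify these details.

```python
def estimate_motion_vector(prev, curr, search=4, grid=8):
    """Hypothetical representative-point matching sketch.

    `prev` and `curr` are 2-D lists of pixel values. One pixel per
    `grid` x `grid` block serves as a representative point; the shift
    (dy, dx) that minimizes the summed absolute difference between the
    previous frame's representative points and the current frame is
    returned as the motion vector.
    """
    h, w = len(prev), len(prev[0])
    points = [(y, x)
              for y in range(search, h - search, grid)
              for x in range(search, w - search, grid)]
    best_cost, best_vec = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cost = sum(abs(prev[y][x] - curr[y + dy][x + dx])
                       for y, x in points)
            if best_cost is None or cost < best_cost:
                best_cost, best_vec = cost, (dy, dx)
    return best_vec
```

For a frame whose content has moved down 2 pixels and right 3 pixels, the function returns (2, 3).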
- the digital camera of this preferred embodiment has at least two modes for controlling the image stabilization operation. That is to say, the microprocessor 8 has at least two modes for controlling the compensation lens driving section 104.
- those control modes will be described with reference to FIGS. 3 and 4 , which schematically show the modes for compensating for the image blur of the digital camera.
- FIG. 3 shows the relationship between the exposure state of the imager 2 and the lens position instructed value, which is given by the microprocessor 8 to the compensation lens driving section 104, in a situation where the microprocessor 8 is controlling the compensation lens driving section 104 in a control mode called "MODE 1". More specifically, portion (a) of FIG. 3 shows how the lens position instructed value output by the microprocessor 8 changes with time. Portion (b) of FIG. 3 shows how the exposure state of the imager 2 changes. And portion (c) of FIG. 3 shows the times of occurrence of respective events. In this example, the respective events and the times are supposed to have the following correspondence.
- the shutter release button is supposed to be pressed halfway at a time t11 and pressed fully at a time t13, and then the imager 2 is supposed to be subjected to an exposure operation between the times t13 and t15.
- the microprocessor 8 has the compensation lens driving section 104 continue to perform the operation of compensating for the blur of the image shot.
- this control mode will be referred to herein as "MODE 1".
- the shutter release button forms part of the operating section 9.
- By making the microprocessor 8 control the compensation lens driving section 104 in MODE 1, the image stabilization can be done even while a still picture is not being shot. For example, the image stabilization control can also be performed on a through-the-lens image used to determine the composition of a still picture. Also, in MODE 1, the microprocessor 8 can drive the compensation lens driving section 104 irrespective of the exposure state of the imager 2, and therefore, the image stabilization operation can be controlled relatively easily.
- FIG. 4 shows the relationship between the exposure state of the imager 2 and the lens position instructed value, which is given by the microprocessor 8 to the compensation lens driving section 104, in a situation where the microprocessor 8 is controlling the compensation lens driving section 104 in a control mode called "MODE 2". More specifically, portion (a) of FIG. 4 shows how the lens position instructed value output by the microprocessor 8 changes with time. Portion (b) of FIG. 4 shows how the exposure state of the imager 2 changes. And portion (c) of FIG. 4 shows the times of occurrence of respective events. In this example, the respective events and the times are supposed to have the following correspondence.
- the shutter release button is supposed to be pressed halfway at a time t21 and pressed fully at a time t23, and then the imager 2 is supposed to be subjected to an exposure operation between the times t24 and t25.
- the microprocessor 8 has the compensation lens driving section 104 suspend the operation of compensating for the blur of the image shot.
- this control mode will be referred to herein as "MODE 2".
- In MODE 2, the image stabilizer lens 11 is driven only when necessary to shoot a still picture. That is to say, since the image stabilizer lens 11 is not driven when it is unnecessary, the power that would otherwise be dissipated by the compensation lens driving section 104 can be saved. As shown in FIG. 4, the image stabilization function is kept ON even during the interval between the times t23 and t24. This is done so that the exposure operation is performed after the image stabilization function has already been turned ON, which makes the image stabilization operation stable during the exposure period. Likewise, the image stabilization function is not turned OFF right after the exposure period is over.
- the image stabilization function is turned ON not only during the exposure period (i.e., from the time t24 through the time t25 ) but also during the pre-exposure period (i.e., from the time t23 through the time t24 ) and the post-exposure period (i.e., from the time t25 on).
- the lens position instructed value is supposed to be constant.
- Alternatively, the lens position instructed value outside the necessary control period for shooting a still picture (i.e., from the time t23 through the time t25) may be made smaller than the one during that necessary control period, and such a control mode may be used as a new control mode.
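The attenuated variant just described can be sketched as a gain applied to the lens position instructed value. The function name, time parameters, and the 0.3 attenuation factor are assumptions for illustration only.

```python
def instructed_value_gain(t, control_start, control_end, attenuation=0.3):
    """Gain on the lens position instructed value over time.

    Full drive inside the necessary control period for the still picture
    (e.g., from t23 through t25); a reduced, non-zero drive outside it.
    """
    return 1.0 if control_start <= t <= control_end else attenuation
```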
- the present invention is applicable to any situation as long as the microprocessor 8 has multiple modes for controlling the compensation lens driving section 104.
- the digital camera of this preferred embodiment is an example of image capture device according to the present invention.
- Another image capture device according to the present invention could be a cellphone with a camera function, for example.
- the motion detecting section 34 is an exemplary means for estimating a motion vector.
- the motion detecting section 34 may be implemented either as a DSP circuit dedicated for estimating a motion vector or by making a general-purpose computer execute a software program for motion detection.
- the gyro sensor 10, the microprocessor 8, the compensation lens driving section 104 and the image stabilizer lens 11 together form an exemplary blur compensating means.
- Another blur compensating means may be provided by replacing the gyro sensor 10 with an angular velocity sensor, for example.
- Instead of the image stabilizer lens 11, the imager 2 may be driven, or even the lens barrel 14 may be driven in its entirety. In short, any other technique may be adopted as long as the blur of the subject's image can be compensated for optically.
- the operating section 9 is an exemplary mode selecting means.
- the operating section 9 may be implemented as a piece of hardware such as a button or a dial.
- the operating section 9 may also be implemented by presenting characters or images on a screen with a touchscreen panel by software processing and allowing the user to make a contact with the screen. In that case, the operating section 9 is provided as a combination of hardware and software.
- the microprocessor 8 is an exemplary setting determining means.
- FIG. 5 is a schematic representation illustrating a menu being displayed on the LCD monitor 5 when an image stabilization control mode needs to be selected for the digital camera.
- This menu 51 is displayed when the user operates the operating section 9.
- the user selects either the field 52 representing MODE 1 or the field 53 representing MODE 2, thereby setting his or her desired image stabilization control mode.
- This selection may be made using the cross keys or the ENTER button, which form parts of the operating section 9.
- the control mode currently selected may be stored in a flash memory in the microprocessor 8, for example.
- the microprocessor 8 can know whether the control mode currently selected is MODE 1 or MODE 2.
- FIG. 6 is a flowchart showing the shooting operation to be done by the digital camera.
- First, the microprocessor 8 sees if the shutter release button has been pressed halfway (in Step S1). Before a still picture is shot, the LCD monitor 5 presents a through-the-lens image.
- the through-the-lens image is displayed on the LCD monitor 5 by performing the following processing. Specifically, the imager 2 converts the optical signal, which has entered the camera through the lens barrel 14 , into an electrical signal. Then, the A/D converter 105 converts the electrical signal into a digital signal.
- the image processing section 3 subjects the digitized image data to preprocessing, YC conversion, electronic zoom processing and so on, thereby generating monitor image data. And when this monitor image data is input to the LCD monitor 5 , the LCD monitor 5 presents a through-the-lens image.
- The "through-the-lens image" refers to an image that will not eventually be stored in the memory card 7.
- When the user presses the shutter release button, which forms part of the operating section 9, halfway (i.e., if the answer to the query of Step S1 is YES), the microprocessor 8 performs AE processing and AF processing in parallel with each other in Step S2.
- In this preferred embodiment, the microprocessor 8 is supposed to perform the AE processing and the AF processing in parallel with each other.
- the present invention is in no way limited to this specific preferred embodiment.
- Alternatively, the microprocessor 8 may perform the AE processing first and then the AF processing, or the AF processing first and then the AE processing.
- the microprocessor 8 determines the exposure value based on the image data that has been processed by the image processing section 3. Then, the microprocessor 8 sets an appropriate shutter speed based on the exposure value. That is to say, the microprocessor 8 sets the exposure period of the imager 2 according to the exposure value. In this manner, the AE processing gets done by the digital camera. It should be noted that the shutter speed that has been set during the AE processing is a temporary setting. In a subsequent processing step, the microprocessor 8 will correct the shutter speed that was set during the AE processing to determine the final shutter speed.
- the microprocessor 8 adjusts the position of the focus lens 12 according to the contrast value of the image data that has been processed by the image processing section 3 such that the contrast value becomes a peak value.
- the focus lens 12 is moved by the focus driving section 103 under the control of the microprocessor 8.
- the microprocessor 8 can get autofocusing processing done. That is to say, the AF processing also gets done by the digital camera (in Step S2 ).
- Next, the microprocessor 8 gets the motion vector of the image data from the image processing section 3 (in Step S3). More specifically, the microprocessor 8 keeps getting motion vectors for a predetermined period of time or more until the shutter release button is pressed fully. Alternatively, the microprocessor 8 may get motion vectors all the time.
- When the shutter release button is pressed fully, the microprocessor 8 determines whether the image stabilization control mode currently selected is MODE 1 or MODE 2 (in Step S5). Also, the microprocessor 8 calculates the average of the motion vectors that were obtained during the predetermined period until the shutter release button was pressed fully, and this average is used as the motion vector magnitude for obtaining a corrected shutter speed value. In this preferred embodiment, the average of the motion vectors obtained during a predetermined period is supposed to be used as the motion vector magnitude. However, the present invention is in no way limited to this specific preferred embodiment. Alternatively, either the average of the absolute values, or the maximum value, of the motion vectors obtained during the predetermined period may be used instead.
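The three reduction options mentioned above (average of the vectors, average of the absolute values, or the maximum) can be sketched as follows; the function name and the mode strings are illustrative assumptions.

```python
def motion_vector_magnitude(vectors, mode="mean"):
    """Reduce per-frame motion vectors (dy, dx) to one scalar magnitude.

    "mean": magnitude of the averaged vector (shake in opposite
    directions cancels out); "mean_abs": average of the per-vector
    magnitudes; "max": the largest per-vector magnitude.
    """
    n = len(vectors)
    if mode == "mean":
        ay = sum(dy for dy, _ in vectors) / n
        ax = sum(dx for _, dx in vectors) / n
        return (ay * ay + ax * ax) ** 0.5
    mags = [(dy * dy + dx * dx) ** 0.5 for dy, dx in vectors]
    if mode == "mean_abs":
        return sum(mags) / n
    return max(mags)  # mode == "max"
```

Note the design difference: two opposing vectors of magnitude 5 average to zero under "mean" but still report 5 under "mean_abs" or "max".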
- If the microprocessor 8 has determined that the control mode currently selected is MODE 1 (i.e., if the answer to the query of Step S5 is MODE 1), then the microprocessor 8 selects a method that uses a correction value map #1 as the method for obtaining a corrected shutter speed value (in Step S9).
- the correction value map #1 will be described later.
- On the other hand, if the control mode currently selected is MODE 2, the microprocessor 8 controls the compensation lens driving section 104 to get an image stabilization operation started (in Step S6). More specifically, the microprocessor 8 calculates the degree of camera shake of the digital camera based on the output of the gyro sensor 10. Then, the compensation lens driving section 104 shifts the image stabilizer lens 11 in such a direction as to cancel the camera shake under the control of the microprocessor 8.
- the microprocessor 8 determines whether or not the magnitude of the output of the gyro sensor 10 is greater than a predetermined value A (in Step S7 ). If the magnitude of the output of the gyro sensor 10 is smaller than the predetermined value A , then the microprocessor 8 advances the control process to Step S9 . On the other hand, if the magnitude of the output of the gyro sensor 10 is greater than the predetermined value A , then the microprocessor 8 selects a method that uses a correction value map #2 as a method for obtaining a corrected shutter speed value (in Step S8 ). The correction value map #2 will be described later along with the correction value map #1.
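The branching of Steps S5 and S7 through S9 described above can be sketched as follows; the function and argument names are assumptions, and the behavior at exactly the predetermined value A follows the "greater than" wording of the text.

```python
def select_correction_map(control_mode, gyro_output, value_a):
    """Sketch of the Step S5/S7 branch.

    MODE 1 always uses correction value map #1 (Step S9). In MODE 2,
    map #2 is selected only when the gyro output indicates significant
    shake, e.g. hand-held shooting (Step S8); with little shake, e.g.
    on a tripod, the process falls back to map #1 (Step S9).
    """
    if control_mode == "MODE 1":
        return "map #1"
    return "map #2" if gyro_output > value_a else "map #1"
```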
- the microprocessor 8 corrects the temporarily set shutter speed in accordance with either the correction value map #1 selected in Step S9 or the correction value map #2 selected in Step S8 , thereby determining a final shutter speed value (in Step S10 ).
- the microprocessor 8 controls the imager 2 to get the exposure operation started. Thereafter, the microprocessor 8 controls the imager 2 and finishes the exposure operation when an exposure period, associated with the final shutter speed value, passes (in Step S11 ). Finally, the image processing section 3 subjects the image data captured to a predetermined type of processing under the control of the microprocessor 8 , thereby writing processed image data on the memory card 7 and ending the series of shooting operations (in Step S12 ).
- the microprocessor 8 determines the exposure value based on the image data that has been processed by the image processing section 3 and then temporarily sets a shutter speed based on that exposure value.
- the microprocessor 8 corrects the temporarily set shutter speed based on the motion vector of the image data that has been processed by the image processing section 3 and on the image stabilization control mode. In that case, the corrected shutter speed value is determined in either Step S8 or Step S9 .
- Next, the method for determining the corrected shutter speed value in Step S8 or Step S9 will be described with reference to FIG. 7.
- FIG. 7 shows a correlation between the motion vector, the image stabilization control mode, and the corrected shutter speed value.
- When the correction value map #1 has been selected, the microprocessor 8 corrects the shutter speed value in accordance with the map shown in FIG. 7. That is to say, if the motion vector magnitude of the image data that has been processed by the image processing section 3 is equal to or smaller than A1, the microprocessor 8 selects SS1 as the corrected shutter speed value. If the motion vector magnitude is greater than A1 but equal to or smaller than A2, the microprocessor 8 selects SS2. If the motion vector magnitude is greater than A2 but equal to or smaller than A3, the microprocessor 8 selects SS3. And if the motion vector magnitude is greater than A3, the microprocessor 8 selects SS4.
- When the correction value map #2 has been selected, the microprocessor 8 corrects the shutter speed value in accordance with the map shown in FIG. 7. That is to say, if the motion vector magnitude of the image data that has been processed by the image processing section 3 is equal to or smaller than B1, the microprocessor 8 selects SS1 as the corrected shutter speed value. If the motion vector magnitude is greater than B1 but equal to or smaller than B2, the microprocessor 8 selects SS2. If the motion vector magnitude is greater than B2 but equal to or smaller than B3, the microprocessor 8 selects SS3. And if the motion vector magnitude is greater than B3, the microprocessor 8 selects SS4.
- If the motion vector magnitude is equal to or smaller than A1, greater than B1 but equal to or smaller than A2, greater than B2 but equal to or smaller than A3, or greater than B3, then SS1, SS2, SS3 or SS4, respectively, is selected as the corrected shutter speed value for both MODE 1 and MODE 2.
- If the motion vector magnitude is greater than A1 but equal to or smaller than B1, greater than A2 but equal to or smaller than B2, or greater than A3 but equal to or smaller than B3, however, the corrected shutter speed values are different between MODE 1 and MODE 2.
- For example, if the motion vector magnitude is greater than A1 but equal to or smaller than B1, the microprocessor 8 determines SS2 and SS1 as the corrected shutter speed values for MODE 1 and MODE 2, respectively. That is to say, the final shutter speed for MODE 2 becomes lower than the one for MODE 1.
- the microprocessor 8 controls the shutter control section 101 such that if the motion vector magnitude is equal to a certain value (i.e., greater than A1 but equal to or smaller than B1, or greater than A2 but equal to or smaller than B2, or greater than A3 but equal to or smaller than B3 in this preferred embodiment), then the shutter speed for MODE 2 is lower than the one for MODE 1.
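The two correction value maps can be sketched as threshold lookups. The numeric thresholds below are invented for illustration (FIG. 7's actual values are not given in the text); they only preserve the relation A1 < B1, A2 < B2, A3 < B3 implied above.

```python
def corrected_shutter_speed(magnitude, thresholds):
    """Look up the corrected shutter speed from a correction value map.

    `thresholds` is (T1, T2, T3): (A1, A2, A3) for map #1 or
    (B1, B2, B3) for map #2. Magnitudes at or below T1 give SS1,
    and so on; above T3 gives SS4.
    """
    for threshold, speed in zip(thresholds, ("SS1", "SS2", "SS3")):
        if magnitude <= threshold:
            return speed
    return "SS4"

MAP_1 = (2.0, 4.0, 6.0)  # hypothetical A1, A2, A3
MAP_2 = (3.0, 5.0, 7.0)  # hypothetical B1, B2, B3
```

With these placeholder thresholds, a magnitude between A1 and B1 (e.g. 2.5) yields SS2 under map #1 (MODE 1) but SS1 under map #2 (MODE 2), reproducing the difference discussed above.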
- the present invention is applicable to any other situation as long as there are multiple image stabilization control modes with motion vector magnitudes that will result in mutually different corrected shutter speed values.
- the present invention is applicable to a situation where the corrected shutter speed values are different between those control modes in just a part of the motion vector magnitude range as in the preferred embodiment described above and to a situation where the corrected shutter speed values are different between those control modes in the entire motion vector magnitude range.
- In MODE 1, when the user presses the shutter release button halfway at the time t11 (in Step S1), the microprocessor 8 performs the AE processing and the AF processing (in Step S2).
- the microprocessor 8 obtains motion vectors from the image processing section 3 (in Step S3 ) and continues getting them until the user presses the button fully (in Step S4 ). And when the user presses the shutter release button fully at a time t13 (in Step S5 ), the process advances to Step S9 in which the shutter speed is corrected.
- the microprocessor 8 retains the motion vectors, which have been estimated by the motion detecting section 34 between the times t12 and t13 , in the internal memory.
- the microprocessor 8 calculates the average of those internally retained motion vectors and then defines the average as a motion vector magnitude for use to determine a corrected shutter speed value.
- the microprocessor 8 obtains the corrected shutter speed value in accordance with the correction value map #1 shown in FIG. 7 (in Step S9 ). As the motion vector magnitude is obtained as described above, the microprocessor 8 determines the corrected shutter speed value based on that value and the relation shown in FIG. 7 . Thereafter, the microprocessor 8 adds the corrected shutter speed value to the shutter speed value that has been temporarily set in Step S2 , thereby determining the final shutter speed.
- For example, if the temporarily set shutter speed is 1/15 second and the correction raises it by one level, then the final shutter speed becomes 1/30 second.
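The arithmetic of the example above can be expressed exactly with fractions: each correction level halves the exposure duration (one stop). Treating the correction as a level count is an assumed generalization of the single-level example.

```python
from fractions import Fraction

def apply_correction_levels(shutter_seconds, levels):
    """Raise a shutter speed by `levels` stops, halving the exposure
    duration per stop (e.g. 1/15 s raised one level becomes 1/30 s)."""
    return shutter_seconds / 2 ** levels
```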
- In Step S11, the imager 2 starts performing an exposure operation and continues the exposure for the period of time associated with the final shutter speed, thereby obtaining captured image data.
- the image processing section 3 subjects the captured image data to predetermined processing and then stores it in the memory card 7 . In this manner, a series of shooting operations in MODE 1 gets done.
- the exposure operation is supposed to start immediately. However, there could be some time lag between the time when the button is pressed fully and the time when the exposure operation is started.
- in Step S2, the microprocessor 8 performs the AE processing and the AF processing.
- the microprocessor 8 obtains motion vectors from the image processing section 3 (in Step S3) and continues obtaining them until the user presses the shutter release button fully (in Step S4). When the user presses the shutter release button fully at a time t23 (in Step S5), the microprocessor 8 starts performing an image stabilization control at the time t23 (in Step S6). At this point in time, the microprocessor 8 retains the motion vectors, which have been estimated by the motion detecting section 34 between the times t22 and t23, in the internal memory. The microprocessor 8 calculates the average of those internally retained motion vectors and then defines the average as a motion vector magnitude for use to determine a corrected shutter speed value.
- the microprocessor 8 determines whether or not the output of the gyro sensor 10 is greater than a predetermined value A. If the user is holding the digital camera with his or her hands, for example, the digital camera has significant camera shake. In that case, the output of the gyro sensor 10 should be greater than the predetermined value A. On the other hand, if the digital camera is fixed on a tripod, for example, then the digital camera has little camera shake. In that case, the output of the gyro sensor 10 should be smaller than the predetermined value A.
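That decision can be sketched as follows; the function name is illustrative, and the concrete value of the predetermined threshold A is not given by the patent.

```python
def select_correction_map(gyro_output, threshold_a):
    """In MODE 2, pick correction value map #2 when the gyro sensor output
    exceeds the predetermined value A (significant camera shake, e.g. a
    hand-held camera), and map #1 when the camera is effectively still,
    e.g. fixed on a tripod."""
    return 2 if abs(gyro_output) > threshold_a else 1
```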
- the microprocessor 8 obtains the corrected shutter speed value in accordance with the correction value map #2 shown in FIG. 7 (in Step S8). Since the motion vector magnitude has been obtained as described above, the microprocessor 8 determines the corrected shutter speed value based on that magnitude and the relation shown in FIG. 7. Thereafter, the microprocessor 8 adds the corrected shutter speed value to the shutter speed value that has been temporarily set in Step S2, thereby determining the final shutter speed (in Step S10).
- the microprocessor 8 obtains the corrected shutter speed value in accordance with the correction value map #1 shown in FIG. 7 (in Step S9). Since the motion vector magnitude has been obtained as described above, the microprocessor 8 determines the corrected shutter speed value based on that magnitude and the relation shown in FIG. 7. Thereafter, the microprocessor 8 corrects the shutter speed value that has been temporarily set in Step S2 with the corrected shutter speed value, thereby determining the final shutter speed (in Step S10).
- the image stabilizer lens 11 does not change its position until just before a still picture starts to be shot (at the time t23), and the image stabilization control is started only at that time, just before the shooting is started. That is why even if the magnitude of the camera shake of the digital camera remains the same before and after the time t23, the blur of the subject's image on the imager 2 has different magnitudes before and after the time t23.
- the corrected shutter speed is obtained based on the motion vector magnitude, which is the average of the motion vectors that have been estimated until just before a still picture starts to be shot. For that reason, there is no problem as long as the magnitude of the image blur during the exposure operation is similar to that of the image blur while the motion vectors are being estimated. However, a problem arises if there is significant camera shake in MODE 2. That is to say, while motion vectors are being estimated, no image stabilization control is performed, and therefore, there is significant image blur. If the shutter speed were corrected with a motion vector that had been estimated in such a state, then the shutter speed would be increased more than necessary, because the image stabilization control performed during the exposure operation leaves little image blur.
- the shutter speed is set based on the correction value map #2 that is different from the one used in MODE 1. That is to say, the microprocessor 8 controls the shutter control section 101 such that if the motion vector magnitude is equal to a certain value (i.e., greater than A1 but equal to or smaller than B1, or greater than A2 but equal to or smaller than B2, or greater than A3 but equal to or smaller than B3 in this preferred embodiment), then the shutter speed for MODE 2 is lower than the one for MODE 1.
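The two-map lookup can be sketched as follows. The boundary values, the correction levels SS1 to SS4, and the collapsing of the A/B sub-band boundaries of FIG. 7 into three shared thresholds are all illustrative assumptions: the patent defines the maps only graphically.

```python
import bisect

# Placeholder boundaries A1 < A2 < A3 for the motion vector magnitude and
# placeholder correction levels; MODE 2 uses smaller corrections (hence a
# lower final shutter speed) in the middle bands, as the text describes.
A1, A2, A3 = 2.0, 4.0, 8.0
MAP_1 = [1, 2, 3, 4]   # correction levels SS1..SS4 for MODE 1
MAP_2 = [1, 1, 2, 3]   # correction value map #2 for MODE 2

def corrected_level(magnitude, mode):
    """Look up the shutter speed correction level for a motion vector
    magnitude: band 0 covers values up to A1, band 3 values above A3."""
    band = bisect.bisect_left([A1, A2, A3], magnitude)
    return (MAP_1 if mode == 1 else MAP_2)[band]
```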
- the exposure period of the imager 2 is supposed to be set using an electronic shutter.
- the present invention is in no way limited to that specific preferred embodiment.
- the exposure period may also be controlled by providing a mechanical shutter for the imager 2 such that the mechanical shutter faces the subject and by adjusting the shutter speed of that mechanical shutter.
- control modes 1 and 2 are supposed to be used as the image stabilization control modes.
- the present invention is in no way limited to that specific preferred embodiment.
- a third control mode may be further provided.
- a temporary shutter speed is supposed to be set and then corrected.
- the shutter speed may also be determined directly based on the image stabilization mode, motion vector and exposure value.
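Whether set temporarily and corrected, or determined directly, the shutter speed must be consistent with the measured exposure value. One common relation is the APEX equation EV = log2(N²/t); using it here is an assumption, since the patent does not specify how the exposure value maps to a shutter speed.

```python
def shutter_time_from_ev(exposure_value, f_number):
    """Solve the APEX relation EV = log2(N^2 / t) for the exposure time t,
    where N is the f-number. Illustrative only: the patent states merely
    that an appropriate shutter speed is set based on the exposure value."""
    return (f_number ** 2) / (2 ** exposure_value)
```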
- the AF processing is supposed to be performed in Step S2 shown in FIG. 6 .
- the present invention could also cope with a manual focus operation instead of performing the AF processing.
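The contrast-based AF processing of Step S2 can be sketched as a peak search over focus lens positions. This is a simplified hill-climb: the function names are assumptions, and `contrast_at` stands in for a contrast measurement obtained through the image processing section.

```python
def autofocus(contrast_at, positions):
    """Scan candidate focus lens positions in order and stop once the
    contrast value starts falling, returning the position at the peak."""
    best_pos, best_val = positions[0], contrast_at(positions[0])
    for pos in positions[1:]:
        val = contrast_at(pos)
        if val < best_val:        # contrast started falling: peak passed
            break
        best_pos, best_val = pos, val
    return best_pos
```

A real camera moves the focus lens incrementally via the focus driving section rather than evaluating every position, but the stopping criterion is the same.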
- the motion of an image shot is supposed to be detected by estimating a motion vector.
- the present invention is in no way limited to that specific preferred embodiment.
- the motion of an image shot can also be figured out by calculating the differences between the pixel values of the previous and next frames and integrating them together.
- the present invention is applicable to any situation as long as the motion of an image shot can be calculated.
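The frame-difference alternative described above can be sketched as follows; representing frames as flat lists of pixel values is an illustrative simplification.

```python
def frame_difference(prev_frame, next_frame):
    """Integrate the absolute pixel differences between the previous and
    next frames; a larger total indicates more motion in the image shot."""
    return sum(abs(a - b) for a, b in zip(prev_frame, next_frame))
```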
- the gain applied to the image data may be controlled according to the shutter speed value. For example, if the shutter speed is high, a dark image may be brightened by increasing the gain. On the other hand, if the shutter speed is low, the noise of an image shot may be reduced by decreasing the gain.
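A sketch of that gain rule, assuming a simple inverse-proportional relation and a 1/60 s reference exposure; both the formula and the reference value are assumptions, since the patent only states the qualitative behavior.

```python
def gain_for_exposure(exposure_time_s, reference_time_s=1 / 60):
    """Raise the gain for short (fast-shutter) exposures to brighten a dark
    image, and lower it for long exposures to keep noise down."""
    return reference_time_s / exposure_time_s
```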
- the image data that has been digitized may have its gain controlled by the preprocessing section 31.
- the present invention is in no way limited to that specific preferred embodiment.
- the image data as an analog signal may have its gain controlled before being passed to the A/D converter 105.
- the present invention is applicable for use in an image capture device that has multiple image stabilization control modes. Specifically, the present invention can be used effectively in a digital still camera or a cellphone with a camera function, for example.
Description
- The present invention relates to an image capture device that can change the shutter speeds of a mechanical shutter and/or an electronic shutter, and more particularly relates to an image capture device with an image stabilizing function.
- Recently, digital still cameras with an image stabilizing function have been developed and put on sale one after another. The image can be stabilized either by optically compensating for the blur of the subject's image or by increasing the shutter speed with the sensitivity of the imager increased.
- An image capture device with such a function is disclosed in Patent Document No. 1. The device is designed to estimate the motion vector of an image being shot and determine the shutter speed based on that motion vector. In this manner, the image blur caused by a motion of the subject can be reduced and the SNR of the image information can be increased while an image of a still subject is being shot.
Patent Document No. 1: Japanese Patent Application Laid-Open Publication No. 8-327917
- However, it was discovered that if an image capture device with an optical image stabilizing function that could be performed in any of multiple selectable control modes had its shutter speed adjusted according to the motion vector, some problems occurred.
- More specifically, according to the image stabilizing control mode selected, the magnitudes of the motion vectors estimated could be different even if the magnitude of shake of the image capture device was the same. It was discovered that if the exposure were adjusted in such a situation by the magnitude of the motion vector alone, then an inappropriate exposure value could be selected for the magnitude of actual camera shake. That is to say, even if the image shot were not actually blurred so much, the exposure value could be unnecessarily small.
- In order to overcome the problems described above, the present invention has an object of providing an image capture device that can shoot an image at a more appropriate exposure value with the image blur compensated for properly.
- To achieve this object, an image capture device according to the present invention is designed to adjust an exposure value for shooting an image. The device includes: a motion detecting section for detecting a motion of an image shot; a blur compensating section for optically compensating for a blur of the image shot; a mode selecting section for selecting one of multiple control modes for the blur compensating section; and a setting determining section for determining a setting related to an exposure value for shooting an image based on the control mode selected by the mode selecting section and the motion of the image shot that has been detected.
- Then, the exposure value can be adjusted according to the control mode used to perform the image stabilizing function. As a result, the problem with the conventional device, which may select an exposure value inappropriate for the image stabilizing control mode, can be overcome.
- In one preferred embodiment, the image capture device of the present invention may be able to control an exposure value for shooting an image by adjusting a shutter speed of a mechanical shutter and/or an electronic shutter. In that case, the setting determining section determines the shutter speed based on the control mode selected and the motion of the image shot that has been detected.
- Then, the shutter speed can be adjusted according to the control mode used to perform the image stabilizing function. As a result, the problem with the conventional device, which may select an exposure value inappropriate for the image stabilizing control mode, can be overcome.
- The control modes of the blur compensating section may include a first control mode and a second control mode. In the first control mode, the blur compensating section continues to compensate for the blur of the image shot through a period from shooting one still picture to shooting the next still picture. In the second control mode, there exists, during that period, a session in which the blur compensating section either suspends or attenuates the operation of compensating for the blur of the image shot.
- In a situation where the motion of the image shot that has been detected by the motion detecting section has a predetermined value, if the control mode selected is the first control mode, the setting determining section may select a first setting as the exposure value. But if the control mode selected is the second control mode, the setting determining section may select a second setting as the exposure value. In that case, the exposure value to shoot an image is greater when the second setting is selected than when the first setting is selected.
- Thus, the present invention provides an image capture device that can shoot an image at a more appropriate exposure value with the image blur compensated for properly.
FIG. 1 is a block diagram illustrating a configuration for a digital camera as a first preferred embodiment of the present invention.
FIG. 2 is a block diagram illustrating a configuration for the motion detecting section.
FIG. 3 schematically shows how to perform an image stabilization operation in MODE 1.
FIG. 4 schematically shows how to perform an image stabilization operation in MODE 2.
FIG. 5 is a schematic representation illustrating an exemplary menu being displayed on the screen when an image stabilization control mode needs to be selected.
FIG. 6 is a flowchart showing the shooting operation to be done by the digital camera of the first preferred embodiment of the present invention.
FIG. 7 shows how the corrected shutter speed value changes with the magnitude of a motion vector.
- 2: imager
- 8: microprocessor
- 9: operating section
- 10: gyro sensor
- 11: image stabilizer lens
- 34: motion detecting section
- 101: shutter control section
- 104: compensation lens driving section
FIG. 1 is a block diagram illustrating a configuration for a digital camera according to the present invention. As shown in FIG. 1, the camera includes a lens barrel 14, an imager 2, an image processing section 3, and a microprocessor 8 for controlling the digital camera. As the imager 2, a CCD image sensor, a CMOS image sensor or an NMOS image sensor may be used. The image processing section 3 and the microprocessor 8 may be implemented as pieces of hardware with or without a software program installed for a microcomputer. - The
lens barrel 14 includes an image stabilizer lens 11, a focus lens 12 and an iris 13. The image stabilizer lens 11 can move within a plane that intersects with the optical axis of the lens barrel 14 at right angles. Thus, by moving the image stabilizer lens 11 within that plane according to the magnitude of the camera shake, the blur of the subject's image that has been produced on the imager 2 can be compensated for. - The
imager 2 converts the image, which has entered the camera through the lens barrel, into an electrical signal (i.e., analog data). Next, an A/D converter 105 converts the electrical signal that has been generated by the imager 2 into digital image data. - The
image processing section 3 includes a preprocessing section 31, a YC converting section 32, a compressing section 33, and a motion detecting section 34. The preprocessing section 31 performs various types of processing such as gain correction, gamma correction, white balance correction and flaw correction on the input digital image data that has come from the A/D converter 105. The YC converting section 32 separates the preprocessed image data into a color difference signal and a luminance signal. The compressing section 33 subjects the YC converted image data to compression processing. Optionally, the compressing section 33 may also have the function of expanding the compressed image data that has been read out from a memory card 7. A buffer memory 4 is used as a temporary work area to get these types of processing done. The compressed image data is written on the memory card 7 by way of a memory card I/F 6. The images that are stored in the buffer memory 4 and the memory card 7 can be reproduced on an LCD monitor 5. - In accordance with a command entered through the
operating section 9, the microprocessor 8 controls the overall system of this digital camera including the image processing section 3, a shutter control section 101, an iris driving section 102 and a focus driving section 103. Also, based on the magnitude of the camera shake that has been detected by a gyro sensor 10, the microprocessor 8 controls a compensation lens driving section 104 so as to minimize the blur of the subject's image that has been produced on the imager 2. -
FIG. 2 is a block diagram illustrating a configuration for the motion detecting section 34, which detects the motion of an image based on the image data that has been generated by the imager 2. The motion detecting section 34 includes a representative point storage section 341, a correlation calculating section 342 and a motion vector estimating section 343. - The representative
point storage section 341 divides the image signal representing a current frame, which has been supplied from the preprocessing section 31, into a plurality of areas and stores an image signal, associated with a particular representative point included in each of those areas, as a representative point signal. Also, the representative point storage section 341 reads the representative point of the previous frame that has already been stored and passes it to the correlation calculating section 342. - The
correlation calculating section 342 gets the representative point signal of the previous frame from the representative point storage section 341 and also gets the image data of the current frame from the preprocessing section 31, and then calculates the degree of correlation between the representative point signal of the previous frame and the image data of the current frame. This correlation calculation can be done by comparing the difference between the representative point signal of the previous frame and the image signal of the current frame. Thereafter, the output of the correlation calculating section 342 is given to the motion vector estimating section 343. - Based on the result of calculation made by the
correlation calculating section 342, the motion vector estimating section 343 estimates the motion vector between the previous and current frames of the image. If an image element that appeared in the previous frame has moved to a different location in the current frame, the motion vector represents the magnitude and direction of that motion. - The digital camera of this preferred embodiment has at least two modes for controlling the image stabilization operation. That is to say, the
microprocessor 8 has at least two modes for controlling the compensationlens driving section 104. Hereinafter, those control modes will be described with reference toFIGS. 3 and 4 , which schematically show the modes for compensating for the image blur of the digital camera. -
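The representative-point matching performed by the motion detecting section 34 above can be sketched as follows. The search window, the single-pixel representative points, and the 2-D list frame layout are illustrative simplifications of the correlation calculation described earlier.

```python
def estimate_motion_vector(prev_frame, curr_frame, rep_points, search=2):
    """For each representative point of the previous frame, find the
    displacement within +/- `search` pixels that minimizes the absolute
    pixel difference in the current frame, then average the per-point
    displacements into one (dy, dx) motion vector. Frames are 2-D lists
    of pixel values indexed [y][x]."""
    h, w = len(curr_frame), len(curr_frame[0])
    sum_dy = sum_dx = 0
    for (y, x) in rep_points:
        ref = prev_frame[y][x]          # stored representative point signal
        best, best_err = (0, 0), None
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    err = abs(curr_frame[ny][nx] - ref)
                    if best_err is None or err < best_err:
                        best_err, best = err, (dy, dx)
        sum_dy += best[0]
        sum_dx += best[1]
    n = len(rep_points)
    return (sum_dy / n, sum_dx / n)
```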
FIG. 3 shows the relation between the exposure state of the imager 2 and the lens position instructed value, which is given by the microprocessor 8 to the compensation lens driving section 104, in a situation where the microprocessor 8 is controlling the compensation lens driving section 104 in a control mode called "MODE 1". More specifically, portion (a) of FIG. 3 shows how the lens position instructed value output by the microprocessor 8 changes with time. Portion (b) of FIG. 3 shows how the exposure state of the imager 2 changes. And portion (c) of FIG. 3 shows the times of occurrence of respective events. In this example, the respective events and the times are supposed to have the following correspondence. Specifically, the shutter release button is supposed to be pressed halfway at a time t11 and pressed fully at a time t13, and then the imager 2 is supposed to be subjected to an exposure operation between the times t13 and t15. As shown in FIG. 3, in the interval between a session of shooting one still picture and the next session of shooting another still picture, the microprocessor 8 has the compensation lens driving section 104 continue to perform the operation of compensating for the blur of the image shot. Hereinafter, this control mode will be referred to herein as "MODE 1". Although not shown in FIG. 1, the shutter release button forms part of the operating section 9. - By making the
microprocessor 8 control the compensation lens driving section 104 in MODE 1, the image stabilization can be done even while a still picture is not being shot. For example, the image stabilization control can also be performed on a through-the-lens image for use to determine the composition of a still picture. Also, in MODE 1, the microprocessor 8 can drive the compensation lens driving section 104 irrespective of the exposure state of the imager 2, and therefore, the image stabilization operation can be controlled relatively easily. -
FIG. 4 shows the relation between the exposure state of the imager 2 and the lens position instructed value, which is given by the microprocessor 8 to the compensation lens driving section 104, in a situation where the microprocessor 8 is controlling the compensation lens driving section 104 in a control mode called "MODE 2". More specifically, portion (a) of FIG. 4 shows how the lens position instructed value output by the microprocessor 8 changes with time. Portion (b) of FIG. 4 shows how the exposure state of the imager 2 changes. And portion (c) of FIG. 4 shows the times of occurrence of respective events. In this example, the respective events and the times are supposed to have the following correspondence. Specifically, the shutter release button is supposed to be pressed halfway at a time t21 and pressed fully at a time t23, and then the imager 2 is supposed to be subjected to an exposure operation between the times t24 and t25. As shown in FIG. 4, in the interval between a session of shooting one still picture and the next session of shooting another still picture, there exists a period in which the microprocessor 8 has the compensation lens driving section 104 suspend the operation of compensating for the blur of the image shot. Hereinafter, this control mode will be referred to herein as "MODE 2". - By making the
microprocessor 8 control the compensation lens driving section 104 in MODE 2, the image stabilizer lens 11 is driven only when it is necessary to do that to shoot a still picture. That is to say, since the image stabilizer lens 11 is not driven when it is not necessary, the power that would otherwise be dissipated by the compensation lens driving section 104 can be saved. As shown in FIG. 4, even during the interval between the times t23 and t24, the image stabilization function is kept ON. This is done in order to perform the image stabilization operation with good stability during the exposure period by performing the exposure operation after the image stabilization function has been turned ON in advance. The image stabilization function is not turned OFF right after the exposure period is over. This is also done in order to perform the image stabilization operation with good stability during the exposure period. That is why when we say "the image stabilizer lens 11 is driven only when it is necessary to do that to shoot a still picture", the image stabilization function is turned ON not only during the exposure period (i.e., from the time t24 through the time t25) but also during the pre-exposure period (i.e., from the time t23 through the time t24) and the post-exposure period (i.e., from the time t25 on). - In
MODE 2, other than the necessary control period (i.e., from the time t23 through the time t25) to shoot a still picture, the lens position instructed value is supposed to be constant. However, the present invention is in no way limited to that specific preferred embodiment. For example, the lens position instructed value other than the necessary control period (i.e., from the time t23 through the time t25) to shoot a still picture may be smaller than the one during that necessary control period (i.e., from the time t23 through the time t25) to shoot a still picture, and such a control mode may be used as a new control mode. In short, the present invention is applicable to any situation as long as themicroprocessor 8 has multiple modes for controlling the compensationlens driving section 104. - The digital camera of this preferred embodiment is an example of image capture device according to the present invention. Another image capture device according to the present invention could be a cellphone with a camera function, for example.
- The
motion detecting section 34 is an exemplary means for estimating a motion vector. The motion detecting section 34 may be implemented either as a DSP circuit dedicated for estimating a motion vector or by making a general-purpose computer execute a software program for motion detection. - The
gyro sensor 10, the microprocessor 8, the compensation lens driving section 104 and the image stabilizer lens 11 together form an exemplary blur compensating means. Another blur compensating means may be provided by replacing the gyro sensor 10 with an angular velocity sensor, for example. Also, although the camera shake is supposed to be compensated for by driving an inner lens in the preferred embodiment described above, the imager 2 may also be driven instead, or even the lens barrel 14 may be driven in its entirety. In short, any other technique may be adopted as long as the blur of the subject's image can be compensated for optically. - The
operating section 9 is an exemplary mode selecting means. The operating section 9 may be implemented as a piece of hardware such as a button or a dial. Alternatively, the operating section 9 may also be implemented by presenting characters or images on a screen with a touchscreen panel by software processing and allowing the user to make a contact with the screen. In that case, the operating section 9 is provided as a combination of hardware and software. The microprocessor 8 is an exemplary setting determining means. - Hereinafter, it will be described with reference to the accompanying drawings how the digital camera of this preferred embodiment operates.
-
FIG. 5 is a schematic representation illustrating a menu being displayed on the LCD monitor 5 when an image stabilization control mode needs to be selected for the digital camera. This menu 51 is displayed when the user operates the operating section 9. On this menu 51, the user selects either the field 52 representing MODE 1 or the field 53 representing MODE 2, thereby setting his or her desired image stabilization control mode. This selection may be made using cross keys or ENTER button, which form parts of the operating section 9. Also, the control mode currently selected may be stored in a flash memory in the microprocessor 8, for example. Thus, the microprocessor 8 can know whether the control mode currently selected is MODE 1 or MODE 2. -
FIG. 6 is a flowchart showing the shooting operation to be done by the digital camera. - First, the
microprocessor 8 sees if the shutter release button has been pressed halfway (in Step S1). In that case, before a still picture is shot, the LCD monitor 5 presents a through-the-lens image. By making the LCD monitor 5 present a through-the-lens image, the user can determine the composition of the image to be shot while monitoring the through-the-lens image and can get the shooting operation done easily. The through-the-lens image is displayed on the LCD monitor 5 by performing the following processing. Specifically, the imager 2 converts the optical signal, which has entered the camera through the lens barrel 14, into an electrical signal. Then, the A/D converter 105 converts the electrical signal into a digital signal. The image processing section 3 subjects the digitized image data to preprocessing, YC conversion, electronic zoom processing and so on, thereby generating monitor image data. And when this monitor image data is input to the LCD monitor 5, the LCD monitor 5 presents a through-the-lens image. As used herein, the "through-the-lens image" refers to an image that will not be stored in the memory card 7 eventually. - When the user presses halfway the shutter release button, which forms part of the operating section 9 (i.e., if the answer to the query of Step S1 is YES), the
microprocessor 8 performs AE processing and AF processing in parallel with each other in Step S2. In this preferred embodiment, the microprocessor 8 is supposed to perform the AE processing and the AF processing in parallel with each other. However, the present invention is in no way limited to this specific preferred embodiment. Alternatively, the microprocessor 8 may perform the AE processing first and then the AF processing, or perform the AF processing first and then the AE processing. - During the AE processing, the
microprocessor 8 determines the exposure value based on the image data that has been processed by the image processing section 3. Then, the microprocessor 8 sets an appropriate shutter speed based on the exposure value. That is to say, the microprocessor 8 sets the exposure period of the imager 2 according to the exposure value. In this manner, the AE processing gets done by the digital camera. It should be noted that the shutter speed that has been set during the AE processing is a temporary setting. In a subsequent processing step, the microprocessor 8 will correct the shutter speed that was set during the AE processing to determine the final shutter speed. - In parallel with the AE processing, the
microprocessor 8 adjusts the position of the focus lens 12 according to the contrast value of the image data that has been processed by the image processing section 3 such that the contrast value becomes a peak value. Actually, the focus lens 12 is moved by the focus lens driving section 103 under the control of the microprocessor 8. In this manner, the microprocessor 8 can get autofocusing processing done. That is to say, the AF processing also gets done by the digital camera (in Step S2). - Next, the
microprocessor 8 gets the motion vector of the image data from the image processing section 3 (in Step S3). More specifically, the microprocessor 8 keeps getting motion vectors for a predetermined period of time or more until the shutter release button is pressed fully. Alternatively, the microprocessor 8 may always get motion vectors, too. - Subsequently, when the user presses the shutter release button fully (i.e., if the answer to the query of Step S4 is YES), the
microprocessor 8 determines whether the image stabilization control mode currently selected is MODE 1 or MODE 2 (in Step S5). Also, the microprocessor 8 calculates the average of the motion vectors that had been gotten during the predetermined period until the shutter release button was pressed fully. And this average is used as a motion vector magnitude for obtaining a corrected shutter speed value. In this preferred embodiment, the average of motion vectors that have been gotten during a predetermined period is supposed to be used as a motion vector magnitude for obtaining a corrected shutter speed value. However, the present invention is in no way limited to this specific preferred embodiment. Alternatively, either the average of the absolute values, or the maximum value, of the motion vectors that have been gotten during the predetermined period may also be used. - If the
microprocessor 8 has determined that the control mode currently selected is MODE 1 (i.e., if the answer to the query of Step S5 is MODE 1), then the microprocessor 8 selects a method that uses a correction value map #1 as a method for obtaining a corrected shutter speed value (in Step S9). The correction value map #1 will be described later. - On the other hand, if the
microprocessor 8 has determined that the control mode currently selected is MODE 2 (i.e., if the answer to the query of Step S5 is MODE 2), then the microprocessor 8 controls the compensation lens driving section 104 to get an image stabilization operation started (in Step S6). More specifically, the microprocessor 8 calculates the degree of camera shake of the digital camera based on the output of the gyro sensor 10. Then, the compensation lens driving section 104 shifts the image stabilizer lens 11 in such a direction that cancels the camera shake under the control of the microprocessor 8. - Next, the
microprocessor 8 determines whether or not the magnitude of the output of the gyro sensor 10 is greater than a predetermined value A (in Step S7). If the magnitude of the output of the gyro sensor 10 is smaller than the predetermined value A, then the microprocessor 8 advances the control process to Step S9. On the other hand, if the magnitude of the output of the gyro sensor 10 is greater than the predetermined value A, then the microprocessor 8 selects a method that uses a correction value map #2 as a method for obtaining a corrected shutter speed value (in Step S8). The correction value map #2 will be described later along with the correction value map #1. - Subsequently, the
microprocessor 8 corrects the temporarily set shutter speed in accordance with either the correctionvalue map # 1 selected in Step S9 or the correctionvalue map # 2 selected in Step S8, thereby determining a final shutter speed value (in Step S10). - Next, the
microprocessor 8 controls the imager 2 to get the exposure operation started. Thereafter, the microprocessor 8 controls the imager 2 and finishes the exposure operation when an exposure period, associated with the final shutter speed value, passes (in Step S11). Finally, the image processing section 3 subjects the captured image data to a predetermined type of processing under the control of the microprocessor 8, thereby writing the processed image data on the memory card 7 and ending the series of shooting operations (in Step S12). - As described above, the
microprocessor 8 determines the exposure value based on the image data that has been processed by the image processing section 3 and then temporarily sets a shutter speed based on that exposure value. In the digital camera of this preferred embodiment, the microprocessor 8 corrects the temporarily set shutter speed based on the motion vector of the image data that has been processed by the image processing section 3 and on the image stabilization control mode. In that case, the corrected shutter speed value is determined in either Step S8 or Step S9. Hereinafter, a method for determining the corrected shutter speed value will be described with reference to FIG. 7. -
FIG. 7 shows a correlation between the motion vector, the image stabilization control mode, and the corrected shutter speed value. - If the image stabilization control mode currently selected is
MODE 1, then the microprocessor 8 corrects the shutter speed value in accordance with the correction value map #1 shown in FIG. 7. That is to say, if the motion vector magnitude of the image data that has been processed by the image processing section 3 is equal to or smaller than A1, the microprocessor 8 selects SS1 as the corrected shutter speed value. If the motion vector magnitude is greater than A1 but equal to or smaller than A2, the microprocessor 8 selects SS2. Furthermore, if the motion vector magnitude is greater than A2 but equal to or smaller than A3, the microprocessor 8 selects SS3. And if the motion vector magnitude is greater than A3, the microprocessor 8 selects SS4. - Meanwhile, if the image stabilization control mode currently selected is
MODE 2, then the microprocessor 8 corrects the shutter speed value in accordance with the correction value map #2 shown in FIG. 7. That is to say, if the motion vector magnitude of the image data that has been processed by the image processing section 3 is equal to or smaller than B1, the microprocessor 8 selects SS1 as the corrected shutter speed value. If the motion vector magnitude is greater than B1 but equal to or smaller than B2, the microprocessor 8 selects SS2. Furthermore, if the motion vector magnitude is greater than B2 but equal to or smaller than B3, the microprocessor 8 selects SS3. And if the motion vector magnitude is greater than B3, the microprocessor 8 selects SS4. - As shown in
FIG. 7, if the motion vector magnitude is equal to or smaller than A1, greater than B1 but equal to or smaller than A2, greater than B2 but equal to or smaller than A3, or greater than B3, then SS1, SS2, SS3 or SS4, respectively, is selected as the same corrected shutter speed value for both MODE 1 and MODE 2. However, if the motion vector magnitude is greater than A1 but equal to or smaller than B1, greater than A2 but equal to or smaller than B2, or greater than A3 but equal to or smaller than B3, the corrected shutter speed values are different between MODE 1 and MODE 2. For example, if the motion vector magnitude is greater than A1 but equal to or smaller than B1, then the microprocessor 8 determines SS2 and SS1 as the corrected shutter speed values for MODE 1 and MODE 2, respectively. That is to say, the final shutter speed for MODE 2 becomes lower than the one for MODE 1. - As described above, the
microprocessor 8 controls the shutter control section 101 such that if the motion vector magnitude falls within a certain range (i.e., greater than A1 but equal to or smaller than B1, greater than A2 but equal to or smaller than B2, or greater than A3 but equal to or smaller than B3 in this preferred embodiment), then the shutter speed for MODE 2 is lower than the one for MODE 1. - It should be noted that the present invention is applicable to any other situation as long as there are multiple image stabilization control modes with motion vector magnitudes that will result in mutually different corrected shutter speed values. For example, the present invention is applicable to a situation where the corrected shutter speed values are different between those control modes in just a part of the motion vector magnitude range, as in the preferred embodiment described above, and to a situation where they are different between those control modes in the entire motion vector magnitude range.
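The map selection of Steps S5 through S9 and the threshold lookup of FIG. 7 can be summarized in the following illustrative sketch, which is not part of the disclosure itself: A1 through A3, B1 through B3 and SS1 through SS4 are symbolic in this description, so the numeric thresholds and the returned labels below are hypothetical stand-ins, chosen only so that each Bi exceeds the corresponding Ai.

```python
# Hypothetical threshold values; in the description A1..A3, B1..B3 and
# SS1..SS4 are symbolic quantities taken from FIG. 7.
MAP_1 = [(2.0, "SS1"), (4.0, "SS2"), (6.0, "SS3")]  # thresholds A1, A2, A3
MAP_2 = [(3.0, "SS1"), (5.0, "SS2"), (7.0, "SS3")]  # thresholds B1, B2, B3

def select_correction_map(mode, gyro_magnitude, threshold_a):
    """Steps S5-S9: MODE 1 always corrects with map #1; MODE 2 uses
    map #2 only when the gyro sensor reports significant camera shake."""
    if mode == 1:
        return MAP_1                     # Step S9
    # MODE 2: image stabilization has already been started (Step S6)
    if gyro_magnitude > threshold_a:     # Step S7
        return MAP_2                     # Step S8
    return MAP_1                         # Step S9 (little shake, e.g. tripod)

def corrected_value(magnitude, value_map):
    """FIG. 7 lookup: the first threshold the motion vector magnitude
    does not exceed decides the corrected shutter speed value."""
    for threshold, value in value_map:
        if magnitude <= threshold:
            return value
    return "SS4"  # above the largest threshold
```

With these hypothetical numbers, a magnitude of 2.5 (between A1 and B1) yields SS2 in MODE 1 but SS1 in MODE 2 with significant shake, i.e. a lower final shutter speed, matching the behavior described above.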
- Hereinafter, it will be described with reference to
FIGS. 3 and 6 how the digital camera of this preferred embodiment operates if the image stabilization mode is MODE 1. - Suppose the user presses the shutter release button halfway at a time t11 before starting shooting (in Step S1). In response, the
microprocessor 8 performs the AE processing and the AF processing. - Next, the
microprocessor 8 obtains motion vectors from the image processing section 3 (in Step S3) and continues obtaining them until the user presses the button fully (in Step S4). And when the user presses the shutter release button fully at a time t13, the process advances through Step S5 to Step S9, in which the shutter speed correction is selected. At this point in time, the microprocessor 8 retains the motion vectors, which have been estimated by the motion detecting section 34 between the times t12 and t13, in its internal memory. The microprocessor 8 calculates the average of those internally retained motion vectors and then uses that average as the motion vector magnitude for determining a corrected shutter speed value. Also, since the image stabilization control mode currently selected is MODE 1, the microprocessor 8 obtains the corrected shutter speed value in accordance with the correction value map #1 shown in FIG. 7 (in Step S9). With the motion vector magnitude obtained as described above, the microprocessor 8 determines the corrected shutter speed value based on that magnitude and the relation shown in FIG. 7. Thereafter, the microprocessor 8 adds the corrected shutter speed value to the shutter speed value that was temporarily set in Step S2, thereby determining the final shutter speed. For example, in a situation where the shutter speed is selectable from 1/8, 1/15, 1/30 and 1/60 seconds, if the temporarily set shutter speed is 1/15 seconds and the correction shifts the speed one level higher, then the final shutter speed becomes 1/30 seconds. - Thereafter, the
imager 2 starts performing an exposure operation and continues the exposure for a period of time associated with the final shutter speed, thereby obtaining captured image data (in Step S11). Then, the image processing section 3 subjects the captured image data to predetermined processing and stores it in the memory card 7. In this manner, a series of shooting operations in MODE 1 gets done. - In the example described above, as soon as the shutter release button is pressed fully at the time t13, the exposure operation is supposed to start immediately. However, there could be some time lag between the time when the button is pressed fully and the time when the exposure operation is started.
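The MODE 1 procedure just described (average the retained motion vectors, then shift the temporarily set shutter speed by the corrected number of levels) can be sketched as follows. Only the speed ladder of 1/8 to 1/60 seconds and the one-level example come from the text; the function names and the clamping behavior at the end of the ladder are illustrative assumptions.

```python
SPEEDS = [1/8, 1/15, 1/30, 1/60]  # selectable shutter speeds, slow to fast

def motion_magnitude(vectors):
    """Average of the motion vectors retained between t12 and t13."""
    return sum(vectors) / len(vectors)

def final_shutter_speed(temporary, correction_levels):
    """Shift the temporarily set shutter speed by the corrected number
    of levels along the ladder, clamping at the fastest speed
    (the clamp is an assumption, not stated in the text)."""
    index = SPEEDS.index(temporary)
    index = min(index + correction_levels, len(SPEEDS) - 1)
    return SPEEDS[index]
```

With a temporary speed of 1/15 seconds and a one-level correction, this returns 1/30 seconds, as in the example above.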
- Hereinafter, it will be described with reference to
FIGS. 4 and 6 how the digital camera of this preferred embodiment operates if the image stabilization mode is MODE 2. - Suppose the user presses the shutter release button halfway at a time t21 before starting shooting (in Step S1). In response, the
microprocessor 8 performs the AE processing and the AF processing. - Next, the
microprocessor 8 obtains motion vectors from the image processing section 3 (in Step S3) and continues obtaining them until the user presses the button fully (in Step S4). And when the user presses the shutter release button fully at a time t23, the process advances through Step S5 and the microprocessor 8 starts performing image stabilization control at the time t23 (in Step S6). At this point in time, the microprocessor 8 retains the motion vectors, which have been estimated by the motion detecting section 34 between the times t22 and t23, in its internal memory. The microprocessor 8 calculates the average of those internally retained motion vectors and then uses that average as the motion vector magnitude for determining a corrected shutter speed value. - Next, the
microprocessor 8 determines whether or not the output of the gyro sensor 10 is greater than a predetermined value A. If the user is holding the digital camera in his or her hands, for example, the digital camera experiences significant camera shake. In that case, the output of the gyro sensor 10 should be greater than the predetermined value A. On the other hand, if the digital camera is fixed on a tripod, for example, then the digital camera experiences little camera shake. In that case, the output of the gyro sensor 10 should be smaller than the predetermined value A. - Supposing the output of the
gyro sensor 10 is greater than the predetermined value A, the microprocessor 8 obtains the corrected shutter speed value in accordance with the correction value map #2 shown in FIG. 7 (in Step S8). With the motion vector magnitude obtained as described above, the microprocessor 8 determines the corrected shutter speed value based on that magnitude and the relation shown in FIG. 7. Thereafter, the microprocessor 8 adds the corrected shutter speed value to the shutter speed value that was temporarily set in Step S2, thereby determining the final shutter speed (in Step S10). - On the other hand, if the output of the
gyro sensor 10 is smaller than the predetermined value A, the microprocessor 8 obtains the corrected shutter speed value in accordance with the correction value map #1 shown in FIG. 7 (in Step S9). With the motion vector magnitude obtained as described above, the microprocessor 8 determines the corrected shutter speed value based on that magnitude and the relation shown in FIG. 7. Thereafter, the microprocessor 8 corrects the shutter speed value that was temporarily set in Step S2 with the corrected shutter speed value, thereby determining the final shutter speed (in Step S10). - After that, the shooting operation in
MODE 2 is performed as in MODE 1. - As described above, if the image stabilization control mode is
MODE 2, the image stabilizer lens 11 does not change its position until just before a still picture starts to be shot (at the time t23), and the image stabilization control is started only at that time, just before the shooting begins. That is why, even if the magnitude of the camera shake of the digital camera remains the same before and after the time t23, the blur of the subject's image on the imager 2 has different magnitudes before and after the time t23. - According to the present invention, the corrected shutter speed is obtained based on the motion vector magnitude, which is the average of the motion vectors that have been estimated until just before a still picture starts to be shot. For that reason, there is no problem if the magnitude of the image blur during the exposure operation is similar to that of the image blur while the motion vectors are being estimated. However, a problem arises if there is significant camera shake in
MODE 2. That is to say, while the motion vectors are being estimated, no image stabilization control is performed, and therefore, there is significant image blur. If the shutter speed were corrected with a motion vector estimated in such a state, the shutter speed would simply be increased. However, since the image stabilization control is performed during the exposure operation, there is little image blur then, so an image can be shot with little blur even if the shutter speed is not increased. Therefore, if there is significant camera shake in MODE 2, the magnitudes of the image blur differ between the time the motion vectors are being estimated and the time the exposure operation is being performed. Consequently, if the shutter speed were corrected simply according to the estimated motion vector magnitude, it would be increased more than necessary. - For that reason, if the camera shake is significant in
MODE 2, the shutter speed is set based on the correction value map #2, which is different from the one used in MODE 1. That is to say, the microprocessor 8 controls the shutter control section 101 such that if the motion vector magnitude falls within a certain range (i.e., greater than A1 but equal to or smaller than B1, greater than A2 but equal to or smaller than B2, or greater than A3 but equal to or smaller than B3 in this preferred embodiment), then the shutter speed for MODE 2 is lower than the one for MODE 1. - On the other hand, if the camera shake is little, the magnitude of the image blur remains the same even in
MODE 2 just before and after the start of the exposure operation (at the time t23). That is why the shutter speed is set based on the same correction value map as the one used in MODE 1. - The first preferred embodiment of the present invention described above could be modified in various manners as long as those modifications fall within the scope of the present invention. Those modified examples will be described collectively as a second preferred embodiment of the present invention.
- In the first preferred embodiment of the present invention described above, the exposure period of the
imager 2 is supposed to be set using an electronic shutter. However, the present invention is in no way limited to that specific preferred embodiment. Alternatively, the exposure period may also be controlled by providing a mechanical shutter for the imager 2, such that the mechanical shutter faces the subject, and by adjusting the shutter speed of that mechanical shutter. - Also, in the first preferred embodiment of the present invention described above, two control modes called
MODES 1 and 2 are supposed to be provided. - Furthermore, in the first preferred embodiment of the present invention described above, a temporary shutter speed is supposed to be set and then corrected. Alternatively, the shutter speed may also be determined directly based on the image stabilization mode, the motion vector and the exposure value.
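The direct-determination alternative just mentioned (one lookup from the stabilization mode and the detected motion, with no temporary value) could be sketched as a keyed table. This is a hypothetical illustration only: the table, the motion buckets and the numeric values are invented, and for brevity the exposure-value dependence is collapsed into the table entries.

```python
# Hypothetical direct table: keyed by (stabilization mode, motion bucket),
# returning a final shutter speed in seconds. Values are illustrative.
DIRECT_TABLE = {
    (1, "small"): 1/15, (1, "large"): 1/60,
    (2, "small"): 1/15, (2, "large"): 1/30,  # MODE 2 stays slower
}

def shutter_speed_direct(mode, motion_magnitude, small_limit=3.0):
    """Determine the final shutter speed in one step from the control
    mode and the detected motion, without setting a temporary value."""
    bucket = "small" if motion_magnitude <= small_limit else "large"
    return DIRECT_TABLE[(mode, bucket)]
```

Note how the direct table preserves the key property of the first preferred embodiment: for the same large motion, MODE 2 ends up with a lower shutter speed than MODE 1.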
- Also, in the first preferred embodiment of the present invention described above, the AF processing is supposed to be performed in Step S2 shown in
FIG. 6. Optionally, the present invention could also cope with a manual focus operation instead of performing the AF processing. - Furthermore, in the first preferred embodiment of the present invention described above, the motion of an image shot is supposed to be detected by estimating a motion vector. However, the present invention is in no way limited to that specific preferred embodiment. Alternatively, the motion of an image shot can also be figured out by calculating the differences between the pixel values of the previous and next frames and integrating them together. In any case, the present invention is applicable to any situation as long as the motion of an image shot can be calculated.
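The frame-differencing alternative above (calculate the pixel-value differences between the previous and next frames and integrate them) might be sketched as follows; the plain nested-list representation of a frame and the function name are assumptions made for illustration.

```python
def frame_difference(prev_frame, next_frame):
    """Integrate the absolute pixel-value differences between the
    previous and next frames as a simple measure of image motion."""
    return sum(abs(a - b)
               for row_a, row_b in zip(prev_frame, next_frame)
               for a, b in zip(row_a, row_b))
```

A static scene yields zero, and the measure grows with the amount of change between frames, which is all the correction logic needs.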
- Optionally, in the first preferred embodiment of the present invention described above, the gain to the image data may be controlled according to the shutter speed value. For example, if the shutter speed is high, a dark image may be brightened by increasing the gain. On the other hand, if the shutter speed is low, the noise of an image shot may be reduced by decreasing the gain. In the first preferred embodiment of the present invention, the image data that has been digitized by the
preprocessing section 31 may have its gain controlled. However, the present invention is in no way limited to that specific preferred embodiment. For example, the image data as an analog signal may have its gain controlled before being passed to the A/D converter 105. - The present invention is applicable for use in an image capture device that has multiple image stabilization control modes. Specifically, the present invention can be used effectively in a digital still camera or a cellphone with a camera function, for example.
Claims (5)
- An image capture device with an ability to adjust an exposure value for shooting an image, the device comprising: a motion detecting section for detecting a motion of an image shot; a blur compensating section for optically compensating for a blur of the image shot; a mode selecting section for selecting one of multiple control modes for the blur compensating section; and a setting determining section for determining a setting related to an exposure value for shooting an image based on the control mode selected by the mode selecting section and the motion of the image shot that has been detected.
- The image capture device of claim 1, further comprising a shutter,
wherein the device is able to control the exposure value for shooting an image by adjusting a shutter speed of the shutter, and
wherein the setting determining section determines the shutter speed based on the control mode selected and the motion of the image shot that has been detected. - The image capture device of claim 1, wherein the control modes of the blur compensating section include a first control mode and a second control mode, and
wherein in the first control mode, the blur compensating section continues to compensate for the blur of the image shot through a period from shooting one still picture to shooting the next still picture, and
wherein in the second control mode, a session in which the blur compensating section either suspends or attenuates the operation of compensating for the blur of the image shot exists during the period. - The image capture device of claim 3, wherein in a situation where the motion of the image shot that has been detected by the motion detecting section has at least one predetermined value,
if the control mode selected is the first control mode, the setting determining section selects a first setting as the setting related to an exposure value, but
if the control mode selected is the second control mode, the setting determining section selects a second setting as the setting related to an exposure value, and
wherein the exposure value for shooting an image is greater when the second setting is selected than when the first setting is selected. - The image capture device of claim 2, wherein the shutter is a mechanical shutter and/or an electronic shutter.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006197902 | 2006-07-20 | ||
PCT/JP2007/064284 WO2008010559A1 (en) | 2006-07-20 | 2007-07-19 | Imaging apparatus |
Publications (3)
Publication Number | Publication Date |
---|---|
EP2046018A1 true EP2046018A1 (en) | 2009-04-08 |
EP2046018A4 EP2046018A4 (en) | 2009-09-23 |
EP2046018B1 EP2046018B1 (en) | 2012-12-05 |
Family
ID=38956891
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP07791037A Expired - Fee Related EP2046018B1 (en) | 2006-07-20 | 2007-07-19 | Imaging apparatus |
Country Status (5)
Country | Link |
---|---|
US (1) | US8289404B2 (en) |
EP (1) | EP2046018B1 (en) |
JP (1) | JP4916513B2 (en) |
CN (1) | CN101491084A (en) |
WO (1) | WO2008010559A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3342149A4 (en) * | 2015-10-20 | 2018-11-14 | Samsung Electronics Co., Ltd. | Camera module having stabilizer and electronic device including the same |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4916513B2 (en) * | 2006-07-20 | 2012-04-11 | パナソニック株式会社 | Imaging device |
JP5354946B2 (en) * | 2007-05-28 | 2013-11-27 | キヤノン株式会社 | Imaging device and lens device |
JP5406546B2 (en) * | 2009-02-06 | 2014-02-05 | キヤノン株式会社 | Imaging apparatus and control method thereof |
US8478071B2 (en) * | 2009-12-16 | 2013-07-02 | Nvidia Corporation | System and method for constructing a motion-compensated composite image |
US9179062B1 (en) * | 2014-11-06 | 2015-11-03 | Duelight Llc | Systems and methods for performing operations on pixel data |
US9531961B2 (en) | 2015-05-01 | 2016-12-27 | Duelight Llc | Systems and methods for generating a digital image using separate color and intensity data |
US9918017B2 (en) | 2012-09-04 | 2018-03-13 | Duelight Llc | Image sensor apparatus and method for obtaining multiple exposures with zero interframe time |
US9819849B1 (en) | 2016-07-01 | 2017-11-14 | Duelight Llc | Systems and methods for capturing digital images |
US9807322B2 (en) | 2013-03-15 | 2017-10-31 | Duelight Llc | Systems and methods for a digital image sensor |
US10558848B2 (en) | 2017-10-05 | 2020-02-11 | Duelight Llc | System, method, and computer program for capturing an image with correct skin tone exposure |
US10924688B2 (en) | 2014-11-06 | 2021-02-16 | Duelight Llc | Image sensor apparatus and method for obtaining low-noise, high-speed captures of a photographic scene |
US11463630B2 (en) | 2014-11-07 | 2022-10-04 | Duelight Llc | Systems and methods for generating a high-dynamic range (HDR) pixel stream |
WO2018044314A1 (en) | 2016-09-01 | 2018-03-08 | Duelight Llc | Systems and methods for adjusting focus based on focus target information |
CN109803079B (en) * | 2019-02-18 | 2021-04-27 | Oppo广东移动通信有限公司 | Mobile terminal, photographing method thereof and computer storage medium |
JP2021158438A (en) * | 2020-03-25 | 2021-10-07 | キヤノン株式会社 | Image pickup apparatus, control method, and program |
CN113489909B (en) * | 2021-07-30 | 2024-01-19 | 维沃移动通信有限公司 | Shooting parameter determining method and device and electronic equipment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0388936A2 (en) * | 1989-03-22 | 1990-09-26 | Matsushita Electric Industrial Co., Ltd. | Image pickup device |
US5220375A (en) * | 1989-06-21 | 1993-06-15 | Minolta Camera Kabushiki Kaisha | Camera having blurring correction apparatus |
US5245378A (en) * | 1990-07-09 | 1993-09-14 | Canon Kabushiki Kaisha | Image stabilization device |
US6272289B1 (en) * | 1998-09-14 | 2001-08-07 | Canon Kabushiki Kaisha | Camera |
EP1672914A2 (en) * | 2004-12-15 | 2006-06-21 | Canon Kabushiki Kaisha | Image taking apparatus and image taking method |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0455444B1 (en) * | 1990-04-29 | 1997-10-08 | Canon Kabushiki Kaisha | Movement detection device and focus detection apparatus using such device |
US5030984A (en) * | 1990-07-19 | 1991-07-09 | Eastman Kodak Company | Method and associated apparatus for minimizing the effects of motion in the recording of an image |
EP0556666B1 (en) * | 1992-02-06 | 1999-04-28 | Nikon Corporation | Camera with pan shot detecting device |
US5809353A (en) * | 1994-05-16 | 1998-09-15 | Nikon Corporation | Camera which compensates for motion to suppress image blur and terminates motion compensation automatically after exposure |
JPH08327917A (en) | 1995-06-01 | 1996-12-13 | Nikon Corp | Image pickup device |
US7027087B2 (en) * | 1998-08-21 | 2006-04-11 | Nikon Corporation | Electronic camera |
US6778210B1 (en) * | 1999-07-15 | 2004-08-17 | Olympus Optical Co., Ltd. | Image pickup apparatus with blur compensation |
JP3430994B2 (en) | 1999-09-28 | 2003-07-28 | ミノルタ株式会社 | camera |
JP3697129B2 (en) * | 2000-01-20 | 2005-09-21 | キヤノン株式会社 | Imaging device |
US7064777B2 (en) * | 2000-08-31 | 2006-06-20 | Canon Kabushiki Kaisha | Blur correction aparatus, control apparatus to be used in a blur correction apparatus, image taking apparatus, control method to be used in these apparatuses and computer program product to be used with these apparatuses |
DE10348567A1 (en) * | 2002-10-22 | 2004-05-13 | Fuji Photo Optical Co. Ltd. | Image blur correction device |
JP2004361486A (en) * | 2003-06-02 | 2004-12-24 | Nikon Corp | Digital still camera |
JP4478422B2 (en) * | 2003-09-18 | 2010-06-09 | キヤノン株式会社 | Image stabilization device, interchangeable lens, and photographing device |
JP2005266480A (en) * | 2004-03-19 | 2005-09-29 | Nikon Corp | Blur correction apparatus |
US8045009B2 (en) | 2004-05-10 | 2011-10-25 | Hewlett-Packard Development Company, L.P. | Image-exposure systems and methods using detecting motion of a camera to terminate exposure |
JP4556560B2 (en) * | 2004-08-27 | 2010-10-06 | 株式会社ニコン | Blur correction device and camera system |
US7522188B2 (en) | 2004-06-08 | 2009-04-21 | Nikon Corporation | Vibration reduction apparatus having feedback path for motion signal, and camera system |
JP2006033578A (en) * | 2004-07-20 | 2006-02-02 | Canon Inc | Optical instrument compensation device and system |
JP2006171654A (en) * | 2004-12-20 | 2006-06-29 | Olympus Corp | Photographic apparatus |
JP2006186796A (en) * | 2004-12-28 | 2006-07-13 | Casio Comput Co Ltd | Photographic apparatus, photographing method, and photographing program |
US7791643B2 (en) * | 2005-01-28 | 2010-09-07 | Hewlett-Packard Development Company, L.P. | Sequenced response image stabilization |
JP4916513B2 (en) * | 2006-07-20 | 2012-04-11 | パナソニック株式会社 | Imaging device |
2007
- 2007-07-19 JP JP2008525907A patent/JP4916513B2/en not_active Expired - Fee Related
- 2007-07-19 EP EP07791037A patent/EP2046018B1/en not_active Expired - Fee Related
- 2007-07-19 WO PCT/JP2007/064284 patent/WO2008010559A1/en active Search and Examination
- 2007-07-19 CN CNA2007800269302A patent/CN101491084A/en active Pending
- 2007-07-19 US US12/374,356 patent/US8289404B2/en not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0388936A2 (en) * | 1989-03-22 | 1990-09-26 | Matsushita Electric Industrial Co., Ltd. | Image pickup device |
US5220375A (en) * | 1989-06-21 | 1993-06-15 | Minolta Camera Kabushiki Kaisha | Camera having blurring correction apparatus |
US5245378A (en) * | 1990-07-09 | 1993-09-14 | Canon Kabushiki Kaisha | Image stabilization device |
US6272289B1 (en) * | 1998-09-14 | 2001-08-07 | Canon Kabushiki Kaisha | Camera |
EP1672914A2 (en) * | 2004-12-15 | 2006-06-21 | Canon Kabushiki Kaisha | Image taking apparatus and image taking method |
Non-Patent Citations (1)
Title |
---|
See also references of WO2008010559A1 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3342149A4 (en) * | 2015-10-20 | 2018-11-14 | Samsung Electronics Co., Ltd. | Camera module having stabilizer and electronic device including the same |
US10310290B2 (en) | 2015-10-20 | 2019-06-04 | Samsung Electronics Co., Ltd. | Camera module having stabilizer and electronic device including the same |
Also Published As
Publication number | Publication date |
---|---|
EP2046018A4 (en) | 2009-09-23 |
CN101491084A (en) | 2009-07-22 |
US8289404B2 (en) | 2012-10-16 |
US20090251550A1 (en) | 2009-10-08 |
JPWO2008010559A1 (en) | 2009-12-17 |
EP2046018B1 (en) | 2012-12-05 |
JP4916513B2 (en) | 2012-04-11 |
WO2008010559A1 (en) | 2008-01-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2046018B1 (en) | Imaging apparatus | |
EP2081374B1 (en) | Imaging apparatus and its control method | |
US7995852B2 (en) | Imaging device and imaging method | |
US7565068B2 (en) | Image-taking apparatus | |
US7606476B2 (en) | Imaging device and imaging method | |
US8081233B2 (en) | Imaging device and imaging method | |
KR101625893B1 (en) | Image pickup apparatus that periodically changes exposure condition, a method of controlling image pickup apparatus, and storage medium | |
JP4974704B2 (en) | Imaging device | |
EP1808014B1 (en) | Camera and image processing method for camera | |
US8294795B2 (en) | Image capturing apparatus and medium storing image processing program | |
JPWO2006082967A1 (en) | Imaging device | |
JP2009139688A (en) | Focus adjustment device and camera | |
JP2008288975A (en) | Imaging apparatus, imaging method and imaging program | |
US8593545B2 (en) | Imaging apparatus, imaging method, and computer-readable recording medium with switched image capturing mode | |
US8253850B2 (en) | Imaging apparatus and program thereof | |
US8570407B2 (en) | Imaging apparatus, image processing program, image processing apparatus, and image processing method | |
US11190704B2 (en) | Imaging apparatus and control method for performing live view display of a tracked object | |
KR20130057764A (en) | Digital photographing apparatus and control method thereof | |
US20130093945A1 (en) | Imaging apparatus | |
JP2006203346A (en) | Electronic camera | |
US11330179B2 (en) | Imaging device and control method thereof | |
US11336802B2 (en) | Imaging apparatus | |
JP3376156B2 (en) | Imaging device | |
JP5217783B2 (en) | Imaging device | |
JP5393189B2 (en) | Imaging apparatus and image processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20090123 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR |
|
AX | Request for extension of the european patent |
Extension state: AL BA HR MK RS |
|
RBV | Designated contracting states (corrected) |
Designated state(s): DE FR GB |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 20090820 |
|
17Q | First examination report despatched |
Effective date: 20100811 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602007027168 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: H04N0005232000 Ipc: H04N0005235000 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04N 5/232 20060101ALI20120418BHEP Ipc: G03B 5/00 20060101ALI20120418BHEP Ipc: H04N 5/235 20060101AFI20120418BHEP Ipc: G03B 7/097 20060101ALI20120418BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
DAX | Request for extension of the european patent (deleted) | ||
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): DE FR GB |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602007027168 Country of ref document: DE Effective date: 20130131 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20130906 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602007027168 Country of ref document: DE Effective date: 20130906 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 10 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 11 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 12 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20190719 Year of fee payment: 13 Ref country code: FR Payment date: 20190719 Year of fee payment: 13 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20190719 Year of fee payment: 13 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602007027168 Country of ref document: DE |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20200719 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200719 Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200731 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210202 |