EP1845709B1 - Image capturing apparatus, control method therefor, image processing apparatus, image processing method, and program - Google Patents

Image capturing apparatus, control method therefor, image processing apparatus, image processing method, and program

Info

Publication number: EP1845709B1
Application number: EP07106043.8A
Authority: EP (European Patent Office)
Prior art keywords: image, foreign substance, images, area, positional shift
Legal status: Expired - Fee Related
Other languages: German (de), French (fr)
Other versions: EP1845709A3 (en), EP1845709A2 (en)
Inventor: Masafumi Kimura
Current Assignee: Canon Inc
Original Assignee: Canon Inc
Application filed by: Canon Inc
Related publications: EP1845709A2, EP1845709A3, EP1845709B1 (application granted)

Classifications

    • H: Electricity > H04: Electric communication technique > H04N: Pictorial communication, e.g. television
    • H04N23/60: Control of cameras or camera modules
    • H04N23/80: Camera processing pipelines; Components thereof
    • H04N23/811: Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation by dust removal, e.g. from surfaces of the image sensor or processing of the image signal output by the electronic image sensor
    • H04N25/60: Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/61: Noise processing, e.g. detecting, correcting, reducing or removing noise, the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"

Description

    BACKGROUND OF THE INVENTION Field of the Invention
  • The present invention relates to a technique for suppressing the influence of a foreign substance adhering to the neighboring portion of an image sensor on the image quality in a digital camera or the like.
  • Description of the Related Art
  • In recent years, the digitization of cameras has been making rapid progress and, in particular, the so-called digital single-lens reflex camera is becoming popular; it has the same optical layout as the conventional single-lens reflex camera, with an image sensor that executes photo-electric conversion substituting for film. The digital single-lens reflex camera requires no film rewinding/replacement operations. However, once a foreign substance enters the neighboring portion of the image sensor during lens replacement or the like, the camera continues to shoot images in which the foreign substance is captured. This degrades the quality of a series of captured images.
  • Japanese Patent Laid-Open No. 2004-172820 discloses a method of detecting a foreign substance from a plurality of images. According to the invention disclosed in Japanese Patent Laid-Open No. 2004-172820 , the user acquires a plurality of images in advance, detects an unchangeable portion such as contrast over the plurality of images, and detects the position of the foreign substance on the basis of the unchangeable portion. The user can obtain a high-quality image by taking a picture after appropriately removing the foreign substance in a cleaning mode or the like.
  • Japanese Patent Laid-Open No. 2004-222231 discloses a method of correcting a change in luminance due to the presence of a foreign substance from a reference image. According to the invention disclosed in Japanese Patent Laid-Open No. 2004-222231 , the camera captures an image of an object having a uniform luminance serving as a reference to generate a transmittance map from the luminance distribution obtained upon image capture. After that, the user appropriately executes gain correction for the captured image to correct a change in transmittance due to the presence of a foreign substance. This makes it possible to attain a high-quality image.
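  • As an aside, the general transmittance-map idea described above can be sketched in a few lines of code. This is an illustration of the concept only, using numpy and hypothetical values; it is not code from either patent:

```python
import numpy as np

def gain_correct_with_reference(captured, reference):
    """Illustrative reading of the transmittance-map approach: a frame of a
    uniformly luminous object reveals the local light loss caused by dust,
    and dividing the captured image by that transmittance map gains the
    darkened pixels back up."""
    transmittance = reference / np.clip(reference.mean(), 1e-6, None)
    return captured / np.clip(transmittance, 1e-3, None)

# Uniform reference frame with one pixel darkened ~20 % by a dust shadow
reference = np.ones((8, 8)); reference[3, 4] = 0.8
captured = np.full((8, 8), 0.5); captured[3, 4] = 0.4
print(gain_correct_with_reference(captured, reference)[3, 4])  # ~0.5, restored
```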
  • Japanese Patent Laid-Open No. 2000-341582 discloses a method of positioning a plurality of images. According to the invention disclosed in Japanese Patent Laid-Open No. 2000-341582 , a plurality of images are positioned with reference to a feature point with high reliability to generate one image from the plurality of images. Appropriate composition attains a composite image having a wider dynamic range.
  • Unfortunately, the above-described conventional techniques pose the following problems.
  • The invention in Japanese Patent Laid-Open No. 2004-172820 requires the user to execute an operation such as a cleaning mode, so it cannot cope with a foreign substance that adheres immediately before image capture.
  • The invention in Japanese Patent Laid-Open No. 2004-222231 likewise requires the user to execute an operation such as a cleaning mode, so it cannot cope with a foreign substance that adheres immediately before image capture. In addition, if the reference image is inappropriate, for example one strongly affected by texture, appropriate gain correction is impossible.
  • The invention in Japanese Patent Laid-Open No. 2000-341582 cannot cope with degradation in image quality due to the adhesion of a foreign substance.
  • Document US 2004207738 A1 discloses mechanically shifting a sensor matrix by a preset vector in order to detect impurities fixed to the sensor and suppress their effect on the reproduced image in digital photography. By comparing the image data stored before and after the mechanical shift, and from the resulting comparison signal matrix, the locations of sensor-fixed impurities can be detected, because the mechanical shift displaces the image formed by the imaging beam but not the image of the impurities.
  • Document US 2004 0041936 A1 relates to an electronic camera having self-detection function of foreign materials and a control program thereof. The electronic camera includes: an image sensor having an image pickup plane on an image plane of an optical system; a formation changing section changing a state of image formation of the optical system on the image pickup plane; and a control section driving and controlling the image sensor and formation changing section. The control section drives the formation changing section to set a plurality of states of image formation different from each other, and it drives the image sensor to obtain a plurality of images in each of the states and compares the plurality of images to find a part of the image which has not varied with a change in the state of image formation, and determines the found part as image of a foreign material.
  • The WO 2006/022229 A1 discloses an image pickup optical device capable of preventing degradation of quality of a picked-up image due to dust or flaw in an image pickup optical system of the image pickup optical device.
  • SUMMARY OF THE INVENTION
  • The present invention has been made in consideration of the above problems, and aims to obtain an image that is hardly influenced by a foreign substance even when one adheres to the neighboring portion of an image sensor in an image capturing apparatus.
  • According to an aspect of the present invention, there is provided an image processing apparatus as specified in claim 1.
  • According to a further aspect of the present invention, there is provided an image processing method as specified in claim 6.
  • Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
    • Fig. 1 is a block diagram showing the arrangement of a digital camera according to the first embodiment of the present invention;
    • Fig. 2 is a block diagram showing the arrangement of an image processing apparatus according to the first embodiment;
    • Fig. 3 is a block diagram showing an image processing unit extracted from the image processing apparatus shown in Fig. 2, and the data sequence of an image processing method according to the first embodiment;
    • Figs. 4A and 4B are schematic views showing two images to be given to a positional shift detection unit;
    • Fig. 4C is a schematic view when the two images shown in Figs. 4A and 4B are superposed without any positioning;
    • Fig. 4D is a schematic view when the two images shown in Figs. 4A and 4B are superposed after positioning;
    • Figs. 5A to 5C are schematic views showing two images before composition and a composited image;
    • Fig. 6 is a block diagram showing the internal arrangement of an image processing apparatus according to the second embodiment;
    • Figs. 7A and 7B are schematic views showing the relationship between an aperture stop and an image of a foreign substance;
    • Fig. 8 is a block diagram showing an image processing unit extracted from the image processing apparatus shown in Fig. 6, and the data sequence of an image processing method according to the second embodiment; and
    • Fig. 9 is a schematic view showing an image obtained by superposing two given images without any positioning.
    DESCRIPTION OF THE EMBODIMENTS
  • Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
  • (First Embodiment)
  • Fig. 1 is a block diagram showing the arrangement of a digital camera according to the first embodiment of the present invention.
  • Referring to Fig. 1, reference numeral 41 denotes a photographing optical system which forms an object image; 41a, an aperture stop which is accommodated in the photographing optical system 41 and adjusts the amount of light that enters an image sensor 42; and 42, the image sensor which photo-electrically converts the object image. On the front surface of the image sensor 42, an optical member 42a such as a lowpass filter and cover filter is arranged in its proximity. A foreign substance adheres to the surface of the optical member 42a. The adhered foreign substance is captured in the object image on the image sensor 42 as a shadow. Reference numeral 43 denotes an A/D converter which converts an analog image signal output from the image sensor 42 into a digital signal; and 44, an image processing apparatus which processes the digital image signal output from the A/D converter 43. Reference numeral 45 denotes a lens system control unit which controls the lens position of the photographing optical system 41 and the degree of opening of the aperture stop 41a; and 46, various sensors such as an AF (Auto Focus) sensor and AE (Auto Exposure) sensor. Reference numeral 47 denotes a camera control unit which controls the operation of the overall digital camera; 48, an I/O which interfaces with a release switch, display, and the like; and 49, a memory which stores captured images and various types of information.
  • The digital camera acquires the user's operation via the I/O 48 and executes power ON/OFF, an image capturing operation, and the like in accordance with the user's instruction. Upon receiving an image capturing operation instruction, the camera control unit 47 decides an appropriate image capturing condition on the basis of the information obtained from the various sensors 46 or image sensor 42, and sets an appropriate lens position via the lens system control unit 45. The output signal from the image sensor 42 is digitized via the A/D converter 43 after exposure, undergoes an appropriate image process by the image processing apparatus 44, and is saved in the memory 49. A display (not shown) displays the image via the I/O 48 as needed.
  • The image processing apparatus 44 generally executes processes such as white balance adjustment, RGB development, and compression encoding. The image processing apparatus 44 according to the first embodiment comprises a composition unit which composites a plurality of images to generate one output image, in addition to the above processes.
  • Fig. 2 is a block diagram showing the arrangement of the image processing apparatus 44.
  • Referring to Fig. 2, reference numeral 44 denotes the image processing apparatus; 52, an image processing unit which composites and corrects a plurality of images 10a, 10b,...; 53, a white balance adjustment unit; 54, an RGB developing unit; and 55, a compression encoding unit.
  • In the first embodiment, when, e.g., the user designates a mode for correcting image deterioration due to the presence of a foreign substance from a plurality of images, the plurality of images 10a, 10b,... are composited to generate one output image 11, although a detailed description thereof will be omitted. After that, the output image 11 undergoes an appropriate image process to obtain a high-quality compressed image, in which deterioration due to the presence of the foreign substance is corrected.
  • Fig. 3 is a block diagram mainly showing the image processing unit 52 extracted from the image processing apparatus 44 shown in Fig. 2, and the data sequence of an image processing method according to the first embodiment.
  • The plurality of images 10a, 10b,... shown in Figs. 2 and 3 will be explained here.
  • The plurality of images 10a, 10b,... shown in Figs. 2 and 3 are images obtained by continuously shooting the same object. In the first embodiment, assume that one image is generated by continuously shooting the same object and generating and compositing a plurality of images. The reason why the same object is continuously shot to composite the plurality of captured images is as follows.
  • As is well known, when the object exhibits low luminance and a shutter speed high enough for hand-held image capture cannot be attained, camera shake occurs and leads to image deterioration. In this case, a plurality of underexposed images are captured by increasing the shutter speed from a level at which appropriate exposure can be attained but camera shake is likely to occur to a level at which camera shake is less likely to occur. These images are then composited to generate one image with appropriate exposure. That is, this technique captures and composites a plurality of underexposed images, in which camera shake is negligible, to obtain an image with appropriate exposure in which the influence of camera shake is inconspicuous.
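  • A minimal sketch of the shutter-speed arithmetic this implies is shown below; the numbers and the function name are illustrative assumptions, not values from the patent:

```python
import math

def burst_plan(proper_shutter_s, shake_limit_s):
    """Split one properly exposed but shake-prone exposure into the smallest
    number of shorter frames that each stay at or under the shake-free
    shutter limit; compositing the frames restores the proper exposure."""
    n_frames = max(1, math.ceil(proper_shutter_s / shake_limit_s))
    return n_frames, proper_shutter_s / n_frames

# Proper exposure would need 1/8 s, but hand-holding is only safe up to 1/60 s:
print(burst_plan(1 / 8, 1 / 60))  # (8, 0.015625): eight frames at 1/64 s each
```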
  • In the first embodiment, the plurality of images 10a, 10b,... shown in Figs. 2 and 3 are images obtained by continuous shooting for the above purpose. Note that the plurality of images 10a, 10b,... are not limited to images captured for camera shake correction, and may be images captured to be composited for another purpose.
  • An image processing operation according to the first embodiment will be explained with reference to Fig. 3.
  • As shown in Fig. 3, a positional shift detection unit 1 and image composition unit 2 receive the plurality of images 10a, 10b,.... The positional shift detection unit 1 detects the positional shifts between the plurality of images using, e.g., the method disclosed in Japanese Patent Laid-Open No. 2000-341582 . The image composition unit 2 composites the plurality of images into one image by executing positioning and exposure compensation for the plurality of images on the basis of the positional shift information calculated by the positional shift detection unit 1. An image correction unit 3 generates an output image 11 by correcting the composite image on the basis of the composite image obtained by the image composition unit 2, the positional shift information between the plurality of images obtained by the image composition unit 2, and foreign substance area information stored in a memory 4. The foreign substance area information on the memory 4 is information about an area on the neighboring portion of the image sensor, within which a foreign substance detected using, e.g., the method disclosed in Japanese Patent Laid-Open No. 2004-172820 or 2004-222231 exists. This foreign substance area information is information associated with the position and size of a foreign substance existing on the optical member 42a when it is seen in a captured image.
  • The operations of the positional shift detection unit 1 and image composition unit 2 will be explained with reference to Figs. 4A to 4D.
  • Figs. 4A to 4D exemplify a case in which two images are to be composited, for descriptive convenience.
  • Figs. 4A and 4B show a plurality of images (two images in this case) to be given to the positional shift detection unit 1. Fig. 4C is a schematic view when the two images shown in Figs. 4A and 4B are superposed without any positioning.
  • The positional shift detection unit 1 calculates the positional shifts between a plurality of images (two images in this case) using, e.g., the method disclosed in Japanese Patent Laid-Open No. 2000-341582 . Fig. 4C shows the positional shift vector between the two images calculated at this time. The image composition unit 2 composites the two images by executing positioning on the basis of the positional shift vector between the two images calculated by the positional shift detection unit 1. Fig. 4D exemplifies the composited image.
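  • For readers who prefer code, the roles of the positional shift detection unit 1 and the image composition unit 2 can be sketched as follows. This is a simplified sketch assuming purely translational integer shifts and a brute-force search; the actual units may instead use the feature-point method of Japanese Patent Laid-Open No. 2000-341582 or any other detector, and the function names are illustrative:

```python
import numpy as np

def detect_shift(ref, img, search=8):
    """Positional shift detection (sketch): exhaustive search for the integer
    translation that minimizes the mean absolute difference against the
    reference frame."""
    best, best_err = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            err = np.mean(np.abs(ref - np.roll(img, (dy, dx), axis=(0, 1))))
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def composite(images, shifts):
    """Image composition (sketch): roll each frame onto the reference by its
    detected shift and sum, which also compensates the underexposure of the
    individual burst frames."""
    aligned = [np.roll(img, s, axis=(0, 1)) for img, s in zip(images, shifts)]
    return np.clip(np.sum(aligned, axis=0), 0.0, 1.0)

# Two synthetic frames of the same scene, the second displaced by (2, 3) pixels
rng = np.random.default_rng(0)
frame_a = rng.random((32, 32)) * 0.5
frame_b = np.roll(frame_a, (2, 3), axis=(0, 1))
shift = detect_shift(frame_a, frame_b)
print(shift)                                   # (-2, -3): rolls frame_b back onto frame_a
fused = composite([frame_a, frame_b], [(0, 0), shift])
```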
  • Assume here that a foreign substance adheres to the optical member 42a. Referring to Figs. 4A to 4D, reference numerals 20a and 20b denote areas in the two images, where the image quality is degraded due to the adhesion of the foreign substance to the optical member 42a. Reference numerals 21a and 21b denote identical objects in these two images.
  • When the two images are not captured with exactly the same picture composition, the positional shift vector calculated by the positional shift detection unit 1 takes a value other than zero. That is, the vector 22c in Fig. 4C is not a zero vector. Even though the objects 21a and 21b are identical, their positions on the window do not coincide with each other.
  • The image of the foreign substance adhering to the optical member 42a always exists at the fixed place on the window. That is, the areas 20a and 20b where the image quality is degraded due to the adhesion of the foreign substance coincide with each other in Fig. 4C, in which the two images are superposed without any positioning. In contrast, when the two images are appropriately positioned and composited as shown in Fig. 4D, the positions of the objects 21a and 21b coincide with each other while those of the areas 20a and 20b where the image quality is degraded due to the adhesion of the foreign substance do not. That is, the composited image shown in Fig. 4D has two areas, 20a and 20b, where the image quality is degraded.
  • The operation of the image correction unit 3 serving as the main part of the first embodiment will be explained with reference to Figs. 3 and 5A to 5C.
  • The image correction unit 3 receives positional shift information (positional shift vector) calculated by the positional shift detection unit 1, a composite image composited by the image composition unit 2, and foreign substance information (e.g., the position and size of a foreign substance) stored in the memory 4 in advance. The foreign substance information is calculated in advance using, e.g., the method disclosed in Japanese Patent Laid-Open No. 2004-172820 or 2004-222231 , and is accumulated on the memory 4.
  • Figs. 5A to 5C are schematic views showing two images before composition and a composited image, in which Figs. 5A and 5B show two images before composition, and Fig. 5C shows a composited image.
  • Referring to Figs. 5A to 5C, the information on the memory 4 is given as, e.g., the position and size of the foreign substance. As shown in Figs. 5A and 5B, the foreign substance areas 20a and 20b in the two images before composition are specified. As has been described with reference to Figs. 4A to 4D, the positional shift detection unit 1 has already detected the positional shift information of the two images. This makes it possible to easily specify the position of the foreign substance area 20a as a foreign substance area 23b when the image shown in Fig. 5A is positioned and superposed on the image shown in Fig. 5B. Similarly, it is possible to specify the position of the foreign substance area 20b as a foreign substance area 23a when the image shown in Fig. 5B is positioned and superposed on the image shown in Fig. 5A.
  • Each of the areas 23a and 23b is a partial image that lies in the image not containing the corresponding degraded area, at the position corresponding to that degraded area, and is therefore free from deterioration. The degraded areas are corrected using these deterioration-free partial images 23a and 23b. Image correction is executed using an appropriate method, e.g., interchanging the area 20a with the area 23b, interchanging the area 20b with the area 23a, or adjusting the weights in a weighted average of the areas 20a and 23b and of the areas 20b and 23a.
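  • A compact sketch of this correction step is given below. It assumes a boolean dust mask in sensor coordinates and the purely translational shifts used in the earlier sketch; the patch-replacement variant is shown, and weighted averaging would only change the final combination step:

```python
import numpy as np

def correct_dust_areas(images, shifts, dust_mask):
    """Image correction (sketch): the dust mask is fixed in sensor coordinates,
    so after alignment each frame's degraded area lands at a different place.
    For every aligned frame, replace its dust pixels with the co-located
    pixels of another aligned frame that is clean there, then average."""
    aligned = [np.roll(img, s, axis=(0, 1)) for img, s in zip(images, shifts)]
    masks = [np.roll(dust_mask, s, axis=(0, 1)) for s in shifts]
    repaired = []
    for i, (img, mask_i) in enumerate(zip(aligned, masks)):
        fixed = img.copy()
        for j, (donor, mask_j) in enumerate(zip(aligned, masks)):
            if j != i:
                usable = mask_i & ~mask_j      # degraded in frame i, clean in frame j
                fixed[usable] = donor[usable]
        repaired.append(fixed)
    return np.mean(repaired, axis=0)

# With two frames, this amounts to filling area 20a from 23b and area 20b from 23a.
```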
  • The finally obtained image is thus generated using areas free from image deterioration due to the so-called foreign substance, resulting in a high-quality image. Using more than two images makes it possible to counter image deterioration more accurately and over a wider range.
  • As described above, according to the first embodiment, it is possible to suppress the capture of a foreign substance in the captured image to result in a high-quality image.
  • (Second Embodiment)
  • The block arrangement of a digital camera in the second embodiment is the same as that in the first embodiment shown in Fig. 1 except for the internal arrangement of the image processing apparatus 44.
  • Fig. 6 is a block diagram showing the internal arrangement of an image processing apparatus 44 according to the second embodiment. Referring to Fig. 6, reference numeral 44 denotes the image processing apparatus; 52a, a multiple image processing unit; 53, a white balance adjustment unit; 54, an RGB developing unit; and 55, a compression encoding unit. The multiple image processing unit 52a differs from the image processing unit 52 according to the first embodiment shown in Fig. 2 in that a foreign substance area detection unit 5 is provided.
  • The operation of the foreign substance area detection unit 5 according to the second embodiment will be explained with reference to Fig. 6.
  • When, e.g., the user designates a mode for correcting image deterioration due to the presence of a foreign substance from a plurality of images, the foreign substance area detection unit 5 acquires a plurality of images 10a, 10b,... captured with appropriate exposure in an aperture state designated by the user, and a plurality of images 13a, 13b,... obtained by narrowing down the aperture stop 41a by more than a predetermined amount.
  • Figs. 7A and 7B are schematic views showing the relationship between an aperture stop and an image of a foreign substance. Fig. 7A is a schematic view when the aperture stop is opened (i.e., the F-number is small). Fig. 7B is a schematic view when the aperture stop is narrowed down (i.e., the F-number is large).
  • Referring to Figs. 7A and 7B, reference numerals 61a and 61b denote pupils; 62a and 62b, ranges on the image sensor over which the influence of a foreign substance reaches; 63, a foreign substance; and 64a and 64b, graphs schematically showing the magnitude of the influence of the foreign substance on the image sensor. Obviously, narrowing down the aperture stop 41a narrows the range over which the influence of the foreign substance reaches and at the same time increases its peak, leading to a greater influence on the image signal. As a result, narrowing down the aperture stop 41a makes the image of the foreign substance clearer and easier to detect, which improves the reliability of foreign substance detection.
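  • The geometry behind this can be put into numbers with a common thin-shadow approximation (my illustration, not a formula from the patent): the exit pupil subtends roughly 1/F at the dust particle, so its penumbra on the sensor spreads by about the dust-to-sensor distance divided by the F-number. The particle size and distance below are hypothetical:

```python
def dust_shadow_width(dust_diameter_mm, dust_to_sensor_mm, f_number):
    """Approximate width of a dust particle's shadow on the sensor under the
    thin-shadow assumption stated above (illustrative, not from the patent)."""
    return dust_diameter_mm + dust_to_sensor_mm / f_number

# A 0.05 mm particle sitting 1.5 mm in front of the sensor:
print(dust_shadow_width(0.05, 1.5, 4))    # ~0.425 mm: wide, faint shadow at f/4
print(dust_shadow_width(0.05, 1.5, 22))   # ~0.118 mm: small, dark shadow at f/22
```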
  • The multiple image processing unit 52a detects a foreign substance using the images captured with the narrowed-down aperture. On the basis of the foreign substance area information detected by this operation, the multiple image processing unit 52a corrects the plurality of images captured with appropriate exposure in the aperture state designated by the user to obtain an output image 11. This makes it possible to obtain a high-quality image corrected on the basis of more accurate foreign substance area information.
  • Fig. 8 is a block diagram mainly showing the image processing unit 52a extracted from the image processing apparatus 44 shown in Fig. 6, and the data sequence of an image processing method according to the second embodiment. An image processing operation according to the second embodiment will be explained with reference to Fig. 8.
  • As shown in Fig. 8, a positional shift detection unit 1 and image composition unit 2 receive the plurality of images 10a, 10b,.... The positional shift detection unit 1 detects the positional shifts between the plurality of images using, e.g., the method disclosed in Japanese Patent Laid-Open No. 2000-341582 . The image composition unit 2 composites the plurality of images into one image by executing positioning and exposure compensation for the plurality of images on the basis of the positional shift information calculated by the positional shift detection unit 1. On the basis of the plurality of images and the positional shift information calculated by the positional shift detection unit 1, the foreign substance area detection unit 5 detects foreign substance area information such as the position and size of a foreign substance using, e.g., the method disclosed in Japanese Patent Laid-Open No. 2004-172820 . An image correction unit 3 generates an output image 11 by correcting the composite image on the basis of the composite image obtained by the image composition unit 2, the positional shift information between the plurality of images obtained by the image composition unit 2, and the foreign substance area information obtained by the foreign substance area detection unit 5.
  • The operations of the positional shift detection unit 1, image composition unit 2, and image correction unit 3 are the same as those in the first embodiment, and a repetitive description thereof will be omitted.
  • The operation of the foreign substance area detection unit 5 will be explained with reference to Fig. 9.
  • Fig. 9 shows an image obtained by superposing a plurality of given images (two given images in this case) without any positioning. Referring to Fig. 9, reference numerals 20a and 20b denote areas where the image quality is degraded due to the presence of a foreign substance. Also referring to Fig. 9, reference numeral 22c denotes a positional shift vector calculated by the positional shift detection unit 1.
  • When the positional shift detection unit 1 appropriately calculates the positional shift information, corresponding points can be found for the finite number of feature points extracted with reference to contrast or the like, except for feature points generated by dirt. Since a feature point generated by dirt, such as the feature point 24 shown in Fig. 9, does not shift between the plurality of images, it has no corresponding point at the head of the shift vector given by the positional shift information. This principle allows an image of a foreign substance to be detected.
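  • One reading of this principle in code is sketched below: feature points whose own displacement stays near zero while the global shift does not are flagged as foreign substance candidates. The array names, coordinates, and tolerance are illustrative assumptions:

```python
import numpy as np

def classify_dust_features(points_a, points_b, global_shift, tol=1.0):
    """Return a boolean flag per feature point: True if the point did not move
    between the frames although the scene as a whole shifted, which marks it
    as a candidate dust shadow fixed in sensor coordinates."""
    displacement = points_b - points_a
    moved_with_scene = np.linalg.norm(displacement - global_shift, axis=1) < tol
    stayed_put = np.linalg.norm(displacement, axis=1) < tol
    return stayed_put & ~moved_with_scene

points_a = np.array([[10.0, 12.0], [40.0, 8.0], [25.0, 25.0]])
global_shift = np.array([3.0, -2.0])
points_b = points_a + global_shift
points_b[2] = points_a[2]   # the third "feature" is a dust shadow: it stayed put
print(classify_dust_features(points_a, points_b, global_shift))  # [False False  True]
```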
  • With, e.g., the above-described method, the position of a foreign substance is specified, and the specified position is given to the image correction unit 3 as foreign substance area information.
  • The image correction unit 3 corrects the image using the method shown in the first embodiment to generate an output image.
  • The finally obtained image is thus generated using areas free from image deterioration due to the so-called foreign substance, resulting in a high-quality image. Using more than two images makes it possible to counter image deterioration more accurately and over a wider range.
  • The second embodiment has exemplified the case in which a plurality of images which are continuously shot with appropriate exposure are composited. However, when a plurality of images are captured with appropriate exposure without camera shake, it may be unnecessary to composite these images. In this case, a foreign substance area in one image alone may be corrected by an image of a corresponding area in the other image. This also makes it possible to eliminate the influence of the foreign substance from the image.
  • The first and second embodiments have exemplified the case in which a plurality of images are composited in the camera. However, the camera may be used only to capture a plurality of images to cause an image processing apparatus outside the camera to execute an image composition process and foreign substance removal process.
  • More specifically, an image processing apparatus such as a PC (personal computer) is configured to incorporate the image processing unit 52 or 52a. Via a detachable recording medium or by connecting the camera and the PC, the PC can receive a plurality of images 10a, 10b,..., a plurality of images 13a, 13b,..., and foreign substance information to be input. With this arrangement, the image processing apparatus outside the camera can attain an image composition process and foreign substance removal process.
  • As has been described above, according to the first and second embodiments, it is possible to provide a digital camera and image processing apparatus capable of suppressing the capture of a foreign substance in the captured image, resulting in a high-quality image. A foreign substance can be detected in the digital camera simultaneously with image capture, or by an image processing apparatus such as a PC after image capture. This obviates the need to prompt the user to execute an operation in a cleaning mode, which improves user-friendliness. It is also possible to appropriately execute a correction operation even for a foreign substance that adheres immediately before image capture.
  • (Other Embodiment)
  • The object of each embodiment can also be achieved by the following method. A storage medium (or recording medium) which records software program codes for implementing the functions of the above-described embodiments is supplied to a system or apparatus. The computer (or CPU or MPU) of the system or apparatus reads out and executes the program codes stored in the storage medium. In this case, the program codes read out from the storage medium themselves implement the functions of the above-described embodiments, and the storage medium which stores the program codes constitutes the present invention. In addition to the case in which the functions of the above-described embodiments are implemented when the computer executes the readout program codes, the present invention also incorporates the case in which the operating system (OS) running on the computer performs part or all of the actual processing on the basis of the instructions of the program codes, and that processing implements the functions of the above-described embodiments.
  • The present invention also incorporates the following case. That is, the program codes read out from the storage medium are written in the memory of a function expansion card inserted into the computer or a function expansion unit connected to the computer. After that, the functions of the above-described embodiments are implemented when the CPU of the function expansion card or function expansion unit performs part or all of actual processing on the basis of the instructions of the program codes.
  • When the present invention is applied to the storage medium, the storage medium stores program codes corresponding to the above-described procedures.
  • While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
An image capturing apparatus includes an image sensor (42) which photo-electrically converts an object image, a memory (49) which stores foreign substance area information associated with a foreign substance adhering to an optical member disposed in front of the image sensor, a positional shift detection unit (44) which detects the mutual positional shifts between a plurality of images generated by sensing almost the same object with the image sensor, an image composition unit (44) which positions and composites the plurality of images on the basis of the detection result obtained by the positional shift detection unit, and an image correction unit (44) which corrects at least parts of the plurality of images on the basis of the detection result obtained by the positional shift detection unit and the foreign substance area information stored in the memory.
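For orientation only, here is a compact Python/NumPy sketch of the overall flow summarized above: detect the mutual positional shift, position and composite the frames, and keep the stored foreign substance rectangles out of the composite. The class name, the use of phase correlation, the pure-translation model, and simple averaging are illustrative assumptions; neither the description nor the claims prescribe these particulars.

```python
import numpy as np

class DustAwareCompositor:
    def __init__(self, dust_areas):
        # dust_areas: list of (y0, y1, x0, x1) rectangles taken from the
        # stored foreign substance area information.
        self.dust_areas = dust_areas

    def detect_shift(self, ref, img):
        """Return the (dy, dx) translation that aligns `img` with `ref`,
        estimated by phase correlation (one possible shift detector)."""
        g0, g1 = ref.mean(axis=-1), img.mean(axis=-1)   # work on luminance
        cross = np.fft.fft2(g0) * np.conj(np.fft.fft2(g1))
        corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-8)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        if dy > g0.shape[0] // 2: dy -= g0.shape[0]
        if dx > g0.shape[1] // 2: dx -= g0.shape[1]
        return int(dy), int(dx)

    def composite(self, images):
        """Align every frame (H x W x 3) to the first one and average them,
        excluding the dust rectangles of each frame from the result."""
        ref = images[0].astype(np.float32)
        acc = np.zeros_like(ref)
        weight = np.zeros(ref.shape[:2], dtype=np.float32)
        for img in images:
            dy, dx = self.detect_shift(images[0], img)
            valid = np.ones(img.shape[:2], dtype=np.float32)
            for (y0, y1, x0, x1) in self.dust_areas:
                valid[y0:y1, x0:x1] = 0.0        # dust stays fixed on the sensor
            shifted = np.roll(img.astype(np.float32), (dy, dx), axis=(0, 1))
            shifted_valid = np.roll(valid, (dy, dx), axis=(0, 1))
            acc += shifted * shifted_valid[..., None]
            weight += shifted_valid
        return (acc / np.maximum(weight, 1e-6)[..., None]).astype(images[0].dtype)
```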

Claims (8)

  1. An image processing apparatus comprising:
    acquisition means (52) for acquiring a plurality of images generated by continuously shooting the same object by an image capturing apparatus, and foreign substance area information which is associated with a foreign substance adhering to an optical member (42a) disposed in front of an image sensor (42) of the image capturing apparatus and contains at least position information of the foreign substance;
    positional shift detection means (1) for detecting mutual positional shifts between the plurality of images; and
    image composition means (2) for positioning and compositing the plurality of images on the basis of the detection result obtained by said positional shift detection means, thereby generating one image, wherein the image composition means positions the plurality of images such that portions of the images not corresponding to the position of the foreign substance coincide, characterized by:
    image correction means (3) for correcting at least part of the plurality of images on the basis of positions of foreign substance areas obtained by said positional shift detection means and the foreign substance area information,
    wherein said image correction means corrects a foreign substance area (20a, 20b), in which the foreign substance is captured, in a first image as one of the plurality of images by an area (23b), in which the foreign substance is not captured, in a second image different from the first image of the plurality of images, and in the second image as one of the plurality of images by an area (23a), in which the foreign substance is not captured, in the first image.
  2. The apparatus according to claim 1, wherein the foreign substance area in which the foreign substance is captured and the area in which the foreign substance is not captured are images, corresponding to the same portion of the object, in the first image and the second image.
  3. An image processing apparatus according to claim 1, further comprising:
    foreign substance area detection means (5) for detecting foreign substance area information,
    wherein said foreign substance area detection means detects, as a foreign substance area, an area which exhibits a positional shift different from positional shifts of the other portions within a window in the detection result obtained by said positional shift detection means.
  4. The apparatus according to claim 3, wherein said foreign substance area detection means receives an image different from the plurality of images.
  5. The apparatus according to claim 4, wherein said foreign substance area detection means receives an image captured with a smaller aperture diameter than a predetermined aperture diameter.
  6. An image processing method comprising:
    an acquisition step of acquiring a plurality of images generated by continuously shooting the same object by an image capturing apparatus, and foreign substance area information which is associated with a foreign substance adhering to an optical member (42a) disposed in front of an image sensor (42) of the image capturing apparatus and contains at least position information of the foreign substance;
    a positional shift detection step of detecting mutual positional shifts between the plurality of images; and
    an image composition step of positioning and compositing the plurality of images on the basis of the detection result obtained in the positional shift detection step, thereby generating one image, wherein the plurality of images is positioned such that portions of the images not corresponding to the position of the foreign substance coincide,
    characterized by:
    an image correction step of correcting at least parts of the plurality of images on the basis of positions of foreign substance areas obtained from the positional shift detection step and the foreign substance area information,
    wherein said image correction step corrects a foreign substance area (20a, 20b), in which the foreign substance is captured, in a first image as one of the plurality of images by an area (23b), in which the foreign substance is not captured, in a second image different from the first image of the plurality of images, and in the second image as one of the plurality of images by an area (23a), in which the foreign substance is not captured, in the first image.
  7. An image processing method according to claim 6, further comprising:
    a foreign substance area detection step (5) of detecting foreign substance area information,
    wherein in said positional shift detection step, positional shift information is detected by detecting corresponding feature points of a finite number of feature points, and
    wherein in said foreign substance area detection step, an area which has no point corresponding to the head of a shift vector given by the positional shift information is detected as a foreign substance area.
  8. A program for causing a computer to execute an image processing method defined in claim 6 or 7.
EP07106043.8A 2006-04-14 2007-04-12 Image capturing apparatus, control method therefor, image processing apparatus, image processing method, and program Expired - Fee Related EP1845709B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2006112750A JP4757085B2 (en) 2006-04-14 2006-04-14 IMAGING DEVICE AND ITS CONTROL METHOD, IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM

Publications (3)

Publication Number Publication Date
EP1845709A2 EP1845709A2 (en) 2007-10-17
EP1845709A3 EP1845709A3 (en) 2011-01-19
EP1845709B1 true EP1845709B1 (en) 2014-07-09

Family

ID=38283172

Family Applications (1)

Application Number Title Priority Date Filing Date
EP07106043.8A Expired - Fee Related EP1845709B1 (en) 2006-04-14 2007-04-12 Image capturing apparatus, control method therefor, image processing apparatus, image processing method, and program

Country Status (5)

Country Link
US (1) US8274582B2 (en)
EP (1) EP1845709B1 (en)
JP (1) JP4757085B2 (en)
KR (1) KR100829470B1 (en)
CN (1) CN100525384C (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8888680B2 (en) * 2008-07-07 2014-11-18 Olympus Medical Systems Corp. Method and apparatus for foreign matter detection for blood content sensors
JP5231119B2 (en) * 2008-07-31 2013-07-10 オリンパス株式会社 Display device
US8130278B2 (en) * 2008-08-01 2012-03-06 Omnivision Technologies, Inc. Method for forming an improved image using images with different resolutions
JP5106335B2 (en) * 2008-09-24 2012-12-26 キヤノン株式会社 Imaging apparatus, control method thereof, and program
JP5131257B2 (en) * 2009-08-27 2013-01-30 カシオ計算機株式会社 Display control apparatus and display control program
JP5462560B2 (en) * 2009-09-08 2014-04-02 キヤノン株式会社 Image processing device
JP2011078047A (en) * 2009-10-02 2011-04-14 Sanyo Electric Co Ltd Imaging apparatus
US8606009B2 (en) * 2010-02-04 2013-12-10 Microsoft Corporation High dynamic range image generation and rendering
EP2503364A1 (en) 2011-03-22 2012-09-26 Koninklijke Philips Electronics N.V. Camera system comprising a camera, camera, method of operating a camera and method for deconvoluting a recorded image
DE102011077296B4 (en) * 2011-06-09 2020-12-10 Carl Zeiss Smt Gmbh Method and device for determining the position of a first structure relative to a second structure or a part thereof
SE1250048A1 (en) * 2012-01-24 2013-07-25 Wesdyne Sweden Ab An image enhancement device for noise reduction in digital images
JP6115069B2 (en) * 2012-10-17 2017-04-19 セイコーエプソン株式会社 Electronic device, control device for electronic device, driving method for electronic device, driving method for electro-optical device
JP6348770B2 (en) * 2014-05-08 2018-06-27 日産自動車株式会社 Camera device and three-dimensional object detection device
WO2016157607A1 (en) * 2015-03-27 2016-10-06 富士フイルム株式会社 Camera device, image processing device, and image processing method
CN105227851B (en) * 2015-11-09 2019-09-24 联想(北京)有限公司 Image processing method and image collecting device
WO2020196536A1 (en) * 2019-03-26 2020-10-01 株式会社小糸製作所 Photographing system and image processing device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006022229A1 (en) * 2004-08-25 2006-03-02 Matsushita Electric Industrial Co., Ltd. Image pickup optical device, pickup image processing system, and pickup image processing program

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09212626A (en) * 1996-02-01 1997-08-15 Hitachi Ltd Image editing method and device therefor
JPH11272855A (en) * 1998-03-23 1999-10-08 Leben:Kk Method for removing obstacle from picked-up image
JP3461482B2 (en) * 1999-02-24 2003-10-27 オリンパス光学工業株式会社 Digital camera and dust position detection method for digital camera
JP4284570B2 (en) * 1999-05-31 2009-06-24 ソニー株式会社 Imaging apparatus and method thereof
GB9921331D0 (en) * 1999-09-09 1999-11-10 Pandora Int Ltd Film restoration system
KR100466458B1 (en) * 1999-09-20 2005-01-14 마츠시타 덴끼 산교 가부시키가이샤 Device for assisting automobile driver
EP1374564A2 (en) * 2001-03-30 2004-01-02 Sinar AG Digital photography method and digital camera
US6928192B2 (en) * 2001-04-19 2005-08-09 Koninklijke Philips Electronics N.V. User interface for interactive removal of defects in an image sequence
US7120315B2 (en) * 2002-03-18 2006-10-10 Creo Il., Ltd Method and apparatus for capturing images using blemished sensors
JP4179079B2 (en) * 2002-08-30 2008-11-12 株式会社ニコン Electronic camera and control program thereof
JP3826878B2 (en) * 2002-11-19 2006-09-27 コニカミノルタフォトイメージング株式会社 Imaging device
EP1583356B1 (en) * 2002-12-27 2013-04-10 Nikon Corporation Image processing device and image processing program
JP4466015B2 (en) 2002-12-27 2010-05-26 株式会社ニコン Image processing apparatus and image processing program
JP2004341582A (en) 2003-05-13 2004-12-02 Nippon Telegr & Teleph Corp <Ntt> Multi-point connection terminal device, connection method and program for connecting between multi-point connection terminal devices, and recording medium with its program recorded thereon
US7209601B2 (en) * 2003-07-22 2007-04-24 Omnivision Technologies, Inc. CMOS image sensor using high frame rate with frame addition and movement compensation
US7206461B2 (en) * 2003-09-30 2007-04-17 Fotonation Vision Limited Digital image acquisition and processing system
JP2006038582A (en) 2004-07-26 2006-02-09 Dainippon Screen Mfg Co Ltd Detection of flaw due to regional division of image
JP4480147B2 (en) * 2004-09-13 2010-06-16 キヤノン株式会社 Imaging apparatus and control method thereof
US7450778B2 (en) * 2005-03-17 2008-11-11 Hewlett-Packard Development Company, L.P. Artifact reduction in a digital video
US7548659B2 (en) * 2005-05-13 2009-06-16 Microsoft Corporation Video enhancement
US7440608B2 (en) * 2005-05-31 2008-10-21 Hewlett-Packard Development Company, L.P. Method and system for detecting image defects
JP2008072565A (en) * 2006-09-15 2008-03-27 Ricoh Co Ltd Imaging device and defective pixel correction method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006022229A1 (en) * 2004-08-25 2006-03-02 Matsushita Electric Industrial Co., Ltd. Image pickup optical device, pickup image processing system, and pickup image processing program
US20070268383A1 (en) * 2004-08-25 2007-11-22 Matsushita Electric Industrial Co., Ltd. Imaging Optical Instrument, Captured Image Processing System, and Captured Image Processing Program

Also Published As

Publication number Publication date
US8274582B2 (en) 2012-09-25
JP4757085B2 (en) 2011-08-24
KR100829470B1 (en) 2008-05-16
JP2007286846A (en) 2007-11-01
CN101056354A (en) 2007-10-17
KR20070102395A (en) 2007-10-18
EP1845709A3 (en) 2011-01-19
EP1845709A2 (en) 2007-10-17
CN100525384C (en) 2009-08-05
US20070242140A1 (en) 2007-10-18

Similar Documents

Publication Publication Date Title
EP1845709B1 (en) Image capturing apparatus, control method therefor, image processing apparatus, image processing method, and program
EP1808014B1 (en) Camera and image processing method for camera
US20080025650A1 (en) Image processing apparatus, control method therefor, and program
US20090135270A1 (en) Imaging apparatus and recording medium
JP2007201534A (en) Imaging apparatus
US20080049117A1 (en) Digital camera capable of displaying and/or recording movie image
JP2008278444A (en) Imaging apparatus
JP2009177503A (en) Imaging apparatus
KR100819811B1 (en) Photographing apparatus, and photographing method
US8576306B2 (en) Image sensing apparatus, image processing apparatus, control method, and computer-readable medium
JP2007329686A (en) Imaging apparatus and its control method
JP5217451B2 (en) Imaging device
JP4953770B2 (en) Imaging device
JP2010035131A (en) Imaging apparatus and imaging method
JP2003101862A (en) Image pickup device and image pickup method
JP5618765B2 (en) Imaging apparatus and control method thereof
JP2009017427A (en) Imaging device
JP5383361B2 (en) Imaging apparatus, control method therefor, and program
JP2006243609A (en) Autofocus device
JP5644180B2 (en) Imaging apparatus, imaging method, and program
JP2012239079A (en) Photographing device
JP2007184787A (en) Digital camera
JP2003315665A (en) Camera
JP4905225B2 (en) Imaging device
JP2011180545A (en) Imaging device

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK YU

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK RS

17P Request for examination filed

Effective date: 20110719

AKX Designation fees paid

Designated state(s): DE FR GB

17Q First examination report despatched

Effective date: 20130221

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602007037532

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: H04N0005217000

Ipc: H04N0005357000

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 5/357 20110101AFI20130904BHEP

INTG Intention to grant announced

Effective date: 20130916

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20140409

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602007037532

Country of ref document: DE

Effective date: 20140821

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602007037532

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20150410

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20151231

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150430

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20160430

Year of fee payment: 10

Ref country code: GB

Payment date: 20160427

Year of fee payment: 10

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602007037532

Country of ref document: DE

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20170412

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20171103

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170412