WO2010067740A1 - Image processing apparatus and image processing method, and data processing apparatus and data processing method - Google Patents


Info

Publication number
WO2010067740A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging lens
image capturing
optical transfer
image
transfer information
Prior art date
Application number
PCT/JP2009/070273
Other languages
French (fr)
Inventor
Tomoe Kikuchi
Original Assignee
Canon Kabushiki Kaisha
Application filed by Canon Kabushiki Kaisha
Priority to US13/122,421 (US8730371B2)
Priority to KR1020117015511 (KR101246738B1)
Priority to EP09831844.7 (EP2377308B1)
Priority to CN200980149785.6 (CN102246505B)
Priority to JP2011516182 (JP5162029B2)
Publication of WO2010067740A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/80 Geometric correction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N 25/61 Noise processing, the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • H04N 25/615 Noise processing involving a transfer function modelling the optical system, e.g. optical transfer function [OTF], phase transfer function [PhTF] or modulation transfer function [MTF]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/663 Remote control for controlling interchangeable camera parts based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 2101/00 Still video cameras

Definitions

  • The present invention relates to an image processing apparatus and image processing method, and a data processing apparatus and data processing method, which correct a degraded captured image.
  • An image recovery process is known which, when an image captured by an image capturing apparatus such as a digital still camera suffers degradation caused by, for example, aberrations, recovers an image free from the degradation.
  • As an image recovery algorithm, a method of expressing image degradation characteristics by a point spread function (PSF) and recovering an image free from degradation based on the PSF is known.
  • Japanese Patent Laid-Open No. 62-127976 discloses an invention which corrects a blur by a filtering process having the inverse characteristics of the PSF.
  • Japanese Patent Laid-Open No. 2004-205802 discloses an invention which generates a Wiener filter from the PSF, and recovers a degraded image using the Wiener filter.
  • Japanese Patent Laid-Open No. 2000-020691 discloses an invention which obtains a high-quality recovered image using characteristic information of an image capturing apparatus.
  • Let (x, y) be position coordinates on a frame, let o(x, y) be an image free from any degradation (to be referred to as a subject image hereinafter), let z(x, y) be an image which is degraded due to out-of-focus, aberrations, camera shake, and so forth (to be referred to as a degraded image hereinafter), and let p(x, y) be the PSF describing the point spread due to a blur.
  • Let O(u, v), Z(u, v), and P(u, v) be the spectra (two-dimensional Fourier transforms) of o(x, y), z(x, y), and p(x, y), respectively. The degradation is then expressed by Z(u, v) = O(u, v)P(u, v) ... (3)
  • P(u, v) is the optical transfer function (OTF), the two-dimensional Fourier transform of the PSF, and its absolute value is the modulation transfer function (MTF).
  • Given a degraded image and the PSF, the spectrum O(u, v) of the subject image can be calculated by computing their spectra and using equation (4), obtained by modifying equation (3). Then, by computing the inverse Fourier transform of the spectrum calculated by equation (4), the subject image o(x, y) can be obtained.
  • O(u, v) = Z(u, v)/P(u, v) ... (4)
  • However, the MTF of the degradation often includes a frequency where its value becomes zero.
  • A zero MTF value means the existence of a frequency component which is not transmitted (whose information is lost) by the degradation. If a frequency where the MTF value becomes zero exists, the subject image cannot be perfectly recovered. Moreover, the inverse filter of the MTF then includes a frequency at which the filter coefficient becomes infinite, and the spectrum value of the subject image becomes indefinite at that frequency.
  • Hence, image recovery often uses a Wiener filter expressed by: W(u, v) = P*(u, v)/(|P(u, v)|² + Γ) ... (5) where P*(u, v) is the complex conjugate of P(u, v), and Γ is a small constant that keeps the filter coefficients finite at frequencies where the MTF becomes zero.
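A minimal NumPy sketch of this kind of Wiener-filter recovery follows; it is an illustration, not the patent's implementation, and the constant `gamma` plays the role of Γ in equation (5):

```python
import numpy as np

def wiener_filter(otf, gamma=0.01):
    """Build the Wiener filter W = P* / (|P|^2 + gamma) from the OTF P(u, v).

    Where the MTF |P| approaches zero, W stays finite instead of blowing
    up like the inverse filter 1/P.
    """
    return np.conj(otf) / (np.abs(otf) ** 2 + gamma)

def recover(degraded, psf, gamma=0.01):
    """Estimate the subject image o(x, y) from the degraded image z(x, y)
    and the point spread function p(x, y), per equations (3)-(5)."""
    P = np.fft.fft2(psf, s=degraded.shape)   # OTF: Fourier transform of PSF
    Z = np.fft.fft2(degraded)                # spectrum of the degraded image
    O = Z * wiener_filter(P, gamma)          # regularized version of eq. (4)
    return np.real(np.fft.ifft2(O))
```

With a small `gamma` and a PSF whose MTF has no zeros, the recovery is nearly exact; a larger `gamma` trades recovery accuracy for noise robustness.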
  • Japanese Patent Laid-Open No. 4-088765 estimates the PSF according to the subject distance, and uses it in recovery of image degradation.
  • Japanese Patent Laid-Open No. 2000-020691 executes a recovery process by correcting the PSF when a flash is used, focusing attention on the fact that the luminance change of a subject during the shutter open period is large when the flash is used, unlike the uniform luminance change during the shutter open period when the flash is not used.
  • In general, recovery process information (recovery filter coefficients, PSF data of the overall image capturing apparatus, etc.) of an overall image capturing apparatus including an imaging lens and a camera body is stored as a database in the camera body or in image processing software. Then, upon execution of a recovery process, recovery process information according to the imaging conditions need only be acquired from the database.
  • The aforementioned method is effective for a digital camera having a fixed combination of an imaging lens and camera body.
  • However, for a camera with interchangeable lenses, pieces of recovery process information have to be held in correspondence with all combinations of the imaging lenses and camera bodies.
  • As a result, the amount of data becomes very large, and it is difficult for each imaging lens or camera body, which has a limited memory size, to hold the recovery process information.
  • Moreover, the recovery process information is fixed data corresponding to a combination of a certain imaging lens and camera body. For this reason, every time a new model of camera body or imaging lens appears, recovery process information corresponding to combinations of the new model and existing models has to be created, and the new recovery process information has to be reflected in the existing database. Such operation forces users of camera bodies and imaging lenses to perform cumbersome operations.
  • An image processing apparatus for an image capturing apparatus using an interchangeable imaging lens, the image processing apparatus comprising: an input section, configured to input optical transfer information of the imaging lens from the imaging lens; an acquisition section, configured to acquire characteristic information of an image capturing unit of the image capturing apparatus; a converter, configured to convert the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; and a generator, configured to generate a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit.
  • An image processing apparatus for an image capturing apparatus using an interchangeable imaging lens, the image processing apparatus comprising: an input section, configured to input optical transfer information of the imaging lens from the imaging lens; an acquisition section, configured to acquire characteristic information of an image capturing unit of the image capturing apparatus; and an output section, configured to output a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens.
  • An image processing apparatus for an image capturing apparatus using an interchangeable imaging lens, the image processing apparatus comprising: an input section, configured to input optical transfer information of the imaging lens from the imaging lens; an acquisition section, configured to acquire characteristic information of an image capturing unit of the image capturing apparatus; a converter, configured to convert the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; a generator, configured to generate a correction filter based on the optical transfer information that depends on the characteristics of the image capturing unit; and an output section, configured to selectively execute, according to an imaging mode of the image capturing apparatus, outputting of image data obtained by correcting degradation of an image captured via the imaging lens using the correction filter, or outputting of a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens.
  • A data processing apparatus for an interchangeable imaging lens of an image capturing apparatus, the data processing apparatus comprising: a receiver, configured to receive characteristic information of an image capturing unit of the image capturing apparatus; an acquisition section, configured to acquire optical transfer information according to a lens setting of the imaging lens; a converter, configured to convert the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; a generator, configured to generate a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit; and a transmitter, configured to transmit the correction filter to the image capturing apparatus.
  • An image processing method of an image capturing apparatus using an interchangeable imaging lens, comprising the steps of: inputting optical transfer information of the imaging lens from the imaging lens; acquiring characteristic information of an image capturing unit of the image capturing apparatus; converting the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; and generating a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit.
  • An image processing method of an image capturing apparatus using an interchangeable imaging lens, comprising the steps of: inputting optical transfer information of the imaging lens from the imaging lens; acquiring characteristic information of an image capturing unit of the image capturing apparatus; and outputting a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens.
  • An image processing method of an image capturing apparatus using an interchangeable imaging lens, comprising the steps of: inputting optical transfer information of the imaging lens from the imaging lens; acquiring characteristic information of an image capturing unit of the image capturing apparatus; converting the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; generating a correction filter based on the optical transfer information that depends on the characteristics of the image capturing unit; and selectively executing, according to an imaging mode of the image capturing apparatus, outputting of image data obtained by correcting degradation of an image captured via the imaging lens using the correction filter, or outputting of a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens.
  • A method of a data processing apparatus for an interchangeable imaging lens of an image capturing apparatus, comprising the steps of: receiving characteristic information of an image capturing unit of the image capturing apparatus; acquiring optical transfer information according to a lens setting of the imaging lens; converting the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; generating a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit; and transmitting the correction filter to the image capturing apparatus.
  • Degradation of an image captured by an image capturing apparatus which uses an interchangeable imaging lens can thereby be corrected.
  • Fig. 1 is a view showing the outer appearance of a digital camera.
  • Fig. 2 is a longitudinal sectional view of the digital camera.
  • Fig. 3 is a view showing the configuration of an image capturing device.
  • Fig. 4 is a block diagram showing an arrangement associated with control, image capturing, and image processes of the digital camera.
  • Fig. 5 is a block diagram for explaining an arrangement for executing a recovery process.
  • Fig. 6 is a view showing the concept of a data structure held by a lens characteristic value memory.
  • Fig. 7 is a graph showing an example of an OTF of an image height "I" at a certain wavelength.
  • Fig. 8 is a flowchart for explaining a recovery process.
  • Fig. 9 is a flowchart for explaining the process for generating coefficients of a recovery filter.
  • Figs. 10A and 10B are graphs illustrating sub-sampling.
  • Fig. 11 is a graph showing an example of spectral transmittance characteristics of RGB filters.
  • Fig. 12 is a chart illustrating generation of recovery filters.
  • Fig. 13 is a flowchart for explaining a recovery process according to the second embodiment.
  • Fig. 14 is a flowchart for explaining the processing sequence according to the third embodiment.
  • Figs. 15 and 16 are flowcharts for explaining the processing sequence according to the fourth embodiment.
  • Fig. 17 is a view showing the concept of a data structure held by a lens characteristic value memory according to the fifth embodiment.
  • Fig. 18 is a flowchart for explaining a process for generating recovery filters according to the sixth embodiment.
  • BEST MODE FOR CARRYING OUT THE INVENTION
  • [0044] An image processing apparatus and image processing method, and a data processing apparatus and data processing method according to embodiments of the present invention will be described in detail hereinafter with reference to the drawings.
  • Fig. 1 shows the outer appearance of a digital camera.
  • On an upper portion of a camera body 100, a viewfinder eyepiece 111, an auto-exposure (AE) lock button 114, a button 113 used to select auto-focus (AF) points, and a release button 112 used to start an image capturing operation are arranged. Also, an imaging mode selection dial 117, a display unit 409, a digital dial 411, and the like are arranged.
  • The digital dial 411 serves as a multifunction signal input unit used to input a numerical value and, together with other operation buttons, to switch the imaging mode.
  • The display unit 409, an LCD panel, displays imaging conditions such as the shutter speed, stop, and imaging mode, and other kinds of information.
  • A liquid crystal display (LCD) monitor 417, which displays captured images, various setting screens, and the like, a switch 121 used to turn on/off display of the LCD monitor 417, cross keys 116, a menu button 124, and the like are also arranged. Since the LCD monitor 417 is of a transmission type, the user cannot view an image by driving only the LCD monitor 417; for this reason, a backlight is required on the rear surface of the LCD monitor 417, as will be described later.
  • The cross keys 116 have four buttons laid out at the upper, lower, right, and left positions, and a setting button laid out at the central position, and are used to select menu items displayed on the LCD monitor 417 and to instruct their execution.
  • The menu button 124 is used to display a menu screen on the LCD monitor 417. For example, when the user wants to select and set an imaging mode, he or she presses the menu button 124, selects a desired imaging mode by operating the upper, lower, right, and left buttons of the cross keys 116, and then presses the setting button while the desired imaging mode is selected, thus completing the setting of the imaging mode. Note that the menu button 124 and cross keys 116 are also used to set an AF mode (to be described later).
  • Fig. 2 is a longitudinal sectional view of the digital camera.
  • An imaging lens 200 of an imaging optical system is an interchangeable lens for the camera body 100, and is attachable to and detachable from the camera body via a lens mount 202.
  • A mirror 203, which is laid out in an imaging optical path having an imaging optical axis 201 as its center, can be quickly moved between a position where it guides subject light from the imaging lens 200 to a viewfinder optical system (slant position) and an escape position outside the imaging optical path.
  • The subject light guided to the viewfinder optical system by the mirror 203 forms an image on a focusing screen 204.
  • The subject light that has been transmitted through the focusing screen 204 passes through a condenser lens 205 and a pentagonal roof prism 206, which are arranged to enhance the visibility of the viewfinder, and is guided to an eyepiece lens 208 and a photometry sensor 207.
  • A first curtain 210 and a second curtain 209 form a focal plane shutter (mechanical shutter), and are opened and closed to expose, for a required period of time, an image capturing device 418, a charge coupled device (CCD) or CMOS sensor, which is laid out behind these curtains.
  • The image capturing device 418 is held on a printed circuit board 211.
  • Another printed circuit board 215 is laid out behind the printed circuit board 211, and the LCD monitor 417 and a backlight 416 are arranged on the opposite surface of the printed circuit board 215.
  • Fig. 3 is a view showing the configuration of the image capturing device 418.
  • The image capturing device 418 is of a single-plate type, and the layout of its color filters is a typical Bayer arrangement.
  • The camera body 100 includes a recording medium 419a on which image data are recorded, and a battery 217 as a portable power supply. Note that the recording medium 419a and battery 217 are detachable from the camera body 100.
  • Fig. 4 is a block diagram showing an arrangement associated with control, image capturing, and image processes of the digital camera.
  • A microcomputer (CPU) 402 controls the operations of the overall camera, such as processing of image data output from the image capturing device 418 and display control of the LCD monitor 417.
  • A switch (SW1) 405 is turned on at the half-stroke position of the release button 112 (halfway pressing state). When it is turned on, the camera is ready to capture an image.
  • A switch (SW2) 406 is turned on at the full-stroke position of the release button 112 (full pressing state). When it is turned on, the camera body 100 starts an image capturing operation.
  • A lens controller 407 communicates with the imaging lens 200 and executes drive control of the imaging lens 200 and of its aperture in an AF mode.
  • A display controller 408 controls the display unit 409 and a display unit (not shown) inside the viewfinder.
  • A switch sense unit 410 is an interface used to transmit signals output from a large number of switches and keys, including the aforementioned digital dial 411, to the CPU 402.
  • A flash controller 412 is grounded through an X sync 412a, and executes emission control and light control of an external flash.
  • To a recording medium drive 419, the recording medium 419a such as a hard disk or memory card is attached.
  • A distance measuring unit 413 detects a defocus amount with respect to a subject to attain AF control.
  • A photometry unit 414 measures the luminance of a subject, and controls the exposure time.
  • A shutter controller 415 controls the mechanical shutter so as to appropriately expose the image capturing device 418.
  • The LCD monitor 417 and backlight 416 form a display device, as described above.
  • An image processor 425 includes a digital signal processor (DSP) and the like. Furthermore, an analog-to-digital converter (A/D) 423, a buffer memory 424 used to buffer image data, and the like are connected to the CPU 402.
  • A camera characteristic value memory 428 is a nonvolatile memory which stores various characteristics of the camera body 100.
  • A lens characteristic value memory 429 is a nonvolatile memory which is included in the body of the imaging lens 200, and stores various characteristics of the imaging lens 200.
  • A recovery filter generator 430 receives lens characteristic values corresponding to the device settings at the time of image capturing from the lens characteristic value memory 429 when the switch (SW2) 406 is turned on to set an image capturing state, as will be described in detail later. Furthermore, the recovery filter generator 430 reads out camera characteristic values corresponding to the device settings at the time of image capturing from the camera characteristic value memory 428, and generates a recovery filter as a correction filter used to correct a degraded captured image.
  • Fig. 5 is a block diagram for explaining an arrangement for executing a recovery process.
  • The shutter controller 415 transmits, as lens setting information, an F-value, a distance to the subject (imaging distance), and a zoom position, acquired from the lens controller 407 and the like, to a microcontroller (MPU) 431 of the imaging lens 200.
  • The MPU 431 reads out OTF data, as optical transfer information of the imaging lens 200 corresponding to the received lens setting information, from the lens characteristic value memory 429, and transmits the OTF data to the recovery filter generator 430.
  • The recovery filter generator 430 reads out the device characteristics of the camera body 100 from the camera characteristic value memory 428.
  • The device characteristics include the sensor pitch, optical low-pass filter (LPF) information, spectral transmittance characteristics of the RGB filters, and the like of the image capturing device 418.
  • The recovery filter generator 430 generates coefficients of recovery filters based on the received OTF data and the readout device characteristics, and transmits the coefficients of the recovery filters to the image processor 425.
  • Fig. 6 is a view showing the concept of a data structure held by the lens characteristic value memory 429.
  • The lens characteristic value memory 429 has a pointer table 501 which describes pointers indicating the addresses of OTF tables that store OTF groups according to lens setting information.
  • For the zoom position, for example, the zoom range is divided into ten positions, and indices ranging from 0 to 9 are assigned to these positions from the wide-angle end to the telephoto end.
  • Likewise, the range from the shortest imaging distance to infinity is divided into ten positions. Therefore, the number of pointers described in the pointer table 501 amounts to the number of F-values × 10².
  • The number of OTF data used in generation of one recovery filter is 310. That is, the OTF data transmitted by the MPU 431 are a set of OTF data corresponding to all image heights and respective wavelengths according to the lens setting information.
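The indexing into the pointer table 501 can be sketched as follows; the helper names and the quantization rule are assumptions for illustration, since the patent only specifies the ten-way division per setting and the F-values × 10² pointer count:

```python
def quantize_setting(value, lo, hi, n=10):
    """Map a continuous lens setting (e.g. a zoom position between the
    wide-angle end `lo` and the telephoto end `hi`) to a table index 0..n-1."""
    i = int((value - lo) / (hi - lo) * n)
    return min(max(i, 0), n - 1)  # clamp so lo maps to 0 and hi to n-1

def pointer_index(f_index, zoom_index, distance_index, n_zoom=10, n_distance=10):
    """Flat position of the OTF-table pointer for one (F-value, zoom,
    imaging distance) triple; the table holds n_f x 10 x 10 pointers."""
    return (f_index * n_zoom + zoom_index) * n_distance + distance_index
```

The MPU 431 would look up `pointer_index(...)` in the table and follow the stored pointer to the OTF group for the current lens settings.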
  • Fig. 7 is a graph showing an example of an OTF at image height "I" at a certain wavelength. Note that since the OTF is a complex number, Fig. 7 shows the MTF, the absolute value of the OTF.
  • The spatial frequency in the horizontal direction (x-direction) in an image is represented by fx, and that in the vertical direction (y-direction) by fy. The unit of spatial frequency is line pairs per mm (lp/mm).
  • Line pairs serve as an index of resolution, and represent how many pairs of black and white lines, each having an equal width, are included per mm.
  • Fig. 8 is a flowchart for explaining the recovery process.
  • The recovery filter generator 430 of the camera body 100 receives the OTF data (S101), and reads out the device characteristics of the camera body 100 from the camera characteristic value memory 428 (S708). Then, as will be described in detail later, the recovery filter generator 430 generates coefficients of recovery filters based on the received OTF data and the readout device characteristics (S709), and transmits the coefficients of the recovery filters to the image processor 425 (S710).
  • The image processor 425 applies a developing process such as demosaicing to the capturing data read out from the buffer memory 424 (S711).
  • The capturing data is data before demosaicing (the developing process), also referred to as RAW data hereinafter, obtained by converting a signal output from the image capturing device 418 as an image capturing unit into digital data by the A/D 423.
  • Strictly speaking, the image capturing device 418 forms the image capturing unit, but a combination of the image capturing device 418 and the A/D 423, or even the camera body 100 itself, may also be called an image capturing unit.
  • The image processor 425 corrects the degraded image by applying a recovery filter process, using the received coefficients of the recovery filters, to the image data that has undergone the developing process (S712). Then, the image processor 425 outputs the image data after the recovery filter process to the display controller 408 or recording medium drive 419 (S713).
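The recovery filter process in step S712 amounts to convolving each developed color plane with its real-space recovery filter. A minimal sketch in plain NumPy (hypothetical function name; a real DSP implementation would use an optimized convolution):

```python
import numpy as np

def apply_recovery_filter(image, filters):
    """Convolve each color channel with its recovery filter (step S712).

    image:   H x W x C developed (demosaiced) image
    filters: sequence of C small real-valued kernels, e.g. the three
             17 x 17 recovery filters for R, G and B
    """
    out = np.empty(image.shape, dtype=float)
    for c in range(image.shape[2]):
        k = np.asarray(filters[c], dtype=float)
        kh, kw = k.shape
        ph, pw = kh // 2, kw // 2
        # edge padding so the output has the same size as the input
        padded = np.pad(image[:, :, c], ((ph, ph), (pw, pw)), mode="edge")
        kf = k[::-1, ::-1]  # flip the kernel: convolution, not correlation
        for y in range(image.shape[0]):
            for x in range(image.shape[1]):
                out[y, x, c] = np.sum(padded[y:y + kh, x:x + kw] * kf)
    return out
```

With a delta kernel (1 at the center, 0 elsewhere) the image passes through unchanged, which is a convenient sanity check for the filter machinery.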
  • Fig. 9 is a flowchart for explaining the process (S709) for generating the coefficients of the recovery filters.
  • The recovery filter generator 430 acquires the unit of the spatial frequency of the OTF data (S801). The unit of the spatial frequency in this embodiment is lp/mm.
  • The OTF data are then converted into a sensor pitch-based frequency space, in which the Nyquist frequency is unconditionally expressed as 0.5 lp/pixel and the sampling frequency as 1.0 lp/pixel.
  • For example, if the sensor pitch is 5.0 μm, the Nyquist frequency is 100 lp/mm; if the sensor pitch is 2.0 μm, the Nyquist frequency is 250 lp/mm.
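The sensor-pitch conversion above is simple arithmetic: one line pair needs two pixels, so the Nyquist frequency is 0.5 lp/pixel, i.e. 1/(2 × pitch) in lp/mm. A small sketch:

```python
def nyquist_lp_per_mm(sensor_pitch_um):
    """Nyquist frequency in line pairs/mm for a given sensor pitch.

    One line pair needs two pixels, so Nyquist = 0.5 lp/pixel,
    i.e. 0.5 / pitch when the pitch is expressed in mm.
    """
    pitch_mm = sensor_pitch_um / 1000.0
    return 0.5 / pitch_mm

def lp_per_mm_to_lp_per_pixel(freq_lp_mm, sensor_pitch_um):
    """Convert an OTF frequency axis from lp/mm to the sensor's lp/pixel."""
    return freq_lp_mm * sensor_pitch_um / 1000.0
```

This reproduces the figures in the text: a 5.0 μm pitch gives 100 lp/mm and a 2.0 μm pitch gives 250 lp/mm.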
  • The recovery filter generator 430 sub-samples the OTF data equal to or lower than the Nyquist frequency, from the data converted into the sensor pitch-based frequency space, using the recovery filter size to be finally generated (S804).
  • Figs. 10A and 10B are graphs illustrating sub-sampling.
  • In this example, the Nyquist frequency is 100 lp/mm (0.5 lp/pixel), and MTFs within the region sandwiched between the broken lines shown in Fig. 10A are extracted.
  • Fig. 10B is a graph showing the distribution of MTFs equal to or lower than the Nyquist frequency, assuming that the MTFs are distributed up to 500 lp/mm as shown in Fig. 10A.
  • The recovery filter size may be a predetermined fixed value, or may be decided by the user's designation or depending on other parameters (the number of pixels, ISO speed, etc.) set at the time of image capturing.
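The sub-sampling in step S804 can be illustrated in one dimension (the real OTF is two-dimensional, and the function name is hypothetical): keep only the samples at or below the Nyquist frequency, then resample them onto as many frequency points as the chosen filter size.

```python
import numpy as np

def subsample_otf(freqs, otf, nyquist, filter_size):
    """Keep OTF samples at or below the Nyquist frequency and resample
    them onto `filter_size` evenly spaced frequencies (cf. step S804).

    freqs, otf: 1-D arrays of frequency samples and complex OTF values
    """
    keep = freqs <= nyquist
    f, h = freqs[keep], otf[keep]
    target = np.linspace(0.0, nyquist, filter_size)
    # interpolate real and imaginary parts separately, since the OTF is complex
    return np.interp(target, f, h.real) + 1j * np.interp(target, f, h.imag)
```

For a 17 x 17 recovery filter, the two-dimensional analogue resamples the region inside the broken lines of Fig. 10A onto a 17 x 17 frequency grid.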
  • The recovery filter generator 430 acquires the spectral transmittance characteristics of the RGB filters from the device characteristics, and multiplies the 17 x 17 wavelength-dependent OTF data by the acquired characteristics, thereby converting them into OTF data for the respective RGB components (S805).
  • Fig. 11 is a graph showing an example of the spectral transmittance characteristics of the RGB filters. As shown in Fig. 11, the OTF data of respective wavelengths are multiplied by the transmittances of the corresponding wavelengths of the filter of each component, and the results are normalized by the sum totals of these weights, thereby obtaining OTF data for the respective RGB components.
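The wavelength-to-channel collapse in step S805 is a normalized weighted average. A sketch under the assumption that both the OTF data and the transmittance curves are sampled at the same wavelengths (the data structures here are illustrative, not the patent's):

```python
import numpy as np

def otf_per_channel(otf_by_wavelength, transmittance):
    """Collapse wavelength-dependent OTF data into one OTF per RGB channel.

    otf_by_wavelength: dict wavelength_nm -> 2-D complex OTF array
    transmittance:     dict channel -> dict wavelength_nm -> transmittance

    Each wavelength's OTF is weighted by the channel's transmittance at
    that wavelength, and the sum is normalized by the total weight.
    """
    result = {}
    for ch, curve in transmittance.items():
        weights = {w: curve.get(w, 0.0) for w in otf_by_wavelength}
        total = sum(weights.values())
        acc = sum(wt * otf_by_wavelength[w] for w, wt in weights.items())
        result[ch] = acc / total
    return result
```

Normalizing by the total weight keeps a unit MTF at zero frequency, so the weighting changes the shape of each channel's OTF without rescaling overall brightness.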
  • The recovery filter generator 430 acquires the optical LPF information from the device characteristics, and applies the low-pass filter characteristics indicated by the optical LPF information to the OTF data for the respective RGB components (S806). In this manner, OTF data for the respective RGB components, for the combination of the imaging lens 200 and camera body 100, are calculated.
  • [0094] The recovery filter generator 430 calculates the coefficients of the recovery filters for the respective RGB components using the OTF data obtained in step S806 (S807). Note that since the recovery filter generation method has been described above, a detailed description thereof will not be repeated. In this way, three filters, each including 17 x 17 real values, are generated as the recovery filters of the real space.
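Step S807, going from a channel's combined OTF back to 17 x 17 real-space coefficients, can be sketched as below; this assumes a Wiener-type construction as in equation (5) (the patent does not fix the exact regularization), followed by an inverse Fourier transform:

```python
import numpy as np

def recovery_filter_coefficients(otf, gamma=0.01):
    """Turn one channel's OTF, sampled on a size x size frequency grid
    after the sensor-pitch conversion, RGB weighting and optical-LPF
    application, into real-space recovery filter coefficients (cf. S807)."""
    wiener = np.conj(otf) / (np.abs(otf) ** 2 + gamma)  # regularized inverse
    kernel = np.fft.ifft2(wiener)                       # back to the real space
    kernel = np.fft.fftshift(kernel)                    # center the peak
    return np.real(kernel)
```

For an ideal OTF of all ones (no degradation) and `gamma = 0`, the result is a centered delta kernel, i.e. a recovery filter that leaves the image untouched, which matches intuition.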
  • Fig. 12 is a chart illustrating generation of the recovery filters, and shows the data flow until the recovery filters of the real space for respective RGB components are generated from OTF data.
  • OTF data for respective frequencies which are independent from the image capturing device, are prepared for the imaging lens 200.
  • the camera body 100 side converts the OTF data acquired from the imaging lens 200 based on the sensor pitch, the spectral transmittance characteristics of the RGB filters, and the optical LPF characteristics of the image capturing device 418.
  • the camera body side can generate OTF data corresponding to each combination of the imaging lens and camera body. That is, an imaging lens which holds OTF data and a camera body which holds the device characteristics of the image capturing device allow to generate OTF data corresponding to a specific combination of an imaging lens as a new model and an existing camera body (or vice versa) , and recovery filters can be generated.
  • the imaging lens holds OTF data
  • the camera body holds the device characteristics of the image capturing device
  • they can be independently developed.
  • the imaging lens and camera body need only hold their own information, the data amount to be held to generate recovery filters can be suppressed in both the imaging lens and camera body.
  • the need for troublesome operations such as updating of firmware can be obviated for the user.
  • the OTF data are complex numbers, and are two-dimensional data defined over the frequency space in the vertical and horizontal directions. Therefore, even the OTF data corresponding to a single set of device characteristic information require a very large data amount. Hence, the amount of OTF data to be transmitted from the imaging lens 200 to the camera body 100, in other words, the data communication amount between the imaging lens 200 and camera body 100, still becomes very large.
  • the second embodiment will explain an arrangement in which an MPU 431 of an imaging lens 200 calculates the coefficients of recovery filters, and transmits the calculated coefficients to an image processor 425.
  • Fig. 13 is a flowchart for explaining a recovery process according to the second embodiment. Note that the same step numbers in Fig. 13 denote the same processes as in Fig. 8, and a detailed description thereof will not be repeated.
  • the image processor 425 reads out device characteristics of a camera body 100 from a camera characteristic value memory 428 (S721) . Then, the image processor 425 transmits the device characteristics to the imaging lens 200 (S722) . The MPU 431 of the imaging lens 200 receives the device characteristics (S723) . Note that since the device characteristics do not depend on the imaging conditions, the processes in steps S721 to S723 need only be executed once at the time of execution of an initialization process upon power ON after the imaging lens 200 and camera body 100 are connected.
  • upon reception of a notification of an image capturing state from a shutter controller 415 (S724), the MPU 431 acquires lens setting information from a lens controller 407 (S725), and reads out OTF data corresponding to the lens setting information from a lens characteristic value memory 429 (S726). Then, the MPU 431 generates the coefficients of recovery filters by executing the same processing sequence as in Fig. 9 based on the received device characteristics and the readout OTF data (S727), and transmits the coefficients of the recovery filters to the camera body 100 (S728).
  • the image processor 425 of the camera body 100 receives the coefficients of the recovery filters (S729). Then, the image processor 425 executes a developing process (S711), recovery filter process (S712), and output process (S713) in the same manner as in the processing sequence shown in Fig. 8. [0106] According to this arrangement, data to be transmitted from the imaging lens 200 to the camera body 100 are, for example, sets of real value data of three 17 x 17 recovery filters for each image height, and the data communication amount can be greatly reduced compared to transmission of a set of OTF data. [0107] In general, the MPU 431 included in the imaging lens 200 has an arithmetic power inferior to that of the CPU 402 included in the camera body 100.
  • when the imaging lens 200 generates recovery filters, the arithmetic process for the recovery filters requires more time than when the camera body 100 generates them. However, the data communication amount between the imaging lens 200 and camera body 100 can be greatly reduced. [0108] In the example of the above description, all the device characteristics are transmitted to the imaging lens 200, and the imaging lens 200 generates the coefficients of the recovery filters. However, a method of transmitting only the sensor pitch to the imaging lens 200 is also available. Then, a recovery filter generator 430 receives OTF data whose data amount is reduced by sub-sampling, and executes the subsequent processes (application of the spectral transmittance characteristics of RGB filters and optical LPF characteristics, and Fourier transformation). In this manner, although the reduction in the data communication amount is smaller than in the processing sequence shown in Fig. 13, the arithmetic process load on the MPU 431 can be reduced.
  • the recovery filter generator 430 receives, for example, the frequency characteristics data of the three 17 x 17 recovery filters for each image height, and executes the subsequent processes (application of the optical LPF characteristics, and Fourier transformation) .
  • the data communication amount can be reduced as in the processing sequence shown in Fig. 13, and the arithmetic process load on the MPU 431 can be reduced.
  • since the Fourier transformation imposes the heaviest arithmetic process load, when at least the Fourier transformation is excluded from the processes of the MPU 431, the arithmetic process load on the MPU 431 can be significantly reduced.
  • the first embodiment has exemplified the case in which a recovery process is executed by generating recovery filters in the camera body 100.
  • the second embodiment has exemplified the case in which a recovery process is executed by generating recovery filters in the imaging lens 200.
  • the CPU 402 included in the camera body 100 and the MPU 431 included in the imaging lens 200 have limited arithmetic power, and require much time to generate the recovery filters.
  • upon reception of a notification of an image capturing state from a shutter controller 415 (S724), an MPU 431 acquires lens setting information from a lens controller 407 (S731), and reads out OTF data corresponding to the lens setting information from a lens characteristic value memory 429 (S732). Then, the MPU 431 transmits the readout OTF data to a camera body 100 (S733).
  • An image processor 425 of the camera body 100 receives the OTF data (S734), reads out device characteristics of the camera body 100 from a camera characteristic value memory 428 (S735) , and acquires capturing data from the buffer memory 424 (S736) . Then, the image processor 425 saves the OTF data, device characteristics, and capturing data in a file, and outputs the file to a recording medium drive 419 (S737) .
  • a file of a RAW data format stored in the recording medium 419a normally saves lens setting information as information at the time of image capturing together with capturing data.
  • the OTF data received from the imaging lens 200, and the sensor pitch, the spectral transmittance characteristics of RGB filters, and the optical LPF characteristics as the device characteristics of the camera body 100 are saved in a file.
  • OTF data whose data amount is reduced by sub-sampling, the spectral transmittance characteristics of RGB filters, and the optical LPF characteristics may be saved in a file. In this manner, the data size of a file to be stored in the recording medium 419a can be reduced.
  • the first and second embodiments have exemplified the case in which recovery filters are generated, and a recovered image that has undergone a recovery process is stored as a file in the recording medium 419a.
  • the third embodiment has exemplified the case in which a file which saves information required to generate recovery filters together with capturing data is stored in the recording medium 419a.
  • the recovery process is executed based on the processing sequence of the first or second embodiment, and a recovered image that has undergone the recovery process is stored as a file in the recording medium 419a.
  • the imaging mode is the RAW mode
  • capturing data and information required to generate recovery filters may be saved in a file based on the processing sequence of the third embodiment .
  • the fourth embodiment will exemplify a case in which a process of a level according to an imaging mode is executed. That is, when the imaging mode set in a camera body 100 is the JPEG mode, the recovery process is executed using a common recovery filter which is independent from lens setting information at the time of image capturing, and a recovered image is saved in a file in a JPEG format. When the imaging mode is the RAW mode, recovery filters which are generated according to the lens setting information at the time of image capturing are saved in a file together with capturing data.
  • FIGs. 15 and 16 are flowcharts for explaining the processing sequence according to the fourth embodiment. Note that the same step numbers in Figs. 15 and 16 denote the same processes as in Figs. 8 and 14, and a detailed description thereof will not be repeated.
  • a MPU 431 of an imaging lens 200 acquires OTF data required to generate a common recovery filter from a lens characteristic value memory 429 (S741) , and transmits the acquired OTF data to the camera body 100 (S742) .
  • a recovery filter generator 430 of the camera body 100 receives the OTF data (S743) , and acquires device characteristics of the camera body 100 from the camera characteristic value memory 428 (S744) .
  • the recovery filter generator 430 calculates coefficients of the common recovery filter from the received OTF data and the acquired device characteristics (S745) , and transmits the calculated coefficients to the image processor 425 (S746) .
  • the common recovery filter remains unchanged if the combination of an imaging lens 200 and camera body 100 is fixed. Therefore, the processes in steps S741 to S746 need only be executed once at the time of execution of an initialization process upon power ON after the imaging lens 200 and camera body 100 are connected.
  • the common recovery filter generation process is the same as that in Fig. 9.
  • upon reception of a notification of an image capturing state from a CPU 402 (S747), the image processor 425 inquires of the CPU 402 about the set imaging mode (S748). The image processor 425 then determines the imaging mode (S749). If the imaging mode is the JPEG mode, the image processor 425 executes a developing process (S711), recovery filter process (S712), and output process (S713) in the same manner as in the processing sequence shown in Fig. 8. Note that the image processor 425 uses the common recovery filter in the recovery filter process (S712). [0129] On the other hand, if the imaging mode is the RAW mode, the image processor 425 requests the imaging lens 200 to send OTF data (S750). In response to this request, the MPU 431 acquires lens setting information (S731), reads out OTF data (S732), and transmits the OTF data (S733) in the same manner as in the processing sequence in Fig. 14.
  • the image processor 425 receives the OTF data (S734) , acquires device characteristics (S735) , acquires capturing data (S736) , and outputs a file that saves the OTF data, device characteristics, and capturing data (S737) in the same manner as in the processing sequence shown in Fig. 14. [0131] In this manner, according to the fourth embodiment, the processes are selectively executed according to the imaging modes. As a result, when the JPEG mode is set, the recovery process is executed using the common recovery filter.
  • although the recovery precision lowers, the need for the process of generating recovery filters for each image capturing operation is obviated, and the arithmetic process power and arithmetic process time required for the recovery process can be reduced.
  • when the RAW mode is set, a file that saves the OTF data, device characteristics, and capturing data can be output to allow an external apparatus to execute an optimal recovery process.
  • the lens characteristic value memory 429 holds 310 OTF data for each combination of parameters
  • Fig. 17 is a view showing the concept of a data structure held by a lens characteristic value memory 429 according to the fifth embodiment.
  • the lens characteristic value memory 429 has a pointer table 1701 which describes pointers indicating addresses of OTF tables that store OTF groups according to lens setting information.
  • the pointer table 1701 has OTF group addresses indicating addresses of OTF tables (not shown) in a memory for respective combinations of parameters .
  • the fifth embodiment is characterized in that identical or nearly identical OTF tables are shared to reduce the memory size of the lens characteristic value memory 429.
  • identical OTF data can often be used even when the imaging distance is changed.
  • a pointer which points to an identical OTF table is stored for distant landscape image capturing processes from ∞ to 1250 mm.
  • pointers which point to different OTF tables for respective combinations of parameters are stored for close view image capturing processes of 625 mm or less.
  • Fig. 17 shows that an OTF table for a
  • the memory size of the lens characteristic value memory 429 can be reduced by sharing the OTF tables.
  • the memory size of the lens characteristic value memory 429 can be reduced.
  • OTF data are saved in the lens characteristic value memory 429 in correspondence with wavelengths having a step size of 10 nm in the wavelength range from 400 to 700 nm.
  • the step size may be set to be that (e.g., 1 nm) smaller than the spectral transmittance characteristics of RGB filters.
  • although each data amount increases tenfold, if both data have a step size of 1 nm, more precise OTF data can be obtained, and the recovery process precision is improved.
  • the memory size of the lens characteristic value memory 429 can be reduced as a whole.
  • the subsequent calculation amount can be reduced by extracting OTF data (MTF data) within a region sandwiched between Nyquist frequencies.
  • Fig. 18 is a flowchart for explaining the process (S709) for generating recovery filters according to the sixth embodiment.
  • a recovery filter generator 430 sets a calculation mode based on a user's input or imaging conditions (e.g., a setting of a high-speed continuous shot mode) (S1801) .
  • the calculation mode includes a high-precision mode and a high-speed mode. Then, the process branches depending on whether the set calculation mode is the high-precision or high-speed mode (S1802). If the high-precision mode is set, the recovery filter generator 430 loads OTF data up to the sampling frequency from a lens characteristic value memory 429 (S1803). On the other hand, if the high-speed mode is set, the recovery filter generator 430 loads OTF data within a region sandwiched between the Nyquist frequencies from the lens characteristic value memory 429 (S1804). Then, the recovery filter generator 430 generates recovery filters based on the OTF data loaded in step S1803 or S1804 (S1805).
  • a recovery filter can be generated according to a user's demand (whether to recover at high speed or with high precision) and an image capturing state
  • lens setting information includes the types of switching filters. Then, the imaging lens holds OTF data according to respective switching filters in the lens characteristic value memory 429.
  • the lens setting information may include the spectral transmittance characteristics of the imaging lens itself.
  • the products of the spectral transmittance characteristics of RGB filters of the camera body and those of the imaging lens are used in place of the spectral transmittance characteristics of the RGB filters.
  • An infrared cut filter or ultraviolet cut filter may often be laid out on the image capturing device. The spectral transmittance characteristics of these filters can be included in those of RGB filters of the camera body. Furthermore, the spectral transmittance characteristics of the RGB filters of the camera body can be held in consideration of the spectral sensitivity characteristics of the image capturing device.
  • aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s).
  • the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium) .
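The recovery filter generation described above (steps S806-S807 of the first embodiment: apply the optical LPF characteristics to the per-channel OTF, then derive a 17 x 17 real-space filter) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the function name, the Wiener-style formulation with a small constant c, and the NumPy-based processing are all assumptions for illustration, and the OTF is assumed already resampled to the sensor's frequency grid.

```python
import numpy as np

def make_recovery_filter(otf, lpf_response, c=1e-3, size=17):
    """Sketch: build one channel's real-space recovery filter from OTF data."""
    # Apply the optical LPF characteristics to the OTF (cf. step S806).
    h = otf * lpf_response
    # Wiener-style recovery filter in the frequency domain (cf. equation (5)).
    w = np.conj(h) / (np.abs(h) ** 2 + c)
    # Inverse FFT yields the real-space filter; keep the central
    # size x size real taps (cf. the 17 x 17 recovery filters of step S807).
    k = np.fft.fftshift(np.real(np.fft.ifft2(w)))
    cy, cx = k.shape[0] // 2, k.shape[1] // 2
    r = size // 2
    return k[cy - r:cy + r + 1, cx - r:cx + r + 1]
```

For an ideal combination (OTF and LPF response both identically 1) the result is approximately a delta filter, i.e. the recovery leaves the image unchanged, which is a quick sanity check of the construction.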

Abstract

Optical transfer information of an imaging lens is input from the imaging lens, and characteristic information of an image capturing unit of an image capturing apparatus is acquired. The optical transfer information is converted, based on the characteristic information, into optical transfer information that depends on the characteristics of the image capturing unit. A correction filter, which corrects degradation of an image captured via the imaging lens, is generated based on the optical transfer information which depends on the characteristics of the image capturing unit.

Description

DESCRIPTION
IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD, AND DATA PROCESSING APPARATUS AND DATA PROCESSING
METHOD
TECHNICAL FIELD
[0001] The present invention relates to an image processing apparatus and image processing method, and a data processing apparatus and data processing method, which correct a degraded captured image.
BACKGROUND ART
[0002] An image recovery process which recovers an image free from any degradation when an image captured by an image capturing apparatus such as a digital still camera suffers degradation caused by, for example, aberrations is known. As an image recovery algorithm, a method of expressing image degradation characteristics by a point spread function (PSF) and recovering an image free from any degradation based on the PSF is known.
[0003] Japanese Patent Laid-Open No. 62-127976 discloses an invention which corrects a blur by a filtering process having the inverse characteristics of the PSF. Also, Japanese Patent Laid-Open No. 2004-205802 discloses an invention which generates a Wiener filter from the PSF, and recovers a degraded image using the Wiener filter. Furthermore, Japanese Patent Laid-Open No. 2000-020691 discloses an invention which obtains a high-quality recovered image using characteristic information of an image capturing apparatus.
[Principle of Image Recovery]
[0004] Let (x, y) be position coordinates on a frame, o(x, y) be an image free from any degradation (to be referred to as a subject image hereinafter), z(x, y) be an image which is degraded due to out-of-focus, aberrations, camera shake, and so forth (to be referred to as a degraded image hereinafter), and p(x, y) be the point spread function (PSF) of the blur. These three pieces of information satisfy: z(x, y) = o(x, y) * p(x, y) ... (1)
[0005] In equation (1), the symbol "*" represents a convolution operation. Therefore, equation (1) can be rewritten as an integral formula expressed by: z(x, y) = ∫∫ o(x', y') p(x - x', y - y') dx' dy' ... (2)
[0006] The Fourier transform of equation (2) onto the spatial frequency (u, v) domain is computed as: Z(u, v) = O(u, v) · P(u, v) ... (3) where Z(u, v) is the spectrum of z(x, y),
O(u, v) is the spectrum of o(x, y), and P(u, v) is the spectrum of p(x, y). [0007] Note that P(u, v) is the optical transfer function (OTF), that is, the two-dimensional Fourier transform of the PSF, and its absolute value is the modulation transfer function (MTF).
[0008] If p(x, y) as the PSF can be detected by an arbitrary method in addition to the degraded image z(x, y), the spectrum O(u, v) of the subject image can be calculated by computing their spectra and using equation (4), obtained by modifying equation (3). Then, by computing the inverse Fourier transform of the spectrum calculated by equation (4), the subject image o(x, y) can be obtained. O(u, v) = Z(u, v)/P(u, v) ... (4)
[0009] Note that 1/P(u, v) is called an inverse filter.
[0010] The MTF of the degradation often includes a frequency where its value becomes zero. A zero MTF value means the existence of a frequency component which is not transmitted (information is lost) by the degradation. If a frequency where the MTF value becomes zero exists, the subject image cannot be perfectly recovered. Therefore, the inverse filter of the MTF often includes a frequency at which a filter coefficient becomes infinity, and the spectrum value of the subject image becomes indefinite at that frequency. [0011] In order to prevent an inverse filter coefficient from becoming infinity, image recovery often uses a Wiener filter expressed by: P*(u, v)/{|P(u, v)|^2 + c} ... (5) where P*(u, v) is the complex conjugate of P(u, v), and c is a constant having a very small value.
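As a numerical illustration of the chain of equations (1) through (5), not part of the disclosure itself, the sketch below degrades a synthetic image by a PSF and recovers it with the Wiener filter of equation (5). The blur kernel, image size, and value of c are arbitrary choices for the example; the FFT-based multiplication implements a circular convolution, which stands in here for equation (2).

```python
import numpy as np

rng = np.random.default_rng(0)
o = rng.random((32, 32))            # subject image o(x, y)
p = np.zeros((32, 32))
p[:3, :3] = 1.0 / 9.0               # point spread function p(x, y): 3 x 3 box blur

P = np.fft.fft2(p)                  # OTF: two-dimensional Fourier transform of the PSF
Z = np.fft.fft2(o) * P              # equation (3): Z(u, v) = O(u, v) P(u, v)
c = 1e-9                            # small constant of equation (5)
W = np.conj(P) / (np.abs(P) ** 2 + c)   # Wiener filter of equation (5)
o_rec = np.real(np.fft.ifft2(Z * W))    # recovered subject image

# For this mild blur, whose OTF has no exact zeros on the discrete grid,
# the recovery error is tiny.
err = np.max(np.abs(o_rec - o))
```

The constant c keeps the filter finite at frequencies where |P(u, v)| is small; with c = 0 and an OTF that touches zero, the division in equation (4) would blow up exactly as paragraph [0010] describes.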
[0012] In order to recover the subject image from the degraded image, acquisition of an accurate PSF (or OTF, MTF) is desired.
[0013] As is well known, the PSF changes depending on the image height, zoom, stop, and subject position. Therefore, methods of calculating the PSF according to these pieces of imaging information and feeding it back to the recovery process have been proposed. For example, Japanese Patent Laid-Open No. 4-088765 estimates the PSF according to the subject distance, and uses it to recover image degradation. Japanese Patent Laid-Open No. 2000-020691 executes a recovery process by correcting the PSF at the time of use of a flash, focusing attention on the fact that the luminance change of a subject during the shutter open period is large when the flash is used, unlike when the flash is not used (where the luminance change during the shutter open period is uniform).
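Because the PSF (and hence the OTF) varies with these imaging conditions, implementations typically index precomputed data by the lens settings. The sketch below uses hypothetical keys and table names; it also illustrates the table-sharing idea described later for the lens characteristic value memory (the pointer table of Fig. 17), where settings with identical or nearly identical OTF data point to one shared table.

```python
# Hypothetical sketch: OTF tables indexed by (zoom [mm], f-number, distance [mm]).
# Settings whose OTF data are identical or nearly identical share one table
# object, reducing memory use (cf. the pointer table of Fig. 17).
otf_distant = {"name": "OTF-A"}     # shared by distant landscape settings
otf_mid = {"name": "OTF-B"}
otf_close = {"name": "OTF-C"}

INF = float("inf")
pointer_table = {
    (50, 2.8, INF):  otf_distant,
    (50, 2.8, 2500): otf_distant,   # same object: a shared pointer, not a copy
    (50, 2.8, 1250): otf_distant,
    (50, 2.8, 625):  otf_mid,       # close-view settings get distinct tables
    (50, 2.8, 300):  otf_close,
}

def lookup_otf(zoom, f_number, distance):
    """Return the OTF table for the given lens settings."""
    return pointer_table[(zoom, f_number, distance)]
```

Here the entries from infinity down to 1250 mm all reference the very same table object, so only one copy of that OTF data is stored.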
[0014] Recovery process methods are disclosed in various references. In practice, however, these references never discuss which component holds the recovery filter data, what kind of information is to be held, how to hold the data, and how to create a recovery filter using it. In particular, there is no discussion which assumes a single-lens reflex camera using interchangeable imaging lenses and considers a plurality of combinations of imaging lenses and a camera body.
[0015] As a simplest method, recovery process information (recovery filter coefficients, PSF data of the overall image capturing apparatus, etc.) of an overall image capturing apparatus including imaging lenses and a camera body is stored as a database in a camera body or image process software. Then, upon execution of a recovery process, recovery process information according to imaging conditions need only be acquired from the database.
[0016] The aforementioned method is effective for a digital camera having a fixed combination of an imaging lens and camera body. However, in the case of a single-lens reflex camera using interchangeable imaging lenses, pieces of recovery process information have to be held in correspondence with all combinations of the imaging lenses and camera bodies. In this case, the amount of data becomes very large, and it is difficult for each imaging lens or camera body, which has a limited memory size, to hold the recovery process information. The recovery process information is fixed data corresponding to a combination of a certain imaging lens and camera body. For this reason, every time a new model of a camera body or imaging lens appears, recovery process information corresponding to combinations of the new model and existing models has to be created, and the new recovery process information has to be reflected in the existing database. Such maintenance forces users of camera bodies and imaging lenses to perform cumbersome operations.
DISCLOSURE OF INVENTION
[0017] In one aspect, an image processing apparatus for an image capturing apparatus using an interchangeable imaging lens, the image processing apparatus comprising: an input section, configured to input optical transfer information of the imaging lens from the imaging lens; an acquisition section, configured to acquire characteristic information of an image capturing unit of the image capturing apparatus; a converter, configured to convert the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; and a generator, configured to generate a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit.
[0018] In another aspect, an image processing apparatus for an image capturing apparatus using an interchangeable imaging lens, the image processing apparatus comprising: an input section, configured to input optical transfer information of the imaging lens from the imaging lens; an acquisition section, configured to acquire characteristic information of an image capturing unit of the image capturing apparatus; and an output section, configured to output a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens .
[0019] In another aspect, an image processing apparatus for an image capturing apparatus using an interchangeable imaging lens, the image processing apparatus comprising: an input section, configured to input optical transfer information of the imaging lens from the imaging lens; an acquisition section, configured to acquire characteristic information of an image capturing unit of the image capturing apparatus; a converter, configured to convert the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; and a generator, configured to generate a correction filter based on the optical transfer information that depends on the characteristics of the image capturing unit; and an output section, configured to selectively execute, according to an imaging mode of the image capturing apparatus, outputting of image data obtained by correcting degradation of an image captured via the imaging lens using the correction filter, or outputting of a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens.
[0020] In another aspect, a data processing apparatus for an interchangeable imaging lens of an image capturing apparatus, the data processing apparatus comprising: a receiver, configured to receive characteristic information of an image capturing unit of the image capturing apparatus; an acquisition section, configured to acquire optical transfer information according to a lens setting of the imaging lens; a converter, configured to convert the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; a generator, configured to generate a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit; and a transmitter, configured to transmit the correction filter to the image capturing apparatus .
[0021] In another aspect, an image processing method of an image capturing apparatus using an interchangeable imaging lens, the method comprising steps of: inputting optical transfer information of the imaging lens from the imaging lens; acquiring characteristic information of an image capturing unit of the image capturing apparatus; converting the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; and generating a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit .
[0022] In another aspect, an image processing method of an image capturing apparatus using an interchangeable imaging lens, the method comprising the steps of: inputting optical transfer information of the imaging lens from the imaging lens; acquiring characteristic information of an image capturing unit of the image capturing apparatus; and outputting a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens .
[0023] In another aspect, an image processing method of an image capturing apparatus using an interchangeable imaging lens, the method comprising the steps of: inputting optical transfer information of the imaging lens from the imaging lens; acquiring characteristic information of an image capturing unit of the image capturing apparatus; converting the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; generating a correction filter based on the optical transfer information that depends on the characteristics of the image capturing unit; and selectively executing, according to an imaging mode of the image capturing apparatus, outputting of image data obtained by correcting degradation of an image captured via the imaging lens using the correction filter, or outputting of a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens. [0024] In another aspect, a method of a data processing apparatus for an interchangeable imaging lens of an image capturing apparatus, the method comprising the steps of: receiving characteristic information of an image capturing unit of the image capturing apparatus; acquiring optical transfer information according to a lens setting of the imaging lens; converting the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; generating a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit; and transmitting the correction filter to the image capturing apparatus.
[0025] According to these aspects, degradation of an image captured by an image capturing apparatus which uses an interchangeable imaging lens can be corrected. [0026] Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings .
BRIEF DESCRIPTION OF DRAWINGS
[0027] Fig. 1 is a view showing the outer appearance of a digital camera.
[0028] Fig. 2 is a longitudinal sectional view of the digital camera.
[0029] Fig. 3 is a view showing the configuration of an image capturing device.
[0030] Fig. 4 is a block diagram showing an arrangement associated with control, image capturing, and image processes of the digital camera. [0031] Fig. 5 is a block diagram for explaining an arrangement for executing a recovery process. [0032] Fig. 6 is a view showing the concept of a data structure held by a lens characteristic value memory .
[0033] Fig. 7 is a graph showing an example of an OTF of an image height "I" at a certain wavelength.
[0034] Fig. 8 is a flowchart for explaining a recovery process.
[0035] Fig. 9 is a flowchart for explaining the process for generating coefficients of a recovery filter.
[0036] Figs. 10A and 10B are graphs illustrating sub-sampling.
[0037] Fig. 11 is a graph showing an example of spectral transmittance characteristics of RGB filters.
[0038] Fig. 12 is a chart illustrating generation of recovery filters.
[0039] Fig. 13 is a flowchart for explaining a recovery process according to the second embodiment.
[0040] Fig. 14 is a flowchart for explaining the processing sequence according to the third embodiment.
[0041] Figs. 15 and 16 are flowcharts for explaining the processing sequence according to the fourth embodiment .
[0042] Fig. 17 is a view showing the concept of a data structure held by a lens characteristic value memory according to the fifth embodiment.
[0043] Fig. 18 is a flowchart for explaining a process for generating recovery filters according to the sixth embodiment.
BEST MODE FOR CARRYING OUT THE INVENTION
[0044] An image processing apparatus and image processing method, and a data processing apparatus and data processing method according to embodiments of the present invention will be described in detail hereinafter with reference to the drawings.
First Embodiment
[Arrangement of Camera]
[0045] Fig. 1 shows the outer appearance of a digital camera.
[0046] On an upper portion of a camera body 100, a viewfinder eyepiece 111, an auto-exposure (AE) lock button 114, a button 113 used to select auto-focus (AF) points, and a release button 112 used to start an image capturing operation are arranged. Also, an imaging mode selection dial 117, display unit 409, digital dial 411, and the like are arranged.
[0047] The digital dial 411 serves as a multifunction signal input unit used to input a numerical value and to switch an imaging mode together with other operation buttons. The display unit 409 of an LCD panel displays imaging conditions such as a shutter speed, stop, and imaging mode, and other kinds of information. [0048] On the back surface of the camera body 100, a liquid crystal display (LCD) monitor 417, which displays an image captured by the camera, a capturing image, various setting screens, and the like, a switch 121 used to turn on/off display of the LCD monitor 417, cross keys 116, a menu button 124, and the like are arranged. Since the LCD monitor 417 is of transmission type, the user cannot view an image by driving only the LCD monitor 417. For this reason, a backlight is required on the rear surface of the LCD monitor 417, as will be described later.
[0049] The cross keys 116 have four buttons laid out at upper, lower, right, and left positions, and a setting button laid out at the central position, and is used to select and to instruct execution of menu items displayed on the LCD monitor 417.
[0050] The menu button 124 is used to display a menu screen on the LCD monitor 417. For example, when the user wants to select and set an imaging mode, he or she presses the menu button 124, selects a desired imaging mode by operating the upper, lower, right, and left buttons of the cross keys 116, and then presses the setting button while the desired imaging mode is selected, thus completing the setting of the imaging mode. Note that the menu button 124 and cross keys 116 are also used to set an AF mode (to be described later). [0051] Fig. 2 is a longitudinal sectional view of the digital camera. [0052] An imaging lens 200 of an imaging optical system is an interchangeable lens for the camera body 100, and is attachable to and detachable from the camera body via a lens mount 202.
[0053] A mirror 203, which is laid out in an imaging optical path having an imaging optical axis 201 as the center, can be quickly returned between a position where it guides subject light from the imaging lens 200 to a viewfinder optical system (slant position) and an escape position outside the imaging optical path.
[0054] The subject light guided to the viewfinder optical system by the mirror 203 forms an image on a focusing screen 204. The subject light that has been transmitted through the focusing screen 204 passes through a condenser lens 205 and pentagonal roof prism 206, which are arranged to enhance the visibility of a viewfinder, and is guided to an eyepiece lens 208 and photometry sensor 207.
[0055] A first curtain 210 and second curtain 209 form a focal plane shutter (mechanical shutter) , and are opened and closed to expose, for a required period of time, an image capturing device 418 as a charge coupled device (CCD) or CMOS sensor, which is laid out behind these curtains. The image capturing device 418 is held on a printed circuit board 211. Another printed circuit board 215 is laid out behind the printed circuit board 211, and the LCD monitor 417 and a backlight 416 are arranged on the opposite surface of the printed circuit board 215.
[0056] Fig. 3 is a view showing the configuration of the image capturing device 418. The image capturing device 418 is of single plate type, and the layout of color filters has a typical Bayer matrix. [0057] Furthermore, the camera body 100 includes a recording medium 419a on which image data are recorded, and a battery 217 as a portable power supply. Note that the recording medium 419a and battery 217 are detachable from the camera body 100. [0058] Fig. 4 is a block diagram showing an arrangement associated with control, image capturing, and image processes of the digital camera. [0059] A microcomputer (CPU) 402 controls the operations of the overall camera such as processing of image data output from the image capturing device 418 and display control of the LCD monitor 417. [0060] A switch (SW1) 405 is turned on at a half stroke position of the release button 112 (halfway pressing state). When the switch (SW1) 405 is turned on, the camera is ready to capture an image. A switch (SW2) 406 is turned on at a full stroke position of the release button 112 (full pressing state). When the switch (SW2) 406 is turned on, the camera body 100 starts an image capturing operation. [0061] A lens controller 407 communicates with the imaging lens 200 and executes drive control of the imaging lens 200 and that of an aperture in an AF mode. A display controller 408 controls the display unit 409 and a display unit (not shown) inside the viewfinder. A switch sense unit 410 is an interface used to transmit signals output from a large number of switches and keys including the aforementioned digital dial 411 to the CPU 402.
[0062] A flash controller 412 is grounded through an X sync 412a, and executes emission control and light control of an external flash. To a recording medium drive 419, the recording medium 419a such as a hard disk or memory card is attached.
[0063] A distance measuring unit 413 detects a defocus amount with respect to a subject to attain AF control. A photometry unit 414 measures the luminance of a subject, and controls an exposure time. A shutter controller 415 controls the mechanical shutter so as to appropriately expose the image capturing device 418. The LCD monitor 417 and backlight 416 form a display device, as described above.
[0064] An image processor 425 includes a digital signal processor (DSP) and the like. Furthermore, to the CPU 402, an analog-to-digital converter (A/D) 423, a buffer memory 424 used to buffer image data, and the like are connected. [0065] A camera characteristic value memory 428 is a nonvolatile memory which stores various characteristics of the camera body 100. A lens characteristic value memory 429 is a nonvolatile memory which is included in a body of the imaging lens 200, and stores various characteristics of the imaging lens 200.
[0066] A recovery filter generator 430 receives lens characteristic values corresponding to the device settings at the time of image capturing from the lens characteristic value memory 429 when the switch (SW2) 406 is turned on to set an image capturing state, as will be described in detail later. Furthermore, the recovery filter generator 430 reads out camera characteristic values corresponding to the device settings at the time of image capturing from the camera characteristic value memory 428, and generates a recovery filter as a correction filter used to correct a degraded capturing image.
[Image Process]
Arrangement of Recovery Process
[0067] Fig. 5 is a block diagram for explaining an arrangement for executing a recovery process. [0068] When an image capturing state is set, the shutter controller 415 transmits, as lens setting information, an F-value, a distance to a subject (imaging distance) , and a zoom position acquired from the lens controller 407 and the like to a microcontroller (MPU) 431 of the imaging lens 200. The MPU 431 reads out OTF data as optical transfer information of the imaging lens 200 corresponding to the received lens setting information from the lens characteristic value memory 429, and transmits the OTF data to the recovery filter generator 430. [0069] The recovery filter generator 430 reads out device characteristics of the camera body 100 from the camera characteristic value memory 428. The device characteristics include a sensor pitch, optical low- pass filter (LPF) information, spectral transmittance characteristics of RGB filters, and the like of the image capturing device 418.
[0070] The recovery filter generator 430 generates coefficients of recovery filters based on the received OTF data and the readout device characteristics, and transmits the coefficients of the recovery filters to the image processor 425.
[0071] Fig. 6 is a view showing the concept of a data structure held by the lens characteristic value memory 429.
[0072] The lens characteristic value memory 429 has a pointer table 501 which describes pointers indicating addresses of OTF tables that store OTF groups according to lens setting information. Note that as for a zoom position, for example, a zoom range is divided into ten positions, and indices ranging from 0 to 9 are assigned to these positions from the wide-angle end to the telephoto end. Also, as for an imaging distance, a range from the shortest imaging distance to infinity is divided into ten positions. Therefore, the number of pointers described in the pointer table 501 amounts to the number of F-values × 10².
[0073] An OTF table 502 holds an OTF group corresponding to a zoom position "1", F-value = 2.8, and imaging distance = 2500 mm. The OTF group is a set of 310 (= 10 x 31) OTF data obtained by dividing an image height into ten positions (indices 0 to 9) , and also dividing a wavelength range from 400 nm to 700 nm to have a step size of 10 nm. In other words, the number of OTF data used in generation of one recovery filter is 310. That is, the OTF data transmitted by the MPU 431 are a set of OTF data corresponding to all image heights and respective wavelengths according to the lens setting information.
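The memory layout of Fig. 6 can be sketched as a nested lookup table. This is an illustrative Python sketch: the index ranges follow the text, while the F-value list and the key layout are invented examples, not the actual memory format.

```python
# Illustrative sketch of the pointer table 501 / OTF table 502 layout.
# Index ranges follow the text; the F-value list is an invented example.

WAVELENGTHS = list(range(400, 710, 10))    # 31 wavelengths, 400-700 nm, 10 nm step
IMAGE_HEIGHTS = list(range(10))            # image-height indices 0..9
F_VALUES = [1.4, 2.0, 2.8, 4.0, 5.6, 8.0]  # hypothetical F-value steps

def build_otf_group():
    """One OTF table: 10 image heights x 31 wavelengths = 310 OTF data."""
    return {(h, wl): None for h in IMAGE_HEIGHTS for wl in WAVELENGTHS}

# Pointer table: one OTF group per (zoom index, F-value, distance index)
pointer_table = {
    (zoom, f, dist): build_otf_group()
    for zoom in range(10)          # zoom positions 0 (wide) .. 9 (tele)
    for f in F_VALUES
    for dist in range(10)          # imaging-distance indices 0..9
}
```

With ten zoom positions and ten distance positions, the table size is the number of F-values × 10², matching the pointer count stated above.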
[0074] Fig. 7 is a graph showing an example of an OTF of an image height "1" at a certain wavelength. Note that since the OTF is a complex number, Fig. 7 shows an MTF as the absolute value of the OTF. In Fig. 7, the spatial frequency in the horizontal direction (x-direction) in an image is represented by fx, that in the vertical direction (y-direction) is represented by fy, and a unit of the spatial frequency is indicated by line pairs per mm (lp/mm or line pairs/mm). Line pairs serve as an index of a resolution, and represent how many pairs of black and white lines each having an equal width are included per mm.
[0075] Note that when the imaging conditions are sorted out in more detail, OTF data more suitable for the imaging conditions can be obtained, and an optimal recovery filter to the imaging conditions can be generated. However, when the imaging conditions are sorted out in more detail, the OTF data amount increases accordingly, and the memory size of the lens characteristic value memory 429 has to be increased.
Recovery Process
[0076] Fig. 8 is a flowchart for explaining the recovery process.
[0077] Upon reception of a notification indicating the switch (SW2) 406 = ON from the CPU 402, that is, when an image capturing state is set (S701) , the shutter controller 415 acquires lens setting information from the lens controller 407 (S702) . Then, the shutter controller 415 transmits the acquired lens setting information to the imaging lens 200 (S703) . [0078] Upon reception of the lens setting information (S704), the MPU 431 of the imaging lens 200 reads out OTF data corresponding to the lens setting information from the lens characteristic value memory 429 (S705), and transmits the readout OTF data to the camera body 100 (S706) .
[0079] The recovery filter generator 430 of the camera body 100 receives the OTF data (S707), and reads out device characteristics of the camera body 100 from the camera characteristic value memory 428 (S708). Then, as will be described in detail later, the recovery filter generator 430 generates coefficients of recovery filters based on the received OTF data and the readout device characteristics (S709), and transmits the coefficients of the recovery filters to the image processor 425 (S710).
[0080] The image processor 425 applies a developing process such as demosaicing to capturing data read out from the buffer memory 424 (S711) . Note that the capturing data is data before demosaicing (developing process) (to be also referred to as RAW data hereinafter) obtained by converting a signal output from the image capturing device 418 as an image capturing unit into digital data by the A/D 423. Also, note that at least the image capturing device 418 forms an image capturing unit, but a combination of the image capturing device 418 and A/D 423 may often be called an image capturing unit. Alternatively, the camera body 100 may often be called an image capturing unit. [0081] The image processor 425 corrects a degraded image by applying a recovery filter process to the image data that has undergone the developing process using the received coefficients of the recovery filters (S712) . Then, the image processor 425 outputs the image data after the recovery filter process to the display controller 408 or recording medium drive 419 (S713) .
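The recovery filter process of step S712 amounts to filtering each RGB plane with its own kernel. A minimal pure-Python sketch follows; the function names and the toy 3 x 3 identity kernel are illustrative (the embodiment's filters are 17 x 17 and generated from OTF data).

```python
# Minimal sketch of the recovery-filter step (S712): each RGB plane is
# filtered with its own 2D kernel. Correlation-style indexing; for the
# symmetric kernels used here the distinction from convolution is moot.

def convolve2d(plane, kernel):
    """Same-size 2D filtering with zero padding outside the image."""
    h, w = len(plane), len(plane[0])
    k = len(kernel)
    r = k // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for j in range(k):
                for i in range(k):
                    yy, xx = y + j - r, x + i - r
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += plane[yy][xx] * kernel[j][i]
            out[y][x] = acc
    return out

def recover(rgb_planes, filters):
    """Apply one recovery filter per color component."""
    return {c: convolve2d(rgb_planes[c], filters[c]) for c in ('R', 'G', 'B')}

# An identity kernel leaves each plane unchanged
identity = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
plane = [[1.0, 2.0], [3.0, 4.0]]
out = recover({'R': plane, 'G': plane, 'B': plane},
              {'R': identity, 'G': identity, 'B': identity})
```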
Generation of Recovery Filter
[0082] Fig. 9 is a flowchart for explaining the process (S709) for generating the coefficients of the recovery filters.
[0083] The recovery filter generator 430 acquires a unit of the spatial frequency of OTF data (S801). Note that the unit of the spatial frequency in this embodiment is lp/mm.
[0084] The recovery filter generator 430 then acquires a sensor pitch of the image capturing device 418 from the device characteristics (S802). Assume that the sensor pitch is 5.0 μm in this embodiment. [0085] The recovery filter generator 430 converts the unit of the spatial frequency of the OTF data into a unit lp/pixel of the sensor pitch base (pixel pitch base) (S803) using the sensor pitch by:
fx [lp/pixel] = fx [lp/mm] x 5.0 [μm]
fy [lp/pixel] = fy [lp/mm] x 5.0 [μm] ... (6)
[0086] By converting the unit of the spatial frequency into lp/pixel, the Nyquist frequency is unconditionally expressed by 0.5 lp/pixel, and the sampling frequency is expressed by 1.0 lp/pixel. When the sensor pitch is 5.0 μm, the Nyquist frequency is 100 lp/mm. If the sensor pitch is 2.0 μm, the Nyquist frequency is 250 lp/mm.
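Equation (6) and the pitch/Nyquist relation can be made concrete with two small helpers. This is an illustrative sketch, not code from the apparatus; the function names are invented.

```python
# Equation (6) and the Nyquist relation as helpers (illustrative only).

def to_lp_per_pixel(f_lp_per_mm, sensor_pitch_um):
    """Convert a spatial frequency from lp/mm to the pixel-pitch base."""
    return f_lp_per_mm * sensor_pitch_um * 1e-3  # um -> mm

def nyquist_lp_per_mm(sensor_pitch_um):
    """Nyquist is always 0.5 lp/pixel on the pitch base."""
    return 0.5 / (sensor_pitch_um * 1e-3)
```

For a 5.0 μm pitch, 100 lp/mm maps to exactly the 0.5 lp/pixel Nyquist limit, matching the text.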
[0087] The recovery filter generator 430 sub-samples OTF data equal to or lower than the Nyquist frequency of those converted into a sensor pitch-based frequency space using the recovery filter size to be finally generated (S804).
[0088] Figs. 10A and 10B are graphs illustrating sub-sampling. Fig. 10A is a graph showing the distribution of MTFs (the absolute values of OTFs) in the frequency fx direction at a frequency fy = 0 of OTF data at a certain wavelength. As described above, when the sensor pitch is 5 μm, the Nyquist frequency is 100 lp/mm (0.5 lp/pixel), and MTFs within a region sandwiched between the broken lines shown in Fig. 10A are extracted. Fig. 10B is a graph showing the distribution of MTFs equal to or lower than the Nyquist frequency. Assuming that the MTFs are distributed up to 500 lp/mm, as shown in Fig. 10A, when MTFs equal to or lower than 100 lp/mm are extracted, the data amount is reduced to 1/25, and the subsequent calculation amount can be greatly reduced, as described by:
(100 + 100)²/(500 + 500)² = 1/25 ... (7)
[0089] The recovery filter generator 430 sub-samples the extracted OTF data in correspondence with the recovery filter size. For example, assuming that the recovery filter size is 17 x 17, the recovery filter generator 430 samples OTF data every 0.0625 lp/pixel, as indicated by arrows T in Fig. 10B. [0090] In this way, upon completion of step S804, 17 x 17 OTF data corresponding to wavelengths = 400, 410, 420, ..., 690, and 700 nm can be obtained. Note that the recovery filter size may be a predetermined fixed value or may be decided by user's designation or depending on other parameters (the number of pixels, ISO speed, etc.) set at the time of image capturing. [0091] The recovery filter generator 430 acquires the spectral transmittance characteristics of RGB filters from the device characteristics, and multiplies the 17 x 17 wavelength-dependent OTF data by the acquired characteristics, thereby converting them into OTF data for respective RGB components (S805). [0092] Fig. 11 is a graph showing an example of the spectral transmittance characteristics of the RGB filters. As shown in Fig. 11, OTF data of respective wavelengths are multiplied by the transmittances of the corresponding wavelengths of the filters of respective components, and the products are normalized by their sum totals, thereby obtaining OTF data for respective RGB components.
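The per-component conversion of step S805 is a transmittance-weighted average over wavelength. An illustrative sketch with a toy three-wavelength example; the wavelengths, MTF samples, and green-like transmittance values are invented.

```python
# Illustrative weighted average over wavelength for one color component,
# as in step S805 / [0092]. All numeric values below are invented.

def otf_per_component(otf_by_wavelength, transmittance):
    """Normalize the transmittance-weighted OTF sum over wavelength."""
    total = sum(transmittance[wl] for wl in otf_by_wavelength)
    weighted = sum(otf_by_wavelength[wl] * transmittance[wl]
                   for wl in otf_by_wavelength)
    return weighted / total

otf = {500: 0.9, 550: 0.8, 600: 0.7}      # MTF samples at one frequency
t_green = {500: 0.3, 550: 1.0, 600: 0.3}  # invented transmittance curve
g = otf_per_component(otf, t_green)
```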
[0093] The recovery filter generator 430 acquires the optical LPF information from the device characteristics, and applies low-pass filter characteristics indicated by the optical LPF information to the OTF data for respective RGB components (S806) . In this manner, OTF data for respective RGB components upon combination of the imaging lens 200 and camera body 100 are calculated. [0094] The recovery filter generator 430 calculates the coefficients of recovery filters for respective RGB components using the OTF data obtained in step S806 (S807) . Note that since the recovery filter generation method has been described above, a detailed description thereof will not be repeated. In this way, three filters each including 17 x 17 real values are generated as the recovery filters of a real space.
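The filter-generation formula itself is described earlier in the specification and not restated here. One common construction for such recovery coefficients is a Wiener-type inverse, H*(f)/(|H(f)|² + k), followed by an inverse DFT; the following is only a hedged 1D sketch of that idea (the embodiment's filters are 2D, 17 x 17 real values).

```python
# Hedged 1D sketch: Wiener-type inverse of an OTF, then an inverse DFT to
# obtain real-space coefficients. Not the specification's exact formula.

import cmath

def wiener_inverse(otf, k=0.01):
    """Frequency response of a Wiener-type recovery filter."""
    return [h.conjugate() / (abs(h) ** 2 + k) for h in otf]

def idft(spectrum):
    """Naive inverse DFT returning real-space coefficients."""
    n = len(spectrum)
    return [sum(spectrum[f] * cmath.exp(2j * cmath.pi * f * x / n)
                for f in range(n)).real / n
            for x in range(n)]

# A flat OTF (no degradation, k = 0) recovers an identity (delta) filter
coeffs = idft(wiener_inverse([1.0 + 0j] * 8, k=0.0))
```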
[0095] Fig. 12 is a chart illustrating generation of the recovery filters, and shows the data flow until the recovery filters of the real space for respective RGB components are generated from OTF data. As described above, OTF data for respective frequencies, which are independent from the image capturing device, are prepared for the imaging lens 200. Then, the camera body 100 side converts the OTF data acquired from the imaging lens 200 based on the sensor pitch, the spectral transmittance characteristics of the RGB filters, and the optical LPF characteristics of the image capturing device 418.
[0096] In this way, the camera body side can generate OTF data corresponding to each combination of the imaging lens and camera body. That is, an imaging lens which holds OTF data and a camera body which holds the device characteristics of the image capturing device make it possible to generate OTF data corresponding to a specific combination of an imaging lens as a new model and an existing camera body (or vice versa), and recovery filters can be generated.
[0097] Of course, since the imaging lens holds OTF data, and the camera body holds the device characteristics of the image capturing device, they can be independently developed. Also, since the imaging lens and camera body need only hold their own information, the data amount to be held to generate recovery filters can be suppressed in both the imaging lens and camera body. Furthermore, when an imaging lens or camera body as a new model is used, the need for troublesome operations such as updating of firmware can be obviated for the user.
Second Embodiment
[0098] An image process according to the second embodiment of the present invention will be described below. Note that the same reference numerals in the second embodiment denote the same components as in the first embodiment, and a detailed description thereof will not be repeated.
[0099] As described above, in order to cope with the sensor pitch = 5 μm, data of spatial frequencies up to the Nyquist frequency = 100 lp/mm need only be included. When a finer sensor pitch is set, data of spatial frequencies up to 250 lp/mm have to be included for 2.0 μm, and those up to 500 lp/mm have to be included for 1 μm.
[0100] Furthermore, the OTF data are complex numbers, and are two-dimensional data defined by the frequency space in the vertical and horizontal directions. Therefore, even only OTF data corresponding to certain device characteristic information require a very large data amount. Hence, the data amount of OTF data to be transmitted from the imaging lens 200 to the camera body 100, in other words, the data communication amount between the imaging lens 200 and camera body 100 becomes still very large. [0101] Thus, the second embodiment will explain an arrangement in which a MPU 431 of an imaging lens 200 calculates the coefficients of recovery filters, and transmits the calculated coefficients to an image processor 425. Note that the MPU 431 such as a one- chip microcontroller executes data processes to be described later according to programs stored in its internal read-only memory (ROM) using its internal random access memory (RAM) as a work memory. [0102] Fig. 13 is a flowchart for explaining a recovery process according to the second embodiment. Note that the same step numbers in Fig. 13 denote the same processes as in Fig. 8, and a detailed description thereof will not be repeated.
[0103] The image processor 425 reads out device characteristics of a camera body 100 from a camera characteristic value memory 428 (S721) . Then, the image processor 425 transmits the device characteristics to the imaging lens 200 (S722) . The MPU 431 of the imaging lens 200 receives the device characteristics (S723) . Note that since the device characteristics do not depend on the imaging conditions, the processes in steps S721 to S723 need only be executed once at the time of execution of an initialization process upon power ON after the imaging lens 200 and camera body 100 are connected. [0104] Upon reception of a notification of an image capturing state from a shutter controller 415 (S724), the MPU 431 acquires lens setting information from a lens controller 407 (S725) , and reads out OTF data corresponding to the lens setting information from a lens characteristic value memory 429 (S726) . Then, the MPU 431 generates the coefficients of recovery filters by executing the same processing sequence as in Fig. 9 based on the received device characteristics and the readout OTF data (S727), and transmits the coefficients of the recovery filters to the camera body 100 (S728) .
[0105] The image processor 425 of the camera body 100 receives the coefficients of the recovery filters (S729) . Then, the image processor 425 executes a developing process (S711) , recovery filter process (S712), and output process (S713) in the same manner as in the processing sequence shown in Fig. 8. [0106] According to this arrangement, data to be transmitted from the imaging lens 200 to the camera body 100 are, for example, sets of real value data of three 17 x 17 recovery filters for each image height, and the data communication amount can be greatly reduced compared to transmission of a set of OTF data. [0107] In general, the MPU 431 included in the imaging lens 200 has an arithmetic power inferior to the CPU 402 included in the camera body 100. For this reason, when the imaging lens 200 generates recovery filters, an arithmetic process of the recovery filters requires more time than the case in which the camera body 100 generates recovery filters. However, the data communication amount between the imaging lens 200 and camera body 100 can be greatly reduced. [0108] In the example of the above description, all the device characteristics are transmitted to the imaging lens 200, and the imaging lens 200 generates the coefficients of the recovery filters. However, a method of transmitting only the sensor pitch to the imaging lens 200 is available. Then, a recovery filter generator 430 receives OTF data whose data amount is reduced after sub-sampling, and executes subsequent processes (application of the spectral transmittance characteristics of RGB filters and optical LPF characteristics, and Fourier transformation) . In this manner, although the data communication amount is reduced less than the processing sequence shown in Fig. 13, the arithmetic process load on the MPU 431 can be reduced.
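The communication savings claimed above can be made concrete with rough byte counts. The per-entry OTF grid size below is an assumption for illustration only; the filter sizes follow the text (three 17 x 17 real-valued filters per image height, ten image heights).

```python
# Rough, illustrative byte counts for the two transmission schemes.
# The per-entry OTF grid size (200 x 200) is an assumption.

BYTES_PER_VALUE = 4
GRID = 200 * 200                                   # assumed samples per OTF entry

otf_set_bytes = 310 * GRID * 2 * BYTES_PER_VALUE   # 310 entries, complex data
filter_bytes = 10 * 3 * 17 * 17 * BYTES_PER_VALUE  # 10 heights, RGB, 17 x 17
```

Under these assumptions the filter coefficients are several orders of magnitude smaller than the full OTF set, which is the point of generating the filters on the lens side.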
[0109] Likewise, a method of transmitting the sensor pitch and the spectral transmittance characteristics of RGB filters to the imaging lens 200 is also available. Then, the recovery filter generator 430 receives, for example, the frequency characteristics data of the three 17 x 17 recovery filters for each image height, and executes the subsequent processes (application of the optical LPF characteristics, and Fourier transformation). In this way, the data communication amount can be reduced as in the processing sequence shown in Fig. 13, and the arithmetic process load on the MPU 431 can be reduced. [0110] In particular, since the Fourier transformation requires the heaviest arithmetic process load, when at least the Fourier transformation is excluded from the processes of the MPU 431, the arithmetic process load on the MPU 431 can be significantly reduced.
Third Embodiment
[0111] An image process according to the third embodiment of the present invention will be described below. Note that the same reference numerals in the third embodiment denote the same components as in the first and second embodiments, and a detailed description thereof will not be repeated. [0112] The first embodiment has exemplified the case in which a recovery process is executed by generating recovery filters in the camera body 100. The second embodiment has exemplified the case in which a recovery process is executed by generating recovery filters in the imaging lens 200. However, in general, the CPU 402 included in the camera body 100 and the MPU 431 included in the imaging lens 200 have limited arithmetic power, and require much time to generate the recovery filters. [0113] The third embodiment will exemplify a case in which information required to generate recovery filters and capturing data are saved in a file, and software which runs on an external computer generates the recovery filters and executes a recovery process. [0114] Fig. 14 is a flowchart for explaining the processing sequence according to the third embodiment. [0115] Upon reception of a notification of an image capturing state from a shutter controller 415 (S724), a MPU 431 acquires lens setting information from a lens controller 407 (S731), and reads out OTF data corresponding to the lens setting information from a lens characteristic value memory 429 (S732). Then, the MPU 431 transmits the readout OTF data to a camera body 100 (S733).
[0116] An image processor 425 of the camera body 100 receives the OTF data (S734), reads out device characteristics of the camera body 100 from a camera characteristic value memory 428 (S735) , and acquires capturing data from the buffer memory 424 (S736) . Then, the image processor 425 saves the OTF data, device characteristics, and capturing data in a file, and outputs the file to a recording medium drive 419 (S737) . [0117] A file of a RAW data format stored in the recording medium 419a normally saves lens setting information as information at the time of image capturing together with capturing data. Therefore, in a file to be output in this embodiment, new areas for saving the device characteristic values of the camera body 100 and OTF data are added to existing areas for saving RAW data and lens setting information. [0118] The recovery filter generation process and recovery process by the software which runs on the external computer are the same as those in the aforementioned methods.
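The extended file of step S737 is not specified byte-for-byte in this passage; the following is a hypothetical tagged container showing the existing areas (RAW data, lens settings) plus the two new areas. The tag names, length-prefixed layout, and field names are all invented for illustration.

```python
# Hypothetical tagged container for step S737. Tags and layout are invented;
# each section is tag (4 bytes) + little-endian length + payload.

import json
import struct

def write_container(raw_bytes, lens_settings, device_chars, otf_blob):
    sections = [
        (b'RAW0', raw_bytes),                           # existing area
        (b'LENS', json.dumps(lens_settings).encode()),  # existing area
        (b'DEVC', json.dumps(device_chars).encode()),   # new: device values
        (b'OTFD', otf_blob),                            # new: OTF data
    ]
    out = b''
    for tag, payload in sections:
        out += tag + struct.pack('<I', len(payload)) + payload
    return out

blob = write_container(b'\x00' * 16, {'f': 2.8, 'zoom': 1},
                       {'pitch_um': 5.0}, b'\x01' * 8)
```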
[0119] In the example of the above description, the OTF data received from the imaging lens 200, and the sensor pitch, the spectral transmittance characteristics of RGB filters, and the optical LPF characteristics as the device characteristics of the camera body 100 are saved in a file. Alternatively, OTF data whose data amount is reduced after sub- sampling, the spectral transmittance characteristics of RGB filters, and the optical LPF characteristics may be saved in a file. In this manner, the data size of a file to be stored in the recording medium 419a can be reduced.
[Modification]
[0120] The first and second embodiments have exemplified the case in which recovery filters are generated, and a recovered image that has undergone a recovery process is stored as a file in the recording medium 419a. The third embodiment has exemplified the case in which a file which saves information required to generate recovery filters together with capturing data is stored in the recording medium 419a.
[0121] When an imaging mode set in the camera body 100 is the JPEG mode, the recovery process is executed based on the processing sequence of the first or second embodiment, and a recovered image that has undergone the recovery process is stored as a file in the recording medium 419a. On the other hand, when the imaging mode is the RAW mode, capturing data and information required to generate recovery filters may be saved in a file based on the processing sequence of the third embodiment.
Fourth Embodiment
[0122] An image process according to the fourth embodiment of the present invention will be described below. Note that the same reference numerals in the fourth embodiment denote the same components as in the first to third embodiments, and a detailed description thereof will not be repeated.
[0123] The fourth embodiment will exemplify a case in which a process of a level according to an imaging mode is executed. That is, when the imaging mode set in a camera body 100 is the JPEG mode, the recovery process is executed using a common recovery filter which is independent from lens setting information at the time of image capturing, and a recovered image is saved in a file in a JPEG format. When the imaging mode is the RAW mode, recovery filters which are generated according to the lens setting information at the time of image capturing are saved in a file together with capturing data.
[0124] Figs. 15 and 16 are flowcharts for explaining the processing sequence according to the fourth embodiment. Note that the same step numbers in Figs. 15 and 16 denote the same processes as in Figs. 8 and 14, and a detailed description thereof will not be repeated.
[0125] A MPU 431 of an imaging lens 200 acquires OTF data required to generate a common recovery filter from a lens characteristic value memory 429 (S741) , and transmits the acquired OTF data to the camera body 100 (S742) .
[0126] A recovery filter generator 430 of the camera body 100 receives the OTF data (S743) , and acquires device characteristics of the camera body 100 from the camera characteristic value memory 428 (S744) . The recovery filter generator 430 calculates coefficients of the common recovery filter from the received OTF data and the acquired device characteristics (S745) , and transmits the calculated coefficients to the image processor 425 (S746) . [0127] Note that the common recovery filter remains unchanged if the combination of an imaging lens 200 and camera body 100 is fixed. Therefore, the processes in steps S741 to S746 need only be executed once at the time of execution of an initialization process upon power ON after the imaging lens 200 and camera body 100 are connected. The common recovery filter generation process is the same as that in Fig. 9. [0128] Upon reception of a notification of an image capturing state from a CPU 402 (S747), the image processor 425 inquires the CPU 402 of the set imaging mode (S748) . The image processor 425 then determines the imaging mode (S749) . If the imaging mode is the JPEG mode, the image processor 425 executes a developing process (S711) , recovery filter process (S712), and output process (S713) in the same manner as in the processing sequence shown in Fig. 8. Note that the image processor 425 uses the common recovery filter in the recovery filter process (S712) . [0129] On the other hand, if the imaging mode is the RAW mode, the image processor 425 requests the imaging lens 200 to send OTF data (S750) . In response to this request, the MPU 431 acquires lens setting information (S731) , reads out OTF data (S732), and transmits the OTF data (S733) in the same manner as in the processing sequence in Fig. 14.
[0130] The image processor 425 receives the OTF data (S734), acquires device characteristics (S735), acquires capturing data (S736), and outputs a file that saves the OTF data, device characteristics, and capturing data (S737) in the same manner as in the processing sequence shown in Fig. 14.
[0131] In this manner, according to the fourth embodiment, the processes are selectively executed according to the imaging mode. As a result, when the JPEG mode is set, the recovery process is executed using the common recovery filter. In this case, although the recovery precision is lower, the process of generating recovery filters for each image capture can be omitted, and the arithmetic processing power and arithmetic processing time required for the recovery process can be reduced. On the other hand, when the RAW mode is set, a file that saves the OTF data, device characteristics, and capturing data can be output to allow an external apparatus to execute an optimal recovery process.
[0132] Of course, when a JPEG + RAW mode is set as the imaging mode, a file can be output that saves the JPEG data processed using the common recovery filter together with the OTF data, device characteristics, and capturing data.
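The mode-dependent branching of paragraphs [0128] to [0132] can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; all names (process_capture, develop, apply_filter) are hypothetical, and the "developing" and "filtering" stand-ins are trivial placeholders for the real processes S711 and S712.

```python
def develop(raw):
    """Stand-in for the developing process (S711)."""
    return [x / 255.0 for x in raw]

def apply_filter(img, gain):
    """Stand-in for the recovery filter process (S712)."""
    return [p * gain for p in img]

def process_capture(mode, raw_data, common_filter, otf_data, device_chars):
    """Dispatch the output according to the imaging mode.

    JPEG mode: develop and apply the pre-computed common recovery filter.
    RAW mode: skip in-camera recovery and bundle the OTF data, device
    characteristics, and capturing data into one file (here, a dict)
    so an external apparatus can run an optimal recovery process.
    JPEG + RAW: do both.
    """
    outputs = {}
    if mode in ("JPEG", "JPEG+RAW"):
        outputs["jpeg"] = apply_filter(develop(raw_data), common_filter)
    if mode in ("RAW", "JPEG+RAW"):
        outputs["raw_file"] = {
            "otf": otf_data,
            "device_characteristics": device_chars,
            "capturing_data": raw_data,
        }
    return outputs
```

For example, `process_capture("JPEG+RAW", data, f, otf, dev)` yields both a filtered JPEG image and the recovery bundle.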
Fifth Embodiment
[0133] An image process according to the fifth embodiment of the present invention will be described below. Note that the same reference numerals in the fifth embodiment denote the same components as in the first to fourth embodiments, and a detailed description thereof will not be repeated.
[0134] In the example of Fig. 6 in the first embodiment, the lens characteristic value memory 429 holds 310 OTF data sets, one for each combination of parameters (each combination of the zoom position, F-value, and imaging distance), and therefore requires a very large storage capacity.
[0135] Fig. 17 is a view showing the concept of a data structure held by a lens characteristic value memory 429 according to the fifth embodiment.
[0136] The lens characteristic value memory 429 according to the fifth embodiment has a pointer table 1701 that describes pointers indicating the addresses of OTF tables storing OTF groups according to lens setting information. The pointer table 1701 holds OTF group addresses indicating the addresses of OTF tables (not shown) in memory for respective combinations of parameters.
[0137] The fifth embodiment is characterized in that identical or nearly identical OTF tables are shared to reduce the memory size of the lens characteristic value memory 429. For example, when an image of a distant landscape is captured (when the imaging distance is large), identical OTF data can often be used even when the imaging distance changes. In the data structure example shown in Fig. 17, in the case of a "zoom position '0' and F-value = 2.8", a pointer that points to an identical OTF table is stored for distant-landscape imaging distances from ∞ to 1250 mm. On the other hand, pointers that point to different OTF tables for respective combinations of parameters are stored for close-view imaging distances of 625 mm or less.
[0138] Also, Fig. 17 shows that the OTF table for a "zoom position '0', F-value = 3.5, and imaging distance = 400 mm" is the same as that for a "zoom position '1', F-value = 2.8, and imaging distance = 625 mm".
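The pointer-table idea of Fig. 17 can be sketched as follows. This is a minimal Python illustration, not the patented data layout: a Python dict and object references stand in for the pointer table and OTF table addresses, and all numeric OTF values are invented placeholders.

```python
# Placeholder OTF tables; in the device these would live at distinct
# addresses in the lens characteristic value memory 429.
otf_far = {"mtf": [1.0, 0.8, 0.5]}      # shared by all distant settings
otf_625 = {"mtf": [1.0, 0.7, 0.3]}
otf_400 = {"mtf": [1.0, 0.6, 0.2]}
otf_shared = {"mtf": [1.0, 0.65, 0.25]}  # shared across two combinations

# Pointer table 1701: (zoom position, F-value, imaging distance in mm)
# maps to a *reference* to an OTF table, so identical tables are stored
# only once.
pointer_table = {
    # zoom 0, F2.8: one table covers distant landscapes from inf to 1250 mm
    (0, 2.8, float("inf")): otf_far,
    (0, 2.8, 2500): otf_far,
    (0, 2.8, 1250): otf_far,
    # close view: distinct tables per combination
    (0, 2.8, 625): otf_625,
    (0, 2.8, 400): otf_400,
    # the [0138] example: two different combinations share one table
    (0, 3.5, 400): otf_shared,
    (1, 2.8, 625): otf_shared,
}

# Seven parameter combinations are served by only four stored tables.
distinct = {id(t) for t in pointer_table.values()}
```

Lookup is then a single dict access, e.g. `pointer_table[(0, 2.8, 1250)]`, returning the shared table without any copy.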
[0139] In order to implement the aforementioned sharing of OTF tables, the following processes are executed.
[0140] (1) OTF tables corresponding to all combinations of parameters are generated.
[0141] (2) The number of OTF tables that can be stored in the lens characteristic value memory 429 is calculated based on the memory size of the lens characteristic value memory 429.
[0142] (3) As many representative OTF tables as the number calculated in process (2) are selected from those generated in process (1).
[0143] (4) The OTF tables selected in process (3) are stored in the lens characteristic value memory 429.
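Processes (1) to (4) above can be sketched as follows. This is a hedged illustration only: the patent does not specify how representatives are chosen or how similarity is measured, so the uniform subsampling in process (3) and the Euclidean nearest-table mapping are invented stand-ins, and all names are hypothetical.

```python
import math

def build_shared_tables(all_tables, table_size_bytes, memory_size_bytes):
    """Given OTF tables generated for all parameter combinations
    (process 1), compute how many tables fit in the lens characteristic
    value memory (process 2), select that many representatives
    (process 3), and map every combination to its most similar
    representative for storage (process 4)."""
    params = sorted(all_tables)
    n_storable = memory_size_bytes // table_size_bytes        # process (2)
    step = max(1, math.ceil(len(params) / n_storable))
    reps = [all_tables[p] for p in params[::step]][:n_storable]  # process (3)

    def nearest(table):
        # Illustrative similarity: squared distance between MTF samples.
        return min(reps, key=lambda r: sum((a - b) ** 2
                                           for a, b in zip(table, r)))

    # process (4): every combination points at a stored representative
    return {p: nearest(all_tables[p]) for p in params}
```

The returned mapping plays the role of the pointer table: many parameter combinations reference the same stored table object.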
[0144] In the fifth embodiment, the memory size of the lens characteristic value memory 429 can be reduced by sharing the OTF tables.
[0145] In addition, the memory size of the lens characteristic value memory 429 can be reduced by changing the wavelength step size from one OTF table to another. In the example described in the first embodiment, OTF data are saved in the lens characteristic value memory 429 at wavelengths with a step size of 10 nm over the wavelength range from 400 to 700 nm. However, the step size may be set smaller (e.g., 1 nm) than that of the spectral transmittance characteristics of the RGB filters. In this case, although the data amount increases tenfold, if both data sets have a step size of 1 nm, more precise OTF data can be obtained and the precision of the recovery process is improved. Accordingly, by setting coarse step sizes for OTF tables whose data change only slightly and fine step sizes for OTF tables whose data change strongly, the memory size of the lens characteristic value memory 429 can be reduced as a whole.
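The per-table step-size choice can be sketched as follows. The patent only states the principle (coarse steps for small change amounts, fine steps for large ones); the concrete threshold criterion below, the step values, and the function names are illustrative assumptions.

```python
def choose_step(otf_fine, coarse_nm=10, fine_nm=1, threshold=0.05):
    """Pick the wavelength step for one OTF table from how strongly its
    values change between neighbouring 1 nm samples (hypothetical
    criterion): tables that vary little keep the coarse 10 nm step,
    strongly varying tables keep the fine 1 nm step."""
    max_delta = max(abs(b - a) for a, b in zip(otf_fine, otf_fine[1:]))
    return fine_nm if max_delta > threshold else coarse_nm

def subsample(otf_fine, step_nm):
    """Keep every step_nm-th sample of a 1 nm-spaced table, reducing
    the stored data amount by roughly a factor of step_nm."""
    return otf_fine[::step_nm]
```

A smoothly varying table thus shrinks to 31 samples over 400 to 700 nm, while a strongly varying table keeps all 301.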
Sixth Embodiment
[0146] An image process according to the sixth embodiment of the present invention will be described below. Note that the same reference numerals in the sixth embodiment denote the same components as in the first to fifth embodiments, and a detailed description thereof will not be repeated.
[0147] In the description of the example of Figs. 10A and 10B in the first embodiment, the subsequent calculation amount is reduced by extracting OTF data (MTF data) within the region sandwiched between the Nyquist frequencies.
[0148] However, in order to generate a recovery filter with higher precision, it is sometimes preferable to use OTF data at frequencies equal to or higher than the Nyquist frequency. For example, by acquiring OTF data up to a frequency twice the Nyquist frequency (the sampling frequency), a recovery filter can be designed in consideration of the aliasing effect of frequency components that exceed the Nyquist frequency.
[0149] Fig. 18 is a flowchart for explaining the process (S709) for generating recovery filters according to the sixth embodiment.
[0150] A recovery filter generator 430 sets a calculation mode based on a user's input or imaging conditions (e.g., whether a high-speed continuous shooting mode is set) (S1801). The calculation mode is either a high-precision mode or a high-speed mode. The process then branches depending on whether the set calculation mode is the high-precision mode or the high-speed mode (S1802). If the high-precision mode is set, the recovery filter generator 430 loads OTF data up to the sampling frequency from a lens characteristic value memory 429 (S1803). On the other hand, if the high-speed mode is set, the recovery filter generator 430 loads OTF data within the region sandwiched between the Nyquist frequencies from the lens characteristic value memory 429 (S1804). The recovery filter generator 430 then generates recovery filters based on the OTF data loaded in step S1803 or S1804 (S1805).
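The branch of steps S1801 to S1804 can be sketched as follows; the dict-based representation of OTF data versus frequency and the function name are illustrative assumptions, not the patented format.

```python
def load_otf_for_mode(otf_by_freq, nyquist, calc_mode):
    """In the high-precision mode, load OTF data up to the sampling
    frequency (twice the Nyquist frequency) so that aliasing of
    super-Nyquist components can be taken into account; in the
    high-speed mode, load only the band sandwiched between -Nyquist
    and +Nyquist. `otf_by_freq` maps a signed spatial frequency
    (arbitrary units) to an OTF value."""
    limit = 2 * nyquist if calc_mode == "high-precision" else nyquist
    return {f: v for f, v in otf_by_freq.items() if abs(f) <= limit}
```

The high-speed mode thus processes far fewer samples at the cost of ignoring frequencies that fold back below Nyquist.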
[0151] With the aforementioned processes, a recovery filter can be generated in accordance with a user's demand (whether to recover at high speed or with high precision) and the image capturing state (whether a high-speed continuous shooting mode is set) upon execution of the recovery process.
Modification of Embodiments
[0152] In the case of imaging lenses, such as super-telephoto lenses, that incorporate switching filters, the lens setting information includes the types of the switching filters. The imaging lens then holds OTF data according to the respective switching filters in the lens characteristic value memory 429.
[0153] Furthermore, the lens setting information may include the spectral transmittance characteristics of the imaging lens itself. In this case, upon generation of recovery filters, the products of the spectral transmittance characteristics of the RGB filters of the camera body and those of the imaging lens are used in place of the spectral transmittance characteristics of the RGB filters alone.
[0154] An infrared cut filter or an ultraviolet cut filter is often laid out on the image capturing device. The spectral transmittance characteristics of these filters can be included in those of the RGB filters of the camera body. Furthermore, the spectral transmittance characteristics of the RGB filters of the camera body can be held in consideration of the spectral sensitivity characteristics of the image capturing device.
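The per-wavelength products described in [0153] and [0154] can be sketched as follows. All inputs are assumed to be equal-length lists sampled at the same wavelengths; the function name and the optional-factor interface are illustrative, not from the patent.

```python
def effective_spectral_sensitivity(rgb_filter, lens_transmittance,
                                   ir_cut=None, sensor_sensitivity=None):
    """Return the spectral characteristic used when generating recovery
    filters: the per-wavelength product of the RGB filter transmittance
    and the lens transmittance, optionally folding in an IR/UV cut
    filter and the image capturing device's spectral sensitivity."""
    factors = [lens_transmittance]
    if ir_cut is not None:
        factors.append(ir_cut)
    if sensor_sensitivity is not None:
        factors.append(sensor_sensitivity)
    out = list(rgb_filter)
    for f in factors:
        out = [a * b for a, b in zip(out, f)]
    return out
```

This combined characteristic then replaces the bare RGB filter transmittance in the recovery filter generation.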
Other Embodiments
[0155] Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer, for example, via a network or from a recording medium of various types serving as the memory device (e.g., a computer-readable medium).
[0156] While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
[0157] This application claims the benefit of Japanese Patent Application No. 2008-315031, filed December 10, 2008, which is hereby incorporated by reference herein in its entirety.

Claims

1. An image processing apparatus for an image capturing apparatus using an interchangeable imaging lens, the image processing apparatus comprising: an input section, configured to input optical transfer information of the imaging lens from the imaging lens; an acquisition section, configured to acquire characteristic information of an image capturing unit of the image capturing apparatus; a converter, configured to convert the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; and a generator, configured to generate a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit.
2. An image processing apparatus for an image capturing apparatus using an interchangeable imaging lens, the image processing apparatus comprising: an input section, configured to input optical transfer information of the imaging lens from the imaging lens; an acquisition section, configured to acquire characteristic information of an image capturing unit of the image capturing apparatus; and an output section, configured to output a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens.
3. The image processing apparatus according to claim 1 or 2, wherein the imaging lens outputs optical transfer information corresponding to lens setting information of the imaging lens.
4. The image processing apparatus according to claim 1 or 2, further comprising a transmitter configured to transmit lens setting information of the imaging lens to the imaging lens, wherein the imaging lens outputs optical transfer information corresponding to the received lens setting information.
5. An image processing apparatus for an image capturing apparatus using an interchangeable imaging lens, the image processing apparatus comprising: an input section, configured to input optical transfer information of the imaging lens from the imaging lens; an acquisition section, configured to acquire characteristic information of an image capturing unit of the image capturing apparatus; a converter, configured to convert the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; a generator, configured to generate a correction filter based on the optical transfer information that depends on the characteristics of the image capturing unit; and an output section, configured to selectively execute, according to an imaging mode of the image capturing apparatus, outputting of image data obtained by correcting degradation of an image captured via the imaging lens using the correction filter, or outputting of a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens.
6. The image processing apparatus according to claim 5, wherein when the imaging mode is a JPEG mode, said output section outputs image data corrected using a correction filter which is generated from optical transfer information common to lens settings of the imaging lens.
7. The image processing apparatus according to claim 5, wherein when the imaging mode is a RAW mode, said output section outputs the file which saves optical transfer information according to a lens setting of the imaging lens.
8. A data processing apparatus for an interchangeable imaging lens of an image capturing apparatus, the data processing apparatus comprising: a receiver, configured to receive characteristic information of an image capturing unit of the image capturing apparatus; an acquisition section, configured to acquire optical transfer information according to a lens setting of the imaging lens; a converter, configured to convert the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; a generator, configured to generate a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit; and a transmitter, configured to transmit the correction filter to the image capturing apparatus.
9. An image processing method of an image capturing apparatus using an interchangeable imaging lens, the method comprising steps of: inputting optical transfer information of the imaging lens from the imaging lens; acquiring characteristic information of an image capturing unit of the image capturing apparatus; converting the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; and generating a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit.
10. An image processing method of an image capturing apparatus using an interchangeable imaging lens, the method comprising the steps of: inputting optical transfer information of the imaging lens from the imaging lens; acquiring characteristic information of an image capturing unit of the image capturing apparatus; and outputting a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens.
11. An image processing method of an image capturing apparatus using an interchangeable imaging lens, the method comprising the steps of: inputting optical transfer information of the imaging lens from the imaging lens; acquiring characteristic information of an image capturing unit of the image capturing apparatus; converting the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; generating a correction filter based on the optical transfer information that depends on the characteristics of the image capturing unit; and selectively executing, according to an imaging mode of the image capturing apparatus, outputting of image data obtained by correcting degradation of an image captured via the imaging lens using the correction filter, or outputting of a file which saves the optical transfer information, the characteristic information, and capturing data captured via the imaging lens.
12. A method of a data processing apparatus for an interchangeable imaging lens of an image capturing apparatus, the method comprising the steps of: receiving characteristic information of an image capturing unit of the image capturing apparatus; acquiring optical transfer information according to a lens setting of the imaging lens; converting the optical transfer information into optical transfer information that depends on characteristics of the image capturing unit, based on the characteristic information; generating a correction filter, which corrects degradation of an image captured via the imaging lens, based on the optical transfer information that depends on the characteristics of the image capturing unit; and transmitting the correction filter to the image capturing apparatus.
13. A computer-readable storage medium storing a computer-executable program for causing a processor to perform a method in accordance with any one of claims 9 to 12.
PCT/JP2009/070273 2008-12-10 2009-11-26 Image processing apparatus and image processing method, and data processing apparatus and data processing method WO2010067740A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US13/122,421 US8730371B2 (en) 2008-12-10 2009-11-26 Image processing apparatus and image processing method, and data processing apparatus and data processing method
KR1020117015511A KR101246738B1 (en) 2008-12-10 2009-11-26 Image processing apparatus and image processing method, and data processing apparatus and data processing method
EP09831844.7A EP2377308B1 (en) 2008-12-10 2009-11-26 Image processing apparatus and image processing method, and data processing apparatus and data processing method
CN200980149785.6A CN102246505B (en) 2008-12-10 2009-11-26 Image processing apparatus and image processing method, and data processing apparatus and data processing method
JP2011516182A JP5162029B2 (en) 2008-12-10 2009-11-26 Image processing apparatus and image processing method, and data processing apparatus and data processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008315031 2008-12-10
JP2008-315031 2008-12-10

Publications (1)

Publication Number Publication Date
WO2010067740A1 true WO2010067740A1 (en) 2010-06-17

Family

ID=42242728

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/070273 WO2010067740A1 (en) 2008-12-10 2009-11-26 Image processing apparatus and image processing method, and data processing apparatus and data processing method

Country Status (6)

Country Link
US (1) US8730371B2 (en)
EP (1) EP2377308B1 (en)
JP (1) JP5162029B2 (en)
KR (1) KR101246738B1 (en)
CN (2) CN103327237B (en)
WO (1) WO2010067740A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102445818A (en) * 2010-09-30 2012-05-09 株式会社尼康 Interchangeable lens and camera body
JP2012093712A (en) * 2010-09-30 2012-05-17 Nikon Corp Interchangeable lens and camera body
JP2012156716A (en) * 2011-01-25 2012-08-16 Canon Inc Image processing device, imaging device, image processing method, and program
JP2013020610A (en) * 2011-06-14 2013-01-31 Canon Inc Image processing apparatus, image processing method and program
US8515254B2 (en) 2010-04-20 2013-08-20 Canon Kabushiki Kaisha Video editing apparatus and video editing method
EP2688285A3 (en) * 2012-07-20 2014-07-23 Canon Kabushiki Kaisha Image capture apparatus and control method thereof, and lens unit
US9030597B2 (en) 2010-09-30 2015-05-12 Nikon Corporation Interchangeable lens and camera body
WO2016002126A1 (en) * 2014-07-04 2016-01-07 Canon Kabushiki Kaisha Image processing apparatus, image pickup apparatus, image processing method, image processing program, and storage medium
EP3029630A1 (en) * 2014-12-01 2016-06-08 Canon Kabushiki Kaisha Control apparatus, image processing apparatus, lens apparatus, image processing system, control method, image processing method, program, and storage medium
RU2599628C2 (en) * 2012-03-21 2016-10-10 Кэнон Кабусики Кайся Image capturing device

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5274391B2 (en) * 2009-06-29 2013-08-28 キヤノン株式会社 Interchangeable lens camera and control method thereof
JP5494963B2 (en) * 2009-11-09 2014-05-21 株式会社リコー Camera system
JP4931266B2 (en) * 2010-08-27 2012-05-16 キヤノン株式会社 Image processing method, image processing apparatus, and image processing program
JP5361976B2 (en) * 2011-08-25 2013-12-04 キヤノン株式会社 Image processing program, image processing method, image processing apparatus, and imaging apparatus
JP5882789B2 (en) * 2012-03-01 2016-03-09 キヤノン株式会社 Image processing apparatus, image processing method, and program
US9674431B2 (en) 2013-02-01 2017-06-06 Canon Kabushiki Kaisha Image pickup apparatus, image processing apparatus, image processing method, and non-transitory computer-readable storage medium
US9424629B2 (en) * 2013-02-01 2016-08-23 Canon Kabushiki Kaisha Image pickup apparatus, image processing apparatus, image processing method, and non-transitory computer-readable storage medium
WO2015015949A1 (en) * 2013-08-01 2015-02-05 富士フイルム株式会社 Imaging device, imaging method, and image processing device
JP5969139B2 (en) 2013-11-08 2016-08-17 富士フイルム株式会社 Camera system, camera body and communication method
JP5965553B2 (en) 2013-11-08 2016-08-10 富士フイルム株式会社 Camera system, camera body, interchangeable lens, and communication method
JP6305217B2 (en) * 2014-06-03 2018-04-04 キヤノン株式会社 Information processing apparatus and control method therefor, camera system, program, and storage medium
JP6627207B2 (en) * 2014-08-01 2020-01-08 株式会社ニコン Lens barrel and camera body
JP6478511B2 (en) 2014-08-01 2019-03-06 キヤノン株式会社 Image processing method, image processing apparatus, compound eye imaging apparatus, image processing program, and storage medium
JP6440467B2 (en) * 2014-11-21 2018-12-19 キヤノン株式会社 Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP6573354B2 (en) * 2014-11-28 2019-09-11 キヤノン株式会社 Image processing apparatus, image processing method, and program
WO2016167188A1 (en) * 2015-04-16 2016-10-20 富士フイルム株式会社 Image capturing device, image processing device, image processing method, program, and recording medium
JP6486182B2 (en) * 2015-04-22 2019-03-20 キヤノン株式会社 Image processing apparatus, imaging apparatus, and image processing program
JP6535524B2 (en) * 2015-06-30 2019-06-26 オリンパス株式会社 Imaging device
US10109126B2 (en) * 2016-01-12 2018-10-23 Chi-Wei Chiu Biological recognition lock system
JP6351690B2 (en) * 2016-11-02 2018-07-04 キヤノン株式会社 Signal processing apparatus, signal processing method, computer program, lens unit

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003051979A (en) * 2001-08-03 2003-02-21 Minolta Co Ltd Electronic camera provided with image processing function and a computer-readable recording medium recording program enabling the electronic camera to realize the image processing function
JP2003189236A (en) * 2001-10-09 2003-07-04 Seiko Epson Corp Output picture adjustment for picture data
JP2003244621A (en) * 2002-02-19 2003-08-29 Canon Inc Image processing method and image processing apparatus, control program for the image processing apparatus, and storage medium
US20080088728A1 (en) 2006-09-29 2008-04-17 Minoru Omaki Camera
WO2008044591A1 (en) * 2006-10-06 2008-04-17 Sharp Kabushiki Kaisha Imaging device, image reproducing device, image printing device, imaging device control method, image correcting method for image reproducing device, and image correcting method for image printing device
US20080239099A1 (en) 2007-03-30 2008-10-02 Pentax Corporation Camera

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62127976A (en) 1985-11-29 1987-06-10 Kyocera Corp Image recording processor
JPH0488765A (en) 1990-07-31 1992-03-23 Sony Corp Video signal processor
US6243529B1 (en) * 1993-03-30 2001-06-05 Canon Kabushiki Kaisha Recording/reproducing apparatus capable of synthesizing fractions of image data obtained from plural reads of a track
JPH0795538A (en) * 1993-09-17 1995-04-07 Canon Inc Image recording and reproducing device
JPH07203363A (en) * 1993-12-28 1995-08-04 Canon Inc Recording and reproducing device
JPH09116911A (en) * 1995-10-20 1997-05-02 Canon Inc Image pickup system
US6822758B1 (en) * 1998-07-01 2004-11-23 Canon Kabushiki Kaisha Image processing method, system and computer program to improve an image sensed by an image sensing apparatus and processed according to a conversion process
JP2000020691A (en) 1998-07-01 2000-01-21 Canon Inc Image processing device and method, image-pickup device, control method, and storage medium therefor
JP4369585B2 (en) 2000-02-07 2009-11-25 富士フイルム株式会社 Image processing device
JP4370780B2 (en) 2002-12-25 2009-11-25 株式会社ニコン Blur correction camera system, blur correction camera, image restoration device, and blur correction program
JP4534756B2 (en) * 2004-12-22 2010-09-01 ソニー株式会社 Image processing apparatus, image processing method, imaging apparatus, program, and recording medium
CN101258740A (en) * 2005-07-28 2008-09-03 京瓷株式会社 Imaging device and image processing method
JP4749959B2 (en) * 2006-07-05 2011-08-17 京セラ株式会社 Imaging device, manufacturing apparatus and manufacturing method thereof
WO2008081575A1 (en) * 2006-12-27 2008-07-10 Nikon Corporation Distortion correcting method, distortion correcting device, distortion correcting program, and digital camera
JP2008172321A (en) * 2007-01-09 2008-07-24 Olympus Imaging Corp Image pickup device for performing electric image recovery processing
JP5121234B2 (en) * 2007-01-12 2013-01-16 キヤノン株式会社 Data management apparatus and method, and program


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2377308A4

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8515254B2 (en) 2010-04-20 2013-08-20 Canon Kabushiki Kaisha Video editing apparatus and video editing method
US9030597B2 (en) 2010-09-30 2015-05-12 Nikon Corporation Interchangeable lens and camera body
JP2012093712A (en) * 2010-09-30 2012-05-17 Nikon Corp Interchangeable lens and camera body
CN102445818A (en) * 2010-09-30 2012-05-09 株式会社尼康 Interchangeable lens and camera body
JP2012156716A (en) * 2011-01-25 2012-08-16 Canon Inc Image processing device, imaging device, image processing method, and program
JP2013020610A (en) * 2011-06-14 2013-01-31 Canon Inc Image processing apparatus, image processing method and program
US9848115B2 (en) 2012-03-21 2017-12-19 Canon Kabushiki Kaisha Image capturing apparatus capable of adjusting optical characteristics of lens unit attachable thereto
RU2599628C2 (en) * 2012-03-21 2016-10-10 Кэнон Кабусики Кайся Image capturing device
US9742967B2 (en) 2012-07-20 2017-08-22 Canon Kabushiki Kaisha Image capture apparatus capable of correcting effects of optical characteristics of a lens unit on an image and control method thereof, and lens unit
EP2688285A3 (en) * 2012-07-20 2014-07-23 Canon Kabushiki Kaisha Image capture apparatus and control method thereof, and lens unit
EP3407595A1 (en) * 2012-07-20 2018-11-28 Canon Kabushiki Kaisha Image capture apparatus and control method thereof, and lens unit
WO2016002126A1 (en) * 2014-07-04 2016-01-07 Canon Kabushiki Kaisha Image processing apparatus, image pickup apparatus, image processing method, image processing program, and storage medium
JP2016019017A (en) * 2014-07-04 2016-02-01 キヤノン株式会社 Image processing apparatus, imaging device, image processing method, image processing program, and storage medium
US10026157B2 (en) 2014-07-04 2018-07-17 Canon Kabushiki Kaisha Image processing apparatus, image pickup apparatus, image processing method, image processing program, and storage medium
EP3029630A1 (en) * 2014-12-01 2016-06-08 Canon Kabushiki Kaisha Control apparatus, image processing apparatus, lens apparatus, image processing system, control method, image processing method, program, and storage medium
US9990698B2 (en) 2014-12-01 2018-06-05 Canon Kabushiki Kaisha Control apparatus, lens apparatus, and non-transitory computer-readable storage medium that determine data as coefficient data corresponding to an order less than a predetermined order of an approximation function

Also Published As

Publication number Publication date
JP2012505562A (en) 2012-03-01
EP2377308B1 (en) 2015-10-28
EP2377308A4 (en) 2013-01-09
CN102246505A (en) 2011-11-16
KR101246738B1 (en) 2013-03-25
EP2377308A1 (en) 2011-10-19
JP5162029B2 (en) 2013-03-13
CN102246505B (en) 2014-08-13
CN103327237B (en) 2016-08-24
CN103327237A (en) 2013-09-25
US20110187874A1 (en) 2011-08-04
US8730371B2 (en) 2014-05-20
KR20110102407A (en) 2011-09-16


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980149785.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09831844

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 13122421

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2009831844

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2011516182

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20117015511

Country of ref document: KR

Kind code of ref document: A